What are the responsibilities and job description for the Data Lead (ETL) position at Photon?
Job Details
Description for Internal Candidates
Overview:
We are looking for a seasoned Data Lead with expertise in ETL (Extract, Transform, Load) processes to drive data-driven projects and initiatives. The ideal candidate will have a strong background in Snowflake, SQL, and Python for efficient data management, transformation, and analysis. You will play a pivotal role in overseeing the design and development of scalable data solutions, ensuring optimal performance and alignment with business needs.
Key Responsibilities:
- Lead the design, development, and maintenance of ETL pipelines to support data transformation and integration across various systems.
- Architect, implement, and optimize data solutions on the Snowflake platform, ensuring scalability, performance, and reliability.
- Collaborate with stakeholders to understand business requirements, ensuring the ETL processes align with organizational data needs.
- Develop and maintain complex SQL queries to extract, manipulate, and analyze data from multiple sources.
- Build and automate data workflows using Python to enhance data transformation and integration tasks.
- Monitor data quality, consistency, and integrity, implementing error-handling and performance-tuning techniques where necessary.
- Provide technical leadership and mentorship to junior data engineers and analysts, promoting best practices in ETL and data management.
- Ensure data governance and security compliance throughout all data processes.
- Continuously evaluate emerging tools and technologies to enhance the ETL process and data platform efficiency.
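To give a flavor of the pipeline work described above, here is a minimal, hypothetical ETL sketch in Python. It uses the standard-library `sqlite3` module as a stand-in for a warehouse (a real Snowflake pipeline would use the Snowflake connector and production orchestration); table and column names are invented for illustration.

```python
import sqlite3

def run_etl(raw_rows):
    """Extract raw rows, transform them, and load them into a warehouse table.

    Illustrative only: sqlite3 stands in for Snowflake, and the `sales`
    schema is a made-up example.
    """
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")

    # Transform: drop malformed rows and normalize region names.
    cleaned = [
        (region.strip().upper(), float(amount))
        for region, amount in raw_rows
        if amount is not None
    ]

    # Load: bulk-insert the transformed rows.
    conn.executemany("INSERT INTO sales VALUES (?, ?)", cleaned)

    # A downstream aggregate of the kind such pipelines typically feed.
    totals = dict(
        conn.execute(
            "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
        )
    )
    conn.close()
    return totals

raw = [("east ", "100.0"), ("West", 250.5), ("east", None), ("west", 49.5)]
print(run_etl(raw))  # {'EAST': 100.0, 'WEST': 300.0}
```

In practice the transform step would enforce the data-quality and error-handling responsibilities listed above, and the load step would target Snowflake stages or tables rather than an in-memory database.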
Required Skills and Qualifications:
- 6 years of experience in ETL design, development, and implementation.
- Proficiency in Snowflake for data warehousing and advanced analytics.
- Strong knowledge of SQL (including query optimization) and working with relational databases.
- Expertise in Python for building, automating, and optimizing data workflows.
- Hands-on experience with data integration tools and frameworks (e.g., Informatica, Talend, Airflow).
- Familiarity with data modeling, data lakes, and cloud platforms like AWS, Azure, or Google Cloud Platform.
- Experience with data governance and security best practices.
- Excellent problem-solving and analytical skills, with keen attention to detail.
- Strong communication and collaboration skills, capable of working with cross-functional teams.
Preferred Qualifications:
- Experience with machine learning or advanced analytics.
- Certifications in Snowflake, AWS, or Azure.
- Knowledge of CI/CD pipelines and automation tools for data deployments.