What are the responsibilities and job description for the Snowflake Lead with Python position at Centraprise?
Snowflake Lead with Python Experience
Chicago, IL (Onsite)
Contract
Job Description:
We are seeking a highly skilled and experienced Snowflake Lead with a strong background in Python programming to lead our data engineering efforts. The ideal candidate will have a deep understanding of Snowflake architecture and data modeling, as well as expertise in developing and maintaining data pipelines using Python. This role requires both technical expertise and leadership skills: you will guide a team of engineers and collaborate with cross-functional partners to implement efficient and scalable data solutions.
Key Responsibilities:
Lead the design, development, and optimization of Snowflake-based data architectures and data pipelines.
Oversee the migration of on-premises or legacy data systems to Snowflake.
Collaborate with stakeholders to gather and understand data requirements and translate them into Snowflake solutions.
Utilize Python to develop ETL scripts, data transformation workflows, and automation processes.
Manage Snowflake environments and ensure best practices in performance tuning, cost optimization, and security.
Work closely with data scientists, business analysts, and other teams to ensure data accuracy and quality.
Troubleshoot and resolve issues related to data integration and performance within the Snowflake platform.
Mentor and guide junior team members, promoting continuous learning and development.
Continuously evaluate and improve data management practices, contributing to innovation and operational excellence.
Required Skills & Experience:
Snowflake: Expertise in Snowflake architecture, data warehousing concepts, Snowflake schema design, and performance optimization.
Python: Strong proficiency in Python for data engineering tasks, including ETL pipeline development, data manipulation, and integration with Snowflake.
Experience working with large-scale datasets and optimizing data processing workflows.
Familiarity with cloud platforms (AWS, Azure, Google Cloud) and integration with Snowflake.
Knowledge of SQL and relational databases, including complex queries and performance tuning.
Experience with version control systems (e.g., Git) and CI/CD pipelines for data workflows.
Strong understanding of data governance, security practices, and compliance requirements.
Ability to manage multiple tasks and projects simultaneously, working in a fast-paced environment.
Excellent communication skills, both written and verbal, with the ability to explain complex technical concepts to non-technical stakeholders.
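As one illustration of the Python-for-ETL work this role calls for, here is a minimal sketch of a transform-and-load step. All table and column names (staging.events, id, amount) are hypothetical examples, not details from this posting, and the Snowflake load itself is shown only in outline.

```python
import csv
import io

def transform(rows):
    """Clean raw records before loading: drop rows with no id,
    strip whitespace from ids, and coerce amounts to float."""
    cleaned = []
    for row in rows:
        if not row.get("id"):
            continue  # skip records missing the key column
        cleaned.append({
            "id": row["id"].strip(),
            "amount": float(row["amount"]),
        })
    return cleaned

# Extract: parse a raw CSV feed (here an in-memory sample).
raw = csv.DictReader(io.StringIO("id,amount\n a1 ,10.5\n,3.0\n"))
rows = transform(list(raw))
print(rows)  # [{'id': 'a1', 'amount': 10.5}]

# Load: in a real pipeline this step would use the
# snowflake-connector-python package, roughly:
#   import snowflake.connector
#   conn = snowflake.connector.connect(account=..., user=..., password=...)
#   conn.cursor().executemany(
#       "INSERT INTO staging.events (id, amount) VALUES (%(id)s, %(amount)s)",
#       rows,
#   )
```

Keeping the transformation in a pure function like this makes it unit-testable independently of any Snowflake connection, which is one common way teams structure such pipelines.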
Preferred Skills:
Experience with Snowflake Data Sharing and Snowflake Streams & Tasks.
Familiarity with data orchestration tools such as Apache Airflow.
Experience with machine learning or data science workflows in Snowflake.
Certification in Snowflake or other relevant technologies.