What are the responsibilities and job description for the Snowflake Data Engineer position at Lupus Consulting Zrt.?
Position Summary:
We are seeking an experienced Snowflake Data Engineer to join our team. The ideal candidate will design, implement, and maintain scalable data pipelines and analytics solutions using the Snowflake Data Cloud platform. This role requires a strong understanding of data modeling, performance optimization, and integration techniques to support business intelligence and advanced analytics initiatives.
Key Responsibilities:
Data Pipeline Development:
- Design, develop, and maintain scalable and efficient ETL/ELT workflows for ingesting, transforming, and loading data into Snowflake.
- Integrate data from various sources such as APIs, on-premise databases, and cloud storage platforms.
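To illustrate the kind of ELT loading work described above, a common Snowflake pattern is to stage files in cloud storage and bulk-load them with COPY INTO. This is a hedged sketch; the stage, table, and integration names are placeholders, not part of the job description:

```sql
-- Hypothetical example: bulk-load JSON files from cloud storage into Snowflake.
-- Stage, table, and storage-integration names are placeholders.
CREATE STAGE IF NOT EXISTS raw_events_stage
  URL = 's3://example-bucket/events/'
  STORAGE_INTEGRATION = example_s3_integration
  FILE_FORMAT = (TYPE = JSON);

CREATE TABLE IF NOT EXISTS raw_events (
  payload   VARIANT,
  loaded_at TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
);

-- Load any new files from the stage; skip files that fail to parse.
COPY INTO raw_events (payload)
  FROM @raw_events_stage
  ON_ERROR = 'SKIP_FILE';
```

Downstream transformations would then typically run inside Snowflake (the "T" in ELT), e.g. as SQL views or scheduled tasks over the raw table.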
Snowflake Architecture & Optimization:
- Configure and optimize Snowflake environments to ensure high performance, scalability, and cost efficiency.
- Implement Snowflake-specific features such as Secure Data Sharing, clustering keys, and Time Travel.
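As a sketch of the features named above, clustering keys and Time Travel are both expressed directly in Snowflake SQL. Table and column names here are hypothetical:

```sql
-- Hypothetical example: define a clustering key on a large fact table
-- to improve pruning on common filter columns.
ALTER TABLE sales_fact CLUSTER BY (sale_date, region_id);

-- Time Travel: query the table as it existed one hour ago
-- (must be within the account's data retention window).
SELECT COUNT(*)
FROM sales_fact AT (OFFSET => -3600);

-- Time Travel also allows recovering a dropped table within the retention period.
UNDROP TABLE sales_fact;
```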
Data Modeling:
- Design and implement data models (e.g., Star Schema, Snowflake Schema) to support business reporting and analytics.
- Develop and maintain a logical and physical data architecture for enterprise use.
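A minimal star-schema sketch of the kind this role would design might look as follows. Note that Snowflake accepts but does not enforce primary- and foreign-key constraints; they serve as documentation and for BI-tool metadata. All names are illustrative:

```sql
-- Hypothetical star schema: one fact table keyed to two dimension tables.
CREATE TABLE dim_customer (
  customer_key  INTEGER PRIMARY KEY,
  customer_name STRING,
  segment       STRING
);

CREATE TABLE dim_date (
  date_key       INTEGER PRIMARY KEY,
  calendar_date  DATE,
  fiscal_quarter STRING
);

CREATE TABLE fact_orders (
  order_key    INTEGER,
  customer_key INTEGER REFERENCES dim_customer (customer_key),
  date_key     INTEGER REFERENCES dim_date (date_key),
  order_amount NUMBER(12, 2)  -- constraints are informational in Snowflake
);
```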
Collaboration & Integration:
- Work closely with data analysts, data scientists, and stakeholders to understand data requirements.
- Collaborate with cross-functional teams to ensure seamless data integration and reporting.
Performance Monitoring & Troubleshooting:
- Monitor pipeline performance and resolve issues to ensure timely data availability.
- Implement governance, security, and access control standards.
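Access control in Snowflake is role-based and granted in SQL, so the governance duty above often reduces to statements like the following. Role, warehouse, database, and schema names are placeholders:

```sql
-- Hypothetical example: grant an analyst role read-only access to a reporting schema.
CREATE ROLE IF NOT EXISTS analyst_role;
GRANT USAGE ON WAREHOUSE reporting_wh        TO ROLE analyst_role;
GRANT USAGE ON DATABASE  analytics_db        TO ROLE analyst_role;
GRANT USAGE ON SCHEMA    analytics_db.marts  TO ROLE analyst_role;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics_db.marts TO ROLE analyst_role;
```

Pairing grants like these with future grants (GRANT SELECT ON FUTURE TABLES ...) is a common way to keep access policies consistent as new tables are created.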
Documentation:
- Maintain comprehensive documentation for data pipelines, architecture, and processes.
- Develop best practices for Snowflake usage and data engineering workflows.
Required Skills & Qualifications:
Technical Expertise:
- Strong hands-on experience with the Snowflake Data Cloud, including its architecture, SnowSQL, and advanced platform features.
- Proficiency in SQL for complex queries, transformations, and optimizations.
- Experience with ETL/ELT tools like Informatica, Matillion, Talend, or similar.
- Familiarity with cloud platforms (AWS, Azure, or GCP).
Programming Skills:
- Knowledge of programming languages such as Python or Java for automation and data manipulation.
Data Modeling:
- Expertise in building scalable and optimized data models (e.g., Star and Snowflake schemas).
Analytical & Problem-Solving:
- Ability to troubleshoot data pipeline and performance issues efficiently.
Soft Skills:
- Strong communication and collaboration skills to work with diverse teams and stakeholders.
Preferred Qualifications:
- Experience with streaming and data integration tools (e.g., Kafka, AWS Glue, or Azure Data Factory).
- Familiarity with BI tools such as Tableau, Power BI, or Looker.
- Certification in Snowflake or relevant cloud platforms is a plus.
Education & Experience:
- Bachelor's degree in Computer Science, Data Engineering, or a related field.
- 3 years of experience as a Data Engineer with a focus on Snowflake.