What are the responsibilities and job description for the Snowflake Developer position at Digipulse Technologies, Inc?
Job Details
- Bachelor's or Master's degree in a technology-related field (e.g., Computer Science, Engineering) required.
- Six years of related experience in data engineering, analysis, data warehouses, and data lakes, with specialist understanding of methodologies such as data warehousing, data visualization, and data integration.
- Solid experience designing and operationalizing large-scale data and analytics solutions on the Snowflake Cloud Data Warehouse is a must.
- Strong experience with relational database technologies (Oracle SQL & PL/SQL or similar RDBMS), preferably Snowflake.
- Strong expertise in all aspects of data movement technologies (ETL/ELT) and experience with schedulers.
- Experience implementing a Data Lake in Snowflake.
- Knowledge of data modelling techniques and standard methodologies (Relational, Dimensional), plus experience with data modelling tools (e.g., PowerDesigner).
- Prior experience with Data ingestion tool sets (e.g., Apache NiFi, Kafka) is advantageous.
- Working experience with some or all of the following: AWS, Containerization, associated build and deployment CI/CD pipelines, Lambda development.
- Experience with Agile methodologies (Kanban and Scrum) is a plus.
- Experience with DevOps, Continuous Integration, and Continuous Delivery.
- Able to work collaboratively with a geographically diverse team.
- Proven track record of working in collaborative teams to deliver high quality data solutions in a multi-developer agile environment following design & coding standard methodologies.
- Outstanding SQL skills and experience performing deep data analysis on multiple database platforms.
- Understanding of data transformation and translation requirements, and of which tools to leverage to get the job done.
- Prior experience in setting up reliable infrastructure (Hardware, Scalable data management systems, and frameworks) to perform data-related tasks, particularly with Kafka.
- Proven experience in understanding multi-functional enterprise data, navigating between business analytic needs and data, and being able to work hand-in-hand with other members of technical teams to execute on product roadmaps to enable new insights with our data.
- Strong focus on resiliency and reliability.
- You have excellent written and oral communication skills.
- Nice to have: Scripting/coding experience in any of the following: Python, Unix, Java.