What are the responsibilities and job description for the Sr. Data Engineer position at Everest Technologies, Inc.?
Job Summary
Step into a culture where data is embraced as an asset and everyone, from top executives to frontline team members, understands its importance in achieving strategic objectives. As a Senior Data Analytics Engineer on our hybrid agile scrum team, your responsibilities will span the full lifecycle: design, analysis, build, orchestration and automation, monitoring, performance optimization, data quality, solution accuracy, and compliance.
What You’ll Do
• Responsible for architecting end-to-end data solutions that meet our business partners' expectations and integrate into our DnA (Data and Analytics) Platform.
• Engage with the DnA team within the scrum framework to fully understand the requirements for each deliverable.
• Accountable for following DnA best practices in solution design, build, orchestration and automation, data ingestion, monitoring, performance optimization, data quality, solution accuracy, and compliance throughout the lifecycle.
• Leverage a modern tool stack, including Snowflake, Atlan, Fivetran, Docker, AWS, and Astronomer (Airflow), to cultivate an environment where analysts and data engineers can make changes autonomously in an automated, thoroughly tested, high-quality manner.
• Lead the design, development, prototyping, operation, and implementation of data solutions and pipelines (see the illustrative Airflow sketch after this list).
• Analyze the impact of changes on downstream systems and products, and recommend alternatives that minimize it.
• Own the testing and release processes for data pipelines, following best practices for frequent releases.
• Participate in code reviews.
• Mentor and support DnA team members who are new to the modern stack.
• Partner to deliver a data model that aligns with DevOps principles, ensuring standards for continuous integration/continuous delivery (CI/CD) processes.
• Drive results by motivating yourself and others to exceed goals and achieve breakthroughs, persistently removing barriers along the way.
• Develop and maintain data documentation, including data dictionaries, data lineage, data flow diagrams, best practices, and data recovery processes, to provide clear visibility into the data ecosystem.
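To give a concrete flavor of the orchestration work above, here is a minimal sketch of the kind of Airflow DAG this stack runs. It assumes Airflow 2.4+; the DAG id, schedule, and task callables are hypothetical illustrations, not actual DnA platform code.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder extract step: pull raw records from a source system.
    print("extracting source data")


def load():
    # Placeholder load step: land the extracted records in the warehouse.
    print("loading into the warehouse")


with DAG(
    dag_id="example_daily_ingest",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+ keyword
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task        # extract runs before load
```

On Astronomer, DAGs like this are typically packaged into a Docker image for deployment, which is where the Docker and CI/CD pieces of the stack come in.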
What We Need
• 5 years of experience working on cloud data warehouses and data pipelines with a focus on data engineering, building scalable, sustainable and secure data platforms powering intelligent applications.
• Bachelor's degree in Informatics, Business Technology, Analytics, Computer Science or a related field.
• Advanced experience in a Data Engineering or ELT Engineering role.
• Expert-level proficiency in SQL query and stored procedure development.
• Competent with Python, Airflow, GitHub and DAG construction.
• Experience with unstructured datasets and the ability to handle Parquet, JSON, Avro, and XML file formats (see the sketch after this list).
• Strong understanding of CI/CD principles, DevOps practices, software testing, and quality practices.
• Hands-on experience with cloud-based data engineering and storage technologies such as AWS, and orchestration tools such as Apache Airflow (Astronomer).
• Advanced experience implementing data quality initiatives, monitoring, and auditing.
• Travel required 3-5 times annually.
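As a rough illustration of the file-format handling mentioned in the list above, the following Python sketch reads each format with a commonly used library. The file names are hypothetical, and pandas (with pyarrow) and fastavro are assumed to be installed.

```python
import json
import xml.etree.ElementTree as ET

import pandas as pd            # pandas + pyarrow assumed installed
from fastavro import reader    # fastavro assumed installed

# Parquet: columnar, with the schema embedded in the file.
parquet_df = pd.read_parquet("events.parquet")

# JSON: newline-delimited records are a common landing format.
with open("events.jsonl") as fh:
    json_records = [json.loads(line) for line in fh]

# Avro: the schema travels with the data.
with open("events.avro", "rb") as fh:
    avro_records = list(reader(fh))

# XML: parse the tree, then flatten the elements you need.
root = ET.parse("events.xml").getroot()
xml_rows = [{child.tag: child.text for child in element} for element in root]
```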
Desired Qualifications
• Advanced experience working with large data sets and streaming data.
• Intermediate experience with Snowflake.
• Basic experience with common patterns of data ingestion, processing, and curation, along with streaming data concepts such as Kafka (see the sketch after this list).
• Basic experience protecting PII or PHI data during the ELT process, along with data security, data access controls, and design.
• Exposure to industry standard BI tools like Power BI and Tableau.
• Healthcare/medical devices domain experience is an added advantage.
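For the streaming concepts above, the sketch below consumes a Kafka topic with the kafka-python client. The topic name, broker address, and consumer group are illustrative assumptions, not references to any actual system.

```python
import json

from kafka import KafkaConsumer    # kafka-python assumed installed

consumer = KafkaConsumer(
    "device-telemetry",                    # hypothetical topic
    bootstrap_servers="localhost:9092",    # hypothetical broker
    group_id="dna-curation",               # hypothetical consumer group
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    # One record per message; curation and quality checks would go here.
    print(message.topic, message.offset, message.value)
```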