What are the responsibilities and job description for the Data Engineer (data warehouse, Azure Data Factory, SQL) position at Goli Tech?
Job Description
100% TELECOMMUTE
Interview Process: 3 rounds via video. Highly technical.
Project: As a member of the Optum Data Management team, the Data Engineer supports the Alabama EDS by developing and maintaining workflows, identifying and resolving data quality issues, and optimizing processes to improve performance. The Data Engineer will also support intrastate agencies by monitoring automated data extracts and working directly with state partners to create new extracts based on business specifications.
Responsibilities:
- Develop and manage effective working relationships with other departments, groups, and personnel with whom work must be coordinated or interfaced
- Communicate effectively with the ETL architect to understand requirements and business processes so that data is transformed to meet the needs of end users
- Assist in the overall architecture of the ETL design, and proactively provide input on designing, implementing, and automating the ETL flows
- Investigate and mine data to identify potential issues within ETL pipelines, notify end users, and propose appropriate solutions
- Develop ETL pipelines and data flows into and out of the data warehouse using a combination of Azure Data Factory and Snowflake toolsets
- Design idempotent ETL processes so that interrupted, incomplete, or failed runs can be rerun without errors, using ADF data flows and pipelines
- Work in Snowflake virtual warehouses as needed and automate data pipelines using Snowpipe for repetitive ETL tasks
- Capture changes in data dimensions and maintain versions of them using Streams in Snowflake, scheduling them with Tasks
- Optimize every stage of data movement, not only at the source and in transit but also at rest in the database, for faster response times
- Build a highly efficient orchestrator that can schedule jobs, execute workflows, perform data quality checks, and coordinate dependencies among tasks
- Test ETL system code, data design, pipelines, and data flows; perform root cause analysis, resolve production issues, and run routine tests on databases, data flows, and pipelines
- Document implementations and test cases, and build the deployment documents needed for CI/CD
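The idempotent process design named in the responsibilities can be illustrated with a minimal Python sketch: a rerun of the same batch converges to the same target state because the load upserts by key instead of appending. All names here are hypothetical; in practice this duty would be implemented with ADF data flows and Snowflake MERGE statements.

```python
def upsert_batch(target: dict, batch: list, key: str = "id") -> dict:
    """Merge a batch into a keyed target store; safe to rerun.

    Overwriting by key (rather than appending) means an interrupted
    or failed run can simply be executed again without duplicates.
    """
    for row in batch:
        target[row[key]] = row  # overwrite-by-key, not append
    return target

store: dict = {}
batch = [{"id": 1, "val": "a"}, {"id": 2, "val": "b"}]
upsert_batch(store, batch)
upsert_batch(store, batch)  # rerun after a simulated failure: no duplicates
```

Rerunning the load twice leaves exactly two rows in the target, which is the property an idempotent ETL step must guarantee.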
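Capturing dimension changes and maintaining versions (the duty handled with Snowflake Streams and Tasks above) follows the Type 2 slowly-changing-dimension pattern: when a tracked value changes, the open version is closed and a new open-ended version is appended. A hedged sketch of that versioning logic, with illustrative field names:

```python
from datetime import date

def apply_change(history: list, key, new_val, today: date) -> None:
    """Close the open version for `key` if its value changed, then
    append a new open-ended version (Type 2 slowly changing dimension)."""
    current = next(
        (r for r in history if r["key"] == key and r["end"] is None), None
    )
    if current is not None and current["val"] == new_val:
        return  # unchanged: leave the current version open
    if current is not None:
        current["end"] = today  # close the superseded version
    history.append({"key": key, "val": new_val, "start": today, "end": None})

hist: list = []
apply_change(hist, "member-1", "plan-A", date(2024, 1, 1))
apply_change(hist, "member-1", "plan-B", date(2024, 6, 1))  # opens version 2
```

In Snowflake the change feed would come from a Stream on the source table and the merge would run on a schedule via a Task; the version bookkeeping is the same.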
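The orchestrator responsibility (schedule jobs, execute workflows, run data quality checks, coordinate dependencies) can be sketched with the standard-library `graphlib` topological sorter. This is a minimal assumption-laden sketch, not a production scheduler; the task names and the quality gate are illustrative.

```python
from graphlib import TopologicalSorter

def run_workflow(tasks: dict, deps: dict) -> list:
    """Execute tasks in dependency order, gating each on a quality check.

    `deps` maps a task name to the set of task names it depends on.
    A task returning None is treated as a failed data-quality check.
    """
    order = list(TopologicalSorter(deps).static_order())
    completed = []
    for name in order:
        output = tasks[name]()       # execute the task
        if output is None:           # simple data-quality gate
            raise ValueError(f"quality check failed for {name}")
        completed.append(name)
    return completed

tasks = {
    "extract": lambda: [1, 2],
    "transform": lambda: [2, 4],
    "load": lambda: "done",
}
deps = {"transform": {"extract"}, "load": {"transform"}}
ran = run_workflow(tasks, deps)  # executes extract, then transform, then load
```

A real deployment would delegate this to ADF pipeline dependencies or Snowflake Task graphs, but the dependency-ordering and gating logic is the same.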
Ideal Background: Data Engineer with healthcare (Medicaid) and Microsoft Azure-based experience, including Snowflake and Azure Data Factory
Top 3 Requirements:
Required:
Preferred:
Required Skills: ETL
Basic Qualification:
Additional Skills:
Background Check: No
Drug Screen: No