What are the responsibilities and job description for the Senior Data Engineer position at Heitmeyer Consulting?
Job Summary:
Heitmeyer Consulting has a banking client with a need within its Chief Data Office for a Senior Data Engineer who can design and build data environments and develop queries for extracting and working with data across business units. The role is fully remote; however, candidates cannot be located in CA, MT, PA, or VT, or be aligned to the Pacific time zone.
Job Description:
The Senior Data Engineer is responsible for supporting the development and growth of data environments. They will design, build, and operationalize modern data environments, and must be able to collaborate with architects, developers, and line-of-business (LOB) stakeholders on all data needs.
Top Required Skills:
- Very high proficiency with Python to design, develop, and optimize scalable ETL/ELT data pipelines (required).
- Utilize Terraform for infrastructure as code (IaC) to manage and provision cloud resources efficiently.
- Manage and optimize GitLab CI/CD pipelines for automated deployment of data pipelines and infrastructure.
- Advanced SQL and PostgreSQL (required).
- Experience with programming languages including Python, Java, and C# (required).
- Direct experience with ETL tools such as Dataproc, Dataflow, DataStage, Data Fusion, or similar.
- Experience with transformation tools such as dbt.
- Strong background in tools and platforms such as Terraform, Kafka for real-time data streams, Bigtable, and BigQuery.
- Pipeline orchestration tools such as Apache Airflow or Cloud Composer (managed Airflow).
- An independent worker with strong analytical and problem-solving skills, and the interpersonal skills to communicate well.
- Experience with a Master Data Management (MDM) tool; Reltio preferred.
- A background in financial services is strongly preferred but not required.
- Should have worked within a highly regulated industry.
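To illustrate the Python and SQL skills the list above asks for, here is a minimal ETL sketch using only the standard library. The input data, table name, and transformation rules are all hypothetical placeholders, not details from the posting; a production pipeline would extract from real source systems and load into PostgreSQL or BigQuery rather than an in-memory SQLite database.

```python
import csv
import io
import sqlite3

# Hypothetical raw input, standing in for a landed file or source extract.
RAW_CSV = """account_id,balance,region
A-001,1500.25,midwest
A-002,99.10,northeast
A-003,2500.00,midwest
"""

def extract(csv_text):
    """Extract: parse raw CSV rows into dicts."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: cast types and normalize region names."""
    return [
        (r["account_id"], float(r["balance"]), r["region"].upper())
        for r in rows
    ]

def load(records, conn):
    """Load: write transformed records into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS accounts "
        "(account_id TEXT PRIMARY KEY, balance REAL, region TEXT)"
    )
    conn.executemany("INSERT INTO accounts VALUES (?, ?, ?)", records)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(balance) FROM accounts").fetchone()[0]
print(round(total, 2))  # 4099.35
```

The extract/transform/load split keeps each stage independently testable, which is what makes a pipeline like this "sustainable and highly scalable" in practice.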
Responsibilities:
- Work closely with architects, IT, and LOB partners to build, test, deliver, maintain, and optimize sustainable and highly scalable data pipelines.
- Utilize programming languages (Python, Java, C#, Scala) and data platforms (Hadoop, Kafka, Cassandra) to support data program needs.
- Participate in prioritization of data science backlog and provide insights into viability and feasibility.
- Research data acquisition and evaluate suitability.
- Inform the development and growth of data environments, implementing solutions that are easily automated and work within MDM environment.
- Knowledge of data visualization tools such as Tableau and Power BI.
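The orchestration tools named above (Apache Airflow, Cloud Composer) model a pipeline as a directed acyclic graph (DAG) of tasks. The core idea can be sketched with the standard library alone; the task names below are illustrative, not part of the posting, and in Airflow each would be an operator inside a DAG definition.

```python
from graphlib import TopologicalSorter

# Hypothetical task graph: each task maps to the set of tasks it depends on.
DAG = {
    "extract_accounts": set(),
    "extract_transactions": set(),
    "transform_join": {"extract_accounts", "extract_transactions"},
    "load_warehouse": {"transform_join"},
    "refresh_dashboard": {"load_warehouse"},
}

def run(dag):
    """Visit tasks in dependency order, as an orchestrator would."""
    order = []
    for task in TopologicalSorter(dag).static_order():
        order.append(task)  # a real orchestrator would execute the task here
    return order

order = run(DAG)
print(order)
```

Every valid ordering runs both extracts before the join, the join before the warehouse load, and the load before the dashboard refresh, which is exactly the guarantee an orchestrator provides over an ad hoc script.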