What are the responsibilities and job description for the Lead Data Engineer position at PSR Associates, Inc.?
Job Details
Lead Data Engineer
The selected candidate will work a hybrid schedule consisting of onsite work in Lansing, Michigan, two (2) days per week and remote work three (3) days per week.
Job Description:
We are seeking a qualified candidate for the role of Lead Data Engineer. The position is responsible for providing ongoing maintenance and support of MDSS, a complex application that supports communicable disease surveillance, registries, and case management systems critical to responding effectively to public health emergencies and reducing the burden of communicable diseases. MDSS is undergoing modernization to enhance the stability and functionality of the system, with phase 1 already completed. The resource is integral to developing, maintaining, and enhancing MDSS phase 1: ensuring that automated processes are functioning, streamlining critical business processes, ensuring data integrity and SEM/SUITE compliance, and securing the application.
The resource also serves as a technical lead and provides technical guidance to other developers in the department. In that capacity, the resource participates in a variety of analytical assignments that provide for the enhancement, integration, maintenance, and implementation of projects. The resource will also provide technical oversight to other developers on the team who support other critical applications.
Job Duties and Skills:
" Lead the design and development of scalable and high-performance solutions using AWS services.
" Experience with Databricks, Elastic search, Kibanna, S3.
" Experience with Extract, Transform, and Load (ETL) processes and data pipelines.
" Write clean, maintainable, and efficient code in Python/Scala.
" Experience with AWS Cloud-based Application Development
" Experience in Electronic Health Records (EHR) HL7 solutions.
" Implement and manage Elastic Search engine for efficient data retrieval and analysis.
" Experience with data warehousing, data visualization Tools, data integrity
" Execute full software development life cycle (SDLC) including experience in gathering requirements and writing functional/technical specifications for complex projects.
" Excellent knowledge in designing both logical and physical database model
" Develop database objects including stored procedures, functions,
" Extensive knowledge on source control tools such as GIT
" Develop software design documents and work with stakeholders for review and approval.
" Exposure to flowcharts, screen layouts and documentation to ensure logical flow of the system requirements
" Experience working on large agile projects.
" Experience or Knowledge on creating CI/CD pipelines using Azure Devops
Qualifications:
" 8 years Databricks.
" 8 years using Elastic search, Kibanna.
" 8 years using Python/Scala.
" 8 years Oracle.
" 5 years experience with Extract, Transform, and Load (ETL) processes and developing Data Pipelines.
" 5 years experience with AWS.
" Over 5 years experience with data warehousing, data visualization Tools, data integrity
" Over 5 years using CMM/CMMI Level 3 methods and practices.
" Over 5 years implemented agile development processes including test driven development.
" Over 3 years Experience or Knowledge on creating CI/CD pipelines using Azure Devops- Nice to have
*** Please note that any false information on your resume or application could lead to the offer being withdrawn or even termination after hire. ***