What are the responsibilities and job description for the Sr. Data Engineer position at recruit22?
Title: Sr Data Engineer
Location: Downtown Chicago
The mission of Oak Street Health is to rebuild healthcare as it should be.
We are a rapidly growing, innovative company of community-based healthcare centers delivering higher quality health and wellness care that improves outcomes, manages medical costs and provides an unmatched experience for adults on Medicare.
The Oak Street model integrates outstanding clinical expertise, technology, and teamwork to deliver improved care quality and cost savings. These cost savings are then reinvested into care in our communities, creating a virtuous cycle of improving community health.
We are a national organization serving over 100,000 patients and we are growing rapidly. We are a diverse team of care providers, service team members, technologists, community outreach experts, business professionals, and more -- all dedicated to our Oaky Values and motivated by our mission. We're looking forward to getting to know you!
Role Description:
The Data Engineer will be responsible for delivering high-quality, modern data solutions through collaboration with our engineering, analyst, data science, and product teams in a fast-paced, agile environment, leveraging cutting-edge technology to reimagine how healthcare is provided. You will be instrumental in designing, integrating, and implementing solutions, as well as supporting migrations of existing workloads to the cloud. The Data Engineer is expected to have extensive knowledge of modern programming languages and experience designing and developing data solutions.
Core Responsibilities:
- Programming and modifying code in languages such as SQL, Python, and Java to support and implement cloud-based and on-prem data warehousing services.
- Share your passion for staying on top of tech trends, experimenting with and learning new technologies, participating in internal & external technology communities, and mentoring other members of the engineering community
- Work closely with the Engineering teams to design best in class Azure implementations
- Participate in efforts to develop and execute testing, training, and documentation across applications
Qualifications:
- 5 years of experience working with SQL and relational database management systems
- 5 years of hands-on experience with cloud orchestration and automation tools, including CI/CD pipeline creation
- 3 years of relevant working experience with Azure
- 3 years of experience provisioning, configuring, and developing solutions in Azure Data Lake, Azure Data Factory, Azure SQL Data Warehouse, Azure Synapse, and Cosmos DB
- Hands-on experience with Python, JavaScript, or PySpark
- Hands-on experience with dimensional data modeling, schema design, and data warehousing
- Understanding of distributed data processing for big data batch or streaming pipelines
- Willingness to identify and implement process improvements and best practices, and the ability to take ownership while working in a fast-paced, collaborative, team-based support environment
- Familiarity with healthcare data and healthcare insurance feeds is a plus