What are the responsibilities and job description for the Azure Databricks Engineer - Primarily remote position at Lensa?
Lensa is the leading career site for job seekers at every stage of their career. Our client, MSys Inc, is seeking professionals. Apply via Lensa today!
Title: Azure Databricks Engineer - Primarily remote
Location: Raleigh, NC, United States
Length and terms: Long term - W2 or C2C
Position created on 04/08/2025 02:41 PM
Job Description
Interview Type: Webcam interview only *** Very long-term project; initial PO for 1 year, expected to extend to 4 years *** Hybrid *** Local candidates or willing to relocate
Azure Databricks Engineer who will work with existing staff to plan and design ETL pipelines and product solutions using Azure Databricks.
The candidate will need to come onsite on the first day to collect equipment.
All candidates must be local to the Triangle region of North Carolina; the position may require up to 1-2 days per month in a Triangle area office for meetings.
The Azure Databricks Engineer will work with existing staff to plan and design ETL pipelines and product solutions using Azure Databricks. The person filling this role will create resilient processes to ingest data from a variety of on-prem and cloud transactional databases and APIs. Responsibilities also include developing business requirements, facilitating change management documentation, and actively collaborating with stakeholders. This individual will work closely with a development technical lead and discuss all aspects of design and planning with the development team.
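As a rough illustration of the ingestion work described above, the sketch below pulls a table from a transactional database over JDBC and lands it as a Delta table from a Databricks notebook. This is an assumption for illustration only; the hostname, database, table, and secret scope/key names are hypothetical placeholders, not details from this posting.

```python
# Minimal sketch of ingesting a transactional table over JDBC into a Delta table.
# Hypothetical names throughout; `spark` and `dbutils` are provided by the
# Databricks notebook runtime.
jdbc_url = "jdbc:sqlserver://onprem-sql.example.internal:1433;databaseName=orders_db"

source_df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.orders")
    .option("user", dbutils.secrets.get(scope="ingestion", key="db-user"))
    .option("password", dbutils.secrets.get(scope="ingestion", key="db-password"))
    .load()
)

# Land the raw extract as a Delta table for downstream transformation.
source_df.write.format("delta").mode("overwrite").saveAsTable("bronze.orders_raw")
```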
Roles and Responsibilities
- Research and engineer repeatable and resilient ETL workflows using Databricks notebooks and Delta Live Tables for both batch and stream processing (see the sketch after this list)
- Collaborate with business users to develop data products that align with business domain expectations
- Work with DBAs to ingest data from cloud and on-prem transactional databases
- Contribute to the development of the Data Architecture for NC DIT Transportation by:
  - Following practices for keeping sensitive data secure
  - Streamlining the development of data products for use by data analysts and data scientists
  - Developing and maintaining documentation for data engineering processes
  - Ensuring data quality through testing and validation
  - Sharing insights and experiences with stakeholders and engineers throughout DIT Transportation
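The sketch below is a minimal Delta Live Tables example of the batch-and-streaming pattern listed above. It is illustrative only; the source path, table names, and expectation are hypothetical, not taken from this posting.

```python
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw events streamed from cloud storage via Auto Loader.")
def bronze_events():
    return (
        spark.readStream.format("cloudFiles")   # `spark` is provided by the DLT runtime
        .option("cloudFiles.format", "json")
        .load("/mnt/landing/events/")           # hypothetical landing path
    )

@dlt.table(comment="Validated, deduplicated events for analysts and data scientists.")
@dlt.expect_or_drop("valid_event_id", "event_id IS NOT NULL")
def silver_events():
    return (
        dlt.read_stream("bronze_events")
        .withColumn("ingested_at", F.current_timestamp())
        .dropDuplicates(["event_id"])
    )
```

In a pipeline like this, dlt.read_stream keeps the downstream table incrementally updated, and expectations such as expect_or_drop provide a lightweight hook for the data quality testing and validation the responsibilities call for.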
Required Skills (years of experience)
- Excellent interpersonal, written, and verbal communication skills (5 years)
- Able to write clean, easy-to-follow Databricks notebook code (2 years)
- Deep knowledge of data engineering best practices, data warehouses, data lakes, and the Delta Lake architecture (2 years)
- Good knowledge of Spark and Databricks SQL/PySpark (2 years)
- Technical experience with Azure Databricks and cloud providers such as AWS, Google Cloud, or Azure (2 years)
- In-depth knowledge of OLTP and OLAP systems, Apache Spark, and streaming products such as Azure Service Bus (2 years)
- Good practical experience with Databricks Delta Live Tables (2 years)
- Knowledge of object-oriented languages such as C#, Java, or Python (7 years)
The recruiter working on this position is Rohit (Shaji Team) Bala.
His contact email is rohit@msysinc.com.
Our recruiters will be more than happy to help you get this contract.