What are the responsibilities and job description for the Azure Databricks DevOps Administrator New York City, New York position at ESR Healthcare?
Azure Databricks DevOps Administrator New York City, New York
Description:
Job Location: New York City, New York - Hybrid (3 days onsite)
Job Title: Azure Databricks DevOps Administrator
Duration: Long-term Contract / CTH
Target Start Date: ASAP
Experience
As an Azure Databricks DevOps Administrator, you will be responsible
for managing and maintaining the Databricks platform, ensuring optimal
performance and security across workspaces, clusters, and jobs. This
role is crucial in overseeing user administration, managing Azure
storage solutions, and implementing CI/CD pipelines using Azure
DevOps. By leveraging a deep understanding of Databricks architecture
and proficiency in scripting languages, the administrator
will automate tasks and enhance the efficiency of data operations.
Strong communication skills and a commitment to high-quality support
will enable effective collaboration with cross-functional teams,
directly contributing to the department's mission of delivering robust
data solutions and insights.
Job Responsibilities
- Hands-on experience with Azure cloud services, networking concepts, cloud and on-premises security, deployments using Azure DevOps, Azure cloud monitoring and cost controls, and Terraform.
- Build and maintain Continuous Integration / Continuous Deployment (CI/CD) pipelines using Azure DevOps and GitHub, ensuring the pipelines are efficient, reliable, and scalable (a minimal pipeline-trigger sketch follows this list).
- Automate the provisioning and management of infrastructure, with a focus on Azure Databricks and Azure Data Factory in a private network environment, using tools such as Terraform and ARM templates.
- Maintain non-production and production environments, ensuring consistency, security, and alignment with organizational requirements.
- Embed security into the CI/CD pipeline, including secrets management, secure code scanning, and compliance with security standards.
- Set up monitoring and alerting for applications and infrastructure using Azure Monitor, Log Analytics, and related tools to ensure system reliability and performance (see the Log Analytics sketch after this list).
- Automate routine processes, reduce manual errors, and improve operational efficiency.
- Enforce branch policies, pull request reviews, and pipeline approvals to maintain code quality and compliance with organizational standards.
- Administer Databricks workspaces, clusters, and jobs (see the Databricks REST API sketch after this list).
- Manage user administration, workspace permissions, and policy enforcement.
- Collaborate with data teams to optimize and streamline data workflows and analytical pipelines on the Databricks platform.
- Manage Azure storage solutions, including Azure Data Lake Storage, in support of ETL processes.
- Apply cost-management best practices to monitor and control cloud spend.
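
To illustrate the CI/CD responsibility above, here is a minimal Python sketch that queues an Azure DevOps pipeline run through the Pipelines Runs REST API. The organization, project, pipeline id, branch, and api-version shown are placeholders to confirm against the current Azure DevOps REST documentation.

```python
# Minimal sketch (assumption-laden): queue an Azure DevOps pipeline run
# via the Pipelines "Runs" REST API using a personal access token (PAT).
# ORG, PROJECT, PIPELINE_ID, and the api-version are placeholders.
import base64
import os

import requests

ORG = "my-org"          # hypothetical organization name
PROJECT = "my-project"  # hypothetical project name
PIPELINE_ID = 42        # hypothetical pipeline id
PAT = os.environ["AZDO_PAT"]

# Azure DevOps basic auth uses an empty username with the PAT as the password.
auth = base64.b64encode(f":{PAT}".encode()).decode()
url = (
    f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/pipelines/"
    f"{PIPELINE_ID}/runs?api-version=7.1-preview.1"
)

resp = requests.post(
    url,
    headers={"Authorization": f"Basic {auth}", "Content-Type": "application/json"},
    # Run against main; the branch ref is an illustrative choice.
    json={"resources": {"repositories": {"self": {"refName": "refs/heads/main"}}}},
    timeout=30,
)
resp.raise_for_status()
run = resp.json()
print(f"Queued run {run['id']} (state: {run['state']})")
```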
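For the monitoring responsibility, the following sketch queries a Log Analytics workspace with the azure-monitor-query SDK to surface failed Data Factory activities. The workspace id and the KQL table/columns (ADFActivityRun, PipelineName) are illustrative assumptions; the tables actually available depend on which diagnostic settings are enabled.

```python
# Minimal sketch (assumption-laden): query a Log Analytics workspace with
# the azure-monitor-query SDK to surface failed Data Factory activities.
# The workspace id and the KQL table/columns are illustrative only.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient, LogsQueryStatus

WORKSPACE_ID = "<log-analytics-workspace-id>"  # placeholder

client = LogsQueryClient(DefaultAzureCredential())

# Count failed ADF activity runs per pipeline over the last 24 hours.
# The ADFActivityRun table only exists if ADF diagnostics are routed here.
query = """
ADFActivityRun
| where Status == 'Failed'
| summarize failures = count() by PipelineName
| order by failures desc
"""

response = client.query_workspace(WORKSPACE_ID, query, timespan=timedelta(days=1))
if response.status == LogsQueryStatus.SUCCESS:
    for table in response.tables:
        for row in table.rows:
            print(dict(zip(table.columns, row)))
else:
    # Partial results carry an error describing what could not be returned.
    print(f"Query returned partial results: {response.partial_error}")
```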
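For day-to-day Databricks administration, this sketch lists clusters and jobs through the Databricks REST API. The host, token, and endpoint versions are assumptions to verify against the workspace's API documentation.

```python
# Minimal sketch (assumption-laden): audit clusters and jobs in an Azure
# Databricks workspace via its REST API. Host, token, and endpoint
# versions should be checked against the workspace's API documentation.
import os

import requests

HOST = os.environ["DATABRICKS_HOST"]   # e.g. https://adb-<workspace-id>.<n>.azuredatabricks.net
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}


def list_clusters():
    """Return (name, state) pairs for every cluster, for capacity/health review."""
    resp = requests.get(f"{HOST}/api/2.0/clusters/list", headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return [(c["cluster_name"], c["state"]) for c in resp.json().get("clusters", [])]


def list_jobs():
    """Return job names so orphaned or misconfigured jobs can be spotted."""
    resp = requests.get(f"{HOST}/api/2.1/jobs/list", headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return [j["settings"]["name"] for j in resp.json().get("jobs", [])]


if __name__ == "__main__":
    for name, state in list_clusters():
        print(f"cluster={name} state={state}")
    for job_name in list_jobs():
        print(f"job={job_name}")
```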
Minimum Qualifications
- Bachelor's degree in Computer Science, Information Systems, or Engineering.
- Proven experience in cloud and DevOps administration with a strong focus on Azure; knowledge of data platforms such as Databricks and Data Factory (preferred).
- Proficiency in scripting languages such as PowerShell or Bash.
- Experience with infrastructure-as-code and CI/CD technologies.
- Hands-on experience with core Azure services.
- Ability to manage tasks and priorities in a timely and efficient manner.
- Ability to communicate technical information clearly.
- Commitment to providing high-quality support to users and stakeholders.
- Experience supporting internal and external customers.
- Ability to collaborate effectively with cross-functional teams.
- Willingness to learn and continuously improve skills to stay current in a rapidly evolving field.