What are the responsibilities and job description for the Azure Databricks DevOps Administrator position at 1 Point System?
Job Details
Role: Azure Databricks DevOps Administrator
Location: New York City, New York (hybrid, 3 days onsite)
Duration: Long-term contract / contract-to-hire (CTH)
Target Start Date: ASAP
Manager Notes:
Local candidates in NY, or candidates who can relocate at their own expense, as this is a hybrid role in NY
Role Overview
As an Azure Databricks DevOps Administrator, you will be responsible for managing and maintaining the Databricks platform, ensuring optimal performance and security across workspaces, clusters, and jobs. This role is crucial in overseeing user administration, managing Azure storage solutions, and implementing CI/CD pipelines using Azure DevOps. By leveraging a deep understanding of Databricks architecture and proficiency in scripting languages, the Azure Databricks DevOps Administrator will automate tasks and enhance the efficiency of data operations. Strong communication skills and a commitment to high-quality support will enable effective collaboration with cross-functional teams, directly contributing to the department's mission of delivering robust data solutions and insights.
Job Responsibilities
Hands-on experience with Azure cloud services, networking concepts, security for cloud and on-premises systems, deployment using Azure DevOps, Azure cloud monitoring and cost controls, and Terraform
CI/CD Pipeline Management: Design, implement, and manage Continuous Integration/Continuous Deployment (CI/CD) pipelines using Azure DevOps and GitHub. Ensure the pipelines are efficient, reliable, and scalable.
Infrastructure as Code (IaC): Automate the provisioning and management of infrastructure, with a focus on Azure Databricks and Azure Data Factory in a private network environment, using tools like Terraform and ARM templates.
Environment Management: Create and manage development, testing, and production environments, ensuring consistency, security, and alignment with organizational requirements.
Security: Implement security best practices throughout the CI/CD pipeline, including secrets management, secure code scanning, and compliance with security standards.
Monitoring & Logging: Set up and maintain monitoring and logging for applications and infrastructure using Azure Monitor, Log Analytics, and related tools to ensure system reliability and performance.
Automation: Identify opportunities for automation to streamline processes, reduce manual errors, and improve operational efficiency.
Policy Enforcement: Establish and enforce policies such as branch policies, pull request reviews, and pipeline approvals to maintain code quality and compliance with organizational standards.
Manage and maintain the Azure Databricks platform, including workspaces, clusters, and jobs
Oversee user administration including access controls and permissions
Handle library installations, runtime management, and policy enforcement
Implement and analyze cost-control measures
Administer Unity Catalog for data governance and security
Collaborate with data engineers, data scientists, and analysts to optimize and streamline data workflows and analytical pipelines on the Databricks platform
Manage Azure storage solutions, including Blob Storage and Data Lake Storage
Administer Azure Key Vault for secure storage of secrets and keys
Configure and manage Azure Data Factory for data integration and ETL processes
Implement and manage VNETs, firewalls, Azure policies and security best practices
Set up budgets and alerts to monitor and control Azure costs and spend
Configure alerts for proactive issue detection and resolution
Administer Databricks Lakehouse Monitoring
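To illustrate the kind of administrative automation this role involves, here is a minimal Python sketch that flags cluster configurations missing basic cost controls. The cluster dicts mirror the general shape returned by the Databricks Clusters REST API; the cluster names, the CostCenter tag key, and the specific checks are hypothetical examples, not requirements from this posting.

```python
# Hypothetical sketch: flag Databricks cluster configs that lack basic
# cost controls, i.e. no auto-termination or no cost-center tag.
# Field names follow the Databricks Clusters API response shape;
# cluster names and the "CostCenter" tag key are illustrative.

def flag_costly_clusters(clusters):
    """Return names of clusters missing auto-termination or a CostCenter tag."""
    flagged = []
    for c in clusters:
        no_autoterm = c.get("autotermination_minutes", 0) == 0
        no_cost_tag = "CostCenter" not in c.get("custom_tags", {})
        if no_autoterm or no_cost_tag:
            flagged.append(c["cluster_name"])
    return flagged

sample = [
    {"cluster_name": "etl-prod", "autotermination_minutes": 60,
     "custom_tags": {"CostCenter": "data-eng"}},
    {"cluster_name": "adhoc-dev", "autotermination_minutes": 0,
     "custom_tags": {}},
]

print(flag_costly_clusters(sample))
```

In practice a script like this would pull live cluster metadata from the workspace and feed the flagged list into an alert or a scheduled remediation job.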
Minimum Qualifications
Minimum of a Bachelor's degree in Computer Science, Information Systems, or Engineering
Experience: 7+ years of professional experience in DevOps with a strong focus on Azure; knowledge of data platforms such as Databricks and Data Factory preferred
Proficiency in scripting languages such as Python/PySpark, PowerShell, or Bash
Experience in automating administrative tasks and workflows
Knowledge of security best practices and compliance requirements
Experience with ETL processes, data pipelines, and big data technologies
Experience with backup and restore procedures for Databricks and Azure services
Ability to troubleshoot and resolve issues in a timely and efficient manner
Strong verbal and written communication skills
Ability to document processes, procedures, and configurations clearly
Commitment to providing high-quality support to internal/external users and stakeholders
Ability to understand and address the needs of internal and external customers
Team player with the ability to work collaboratively with cross-functional teams
Flexibility to adapt to changing requirements and priorities
Continuous Learning: Eagerness to learn new technologies and continuously improve skills to stay current in a rapidly evolving field.