Job Title: Azure Databricks Architect
We are seeking an experienced Azure Databricks Architect to lead the design, implementation, and optimization of our enterprise-wide data platform.
About the Role
The ideal candidate will be responsible for creating a scalable, secure, and high-performance data ecosystem that enables advanced analytics, machine learning, and business intelligence capabilities.
Key Responsibilities:
- Architect a scalable, secure, and high-performance data ecosystem using Azure Databricks, Azure Data Lake, Azure Event Hubs, and related Azure data services.
- Develop and implement lakehouse architecture patterns that support both batch and real-time data processing (see the sketch following this list).
- Build robust data ingestion pipelines using Azure Databricks, Azure Data Factory, and other ETL technologies.
- Establish best practices for data governance, security, and compliance across the data platform.
- Create and maintain infrastructure-as-code (IaC) templates for reproducible and scalable data platform deployments.
- Provide architectural guidance and mentorship to data engineering and analytics teams.
- Ensure alignment of data platform architecture with overall enterprise architecture and business objectives.
- Foster collaboration with cross-functional teams, including Development, Operations, Compliance, and Security.
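To make the lakehouse and ingestion responsibilities above concrete, here is a minimal PySpark sketch of a bronze-to-silver flow on Databricks. The storage path, table names, columns, and filter rule are hypothetical placeholders, not references to an existing environment.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, current_timestamp

spark = SparkSession.builder.getOrCreate()

# Batch leg: land raw JSON from the data lake into a bronze Delta table.
(spark.read.format("json")
      .load("abfss://landing@examplelake.dfs.core.windows.net/orders/")
      .withColumn("ingested_at", current_timestamp())
      .write.format("delta").mode("append")
      .saveAsTable("bronze_orders"))

# Streaming leg: incrementally refine bronze into silver. Delta Lake lets
# the same table act as a batch sink above and a streaming source here.
(spark.readStream.table("bronze_orders")
      .filter(col("order_total") > 0)          # hypothetical data-quality rule
      .writeStream.format("delta")
      .option("checkpointLocation", "/tmp/checkpoints/silver_orders")
      .toTable("silver_orders"))
```

The checkpoint location is what makes the streaming leg restartable; in a real deployment it would live on durable cloud storage rather than /tmp.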
Requirements
We require a minimum of 5 years of relevant work experience in addition to degree requirements. The ideal candidate should have expert-level knowledge of Python, SQL, Spark (PySpark), and Delta Lake.
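As one illustration of the Delta Lake fluency expected, below is a short upsert (MERGE) sketch; the table names and join key are hypothetical.

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical staging table holding new and changed customer rows.
updates = spark.table("staging_customers")

# Idempotent upsert into a hypothetical target dimension table.
target = DeltaTable.forName(spark, "dim_customers")
(target.alias("t")
       .merge(updates.alias("s"), "t.customer_id = s.customer_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())
```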
Desired Skills:
- Strong understanding of lakehouse architecture principles and cloud-native data processing and analytics solutions.
- Proficiency with infrastructure-as-code tooling (Terraform) and CI/CD workflows (GitHub Actions), integrated with GitHub Enterprise for code and infrastructure deployments.
- Advanced knowledge of data modeling, performance tuning, and optimization techniques.
- Experience designing and implementing ETL pipelines with Azure Databricks.
- Experience designing observability and cost management for the platform.
- Experience designing and implementing batch and real-time streaming data pipelines.
- Experience designing and implementing MLOps patterns for machine learning use cases (see the MLflow sketch following this list).
- Experience migrating data and code to the new architecture and operationalizing the platform.
- Experience designing and implementing a multi-region disaster recovery (DR) strategy for Databricks.
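As a concrete reference for the MLOps item above, here is a minimal MLflow sketch that tracks a training run and registers the resulting model for staged promotion. The experiment path, model name, and synthetic data are hypothetical.

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in data; a real pipeline would read from a Delta table.
X, y = make_classification(n_samples=500, n_features=4, random_state=42)

mlflow.set_experiment("/Shared/churn-demo")   # hypothetical experiment path
with mlflow.start_run():
    model = LogisticRegression(max_iter=200).fit(X, y)
    mlflow.log_metric("train_accuracy", model.score(X, y))
    # registered_model_name creates a new version in the model registry,
    # which can then be promoted through review gates to production.
    mlflow.sklearn.log_model(
        model,
        artifact_path="model",
        registered_model_name="churn_classifier",  # hypothetical name
    )
```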
Education
Bachelor's degree in Computer Science or a related field is required; a combination of education and experience that provides equivalent knowledge is also acceptable.