What are the responsibilities and job description for the Databricks Solution Architect position at Affine?
We are seeking an experienced Databricks Architect to design, implement, and optimize big data and cloud-based solutions using the Databricks platform. As a Databricks Architect, you will be responsible for creating scalable, efficient, and secure data architectures and workflows in a cloud-based environment, while ensuring best practices in data engineering, governance, and security.
The ideal candidate should have deep expertise in the Databricks ecosystem, strong experience with data processing frameworks (such as Apache Spark), and a solid understanding of cloud infrastructure. You will collaborate with cross-functional teams to design solutions that support large-scale data processing and machine learning workloads.
Key Responsibilities
- Lead the design and implementation of data solutions using Azure services, including Azure Synapse Analytics and Azure Databricks, with a focus on cost optimization.
- Define the overall data strategy and architecture for cost optimization projects.
- Mentor and guide team members on data engineering best practices and cost optimization techniques.
- Collaborate with cross-functional teams to identify and prioritize cost-saving opportunities through data analysis.
- Evaluate and implement new data technologies to enhance performance and reduce costs.
- Ensure compliance with data governance standards and best practices across cost optimization projects.
Required Qualifications
- Extensive experience in data engineering, with a focus on Azure technologies and cost optimization projects.
- Proven leadership skills with a track record of successful project delivery focused on cost savings.
- Strong expertise in data modelling, ETL processes, and data governance.
- Excellent communication skills and the ability to collaborate with stakeholders at all levels.
Key Skills
- PySpark
- Azure
- Azure Data Factory (ADF)
- Azure Synapse Analytics
- Databricks
- Solution Architecture
- Project Management
- ETL
- SQL