What are the responsibilities and job description for the DataOps / Databricks Architect- Hybrid Seattle, WA position at Metanlytics?
Job Details
DataOps / Databricks Architect
6-12 months
Seattle, WA - local candidates get preference for interviews
Please do not send mid-level or average candidates; we need only the best of the best. And yes, I do mean soft skills: communication must be charming and smooth.
Key/critical components: strong Databricks experience with Unity Catalog, sharing data across multiple accounts, and cluster management, bordering on a Databricks Architect. Terraform or other IaC experience is also required.
I called it DataOps because that's what it feels like: a very strong, data-driven DevOps role for someone who understands the infrastructure and architecture.
The client has been struggling to find people who are deep enough with Databricks. Candidates all seem to have experience with it, but not at the architect level.
- Data Platform Infrastructure (Data Infrastructure resources 1-2)
- Plan capacity for, and configure, deploy, and maintain Databricks clusters, workspaces, and Snowflake infrastructure on the Azure cloud
- Use Terraform to automate the provisioning of Databricks clusters, workspaces, Snowflake, and associated Azure resources; ensure consistency and repeatability by treating infrastructure as code
- Monitor and optimize Databricks cluster performance and Snowflake resource utilization, troubleshoot issues to ensure optimal performance and cost-effectiveness.
- Implement and manage access controls and security policies to protect sensitive data.
- Develop environment and governance strategies across the technology stack based on best practices
- Provide technical support to Databricks and Snowflake users, including troubleshooting and issue resolution.
- Implement and enforce security policies, RBAC, access controls, and encryption mechanisms.
- Develop and maintain backup and disaster recovery strategies to ensure data integrity and availability.
- Collaborate with cross-functional teams, including data scientists, data engineers, and business analysts to understand their requirements and provide technical solutions
- Data Governance and Quality Management: Create and enforce data governance standards, ensuring robust data quality and compliance through tools such as Databricks Unity Catalog, Collibra and Snowflake Polaris.
- Enforce data governance, data quality, and enterprise standards, supporting a robust production environment
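The Terraform responsibilities above can be sketched roughly as follows. This is a minimal, illustrative configuration, not the client's actual setup: the resource group, workspace name, region, node type, and runtime version are all assumptions, and a real deployment would add networking, state backends, and secrets management.

```hcl
terraform {
  required_providers {
    azurerm    = { source = "hashicorp/azurerm" }
    databricks = { source = "databricks/databricks" }
  }
}

provider "azurerm" {
  features {}
}

# Hypothetical resource group, workspace name, and region for illustration.
resource "azurerm_databricks_workspace" "this" {
  name                = "dataops-workspace"
  resource_group_name = "rg-dataops"
  location            = "westus2"
  sku                 = "premium" # premium tier is needed for Unity Catalog / RBAC features
}

provider "databricks" {
  host = azurerm_databricks_workspace.this.workspace_url
}

# Autoscaling cluster with auto-termination to keep costs in check.
resource "databricks_cluster" "shared" {
  cluster_name            = "shared-autoscaling"
  spark_version           = "15.4.x-scala2.12" # pin an LTS runtime in practice
  node_type_id            = "Standard_DS3_v2"
  autotermination_minutes = 30
  autoscale {
    min_workers = 1
    max_workers = 4
  }
}
```

Keeping the cluster definition in Terraform alongside the workspace is what makes the "consistency and repeatability" bullet concrete: every environment is provisioned from the same reviewed code rather than configured by hand.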
- Required Experience:
- Experience in Data Platform Engineering: Proven track record in architecting and delivering cloud-native data solutions on Azure using Terraform Infrastructure as Code.
- Proficiency in Azure, Databricks and Snowflake: Strong skills in data warehousing and lakehouse technologies with hands-on experience in Azure, Databricks, Delta Lake and Snowflake
- Tooling Knowledge: Experience with version control (GitHub), CI/CD pipelines (Azure DevOps, GitHub Actions), data orchestration and dashboarding tools
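The Unity Catalog governance work mentioned above can likewise be expressed as code. The sketch below assumes a Unity Catalog metastore is already attached to the workspace; the catalog, schema, and group names are hypothetical.

```hcl
# Governance as code: define a catalog and schema, then grant
# least-privilege access to an account-level group.
resource "databricks_catalog" "analytics" {
  name    = "analytics"
  comment = "Curated analytics data"
}

resource "databricks_schema" "sales" {
  catalog_name = databricks_catalog.analytics.name
  name         = "sales"
}

resource "databricks_grants" "analytics_readers" {
  catalog = databricks_catalog.analytics.name
  grant {
    principal  = "data-analysts" # hypothetical group name
    privileges = ["USE_CATALOG", "USE_SCHEMA", "SELECT"]
  }
}
```

Managing grants this way keeps access control reviewable in version control (GitHub) and enforceable through the same CI/CD pipelines listed in the tooling requirements.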