What are the responsibilities and job description for the DataOps Databricks Architect position at VSG Business Solutions LLC?
Yes, it is a lot like the Platform Engineer role we were working on last year.
Key/critical components: strong Databricks experience with Unity Catalog, sharing data across multiple accounts, and cluster management; borderline Databricks Architect. They also need Terraform or other IaC experience.
I called it DataOps because that's what it feels like, or a very strong, data-driven DevOps engineer who understands the infrastructure and architecture.
The client has been struggling to find people who are deep enough with Databricks. Candidates all seem to have experience with it, but not at the architect level.
Data Platform Infrastructure (Data Infrastructure resources: 12)
- Capacity planning: configure, deploy, and maintain Databricks clusters, workspaces, and Snowflake infrastructure on the Azure cloud.
- Use Terraform to automate the provisioning and deployment of Databricks clusters, workspaces, Snowflake, and associated Azure resources. Ensure consistency and repeatability by treating infrastructure as code (see the brief sketch after this list).
- Monitor and optimize Databricks cluster performance and Snowflake resource utilization; troubleshoot issues to ensure optimal performance and cost-effectiveness.
- Implement and manage access controls and security policies to protect sensitive data.
- Develop environment strategies across the technology stack, and governance based on best practices.
- Provide technical support to Databricks and Snowflake users including troubleshooting and issue resolution.
- Implement and enforce security policies, RBAC access controls, and encryption mechanisms.
- Develop and maintain backup and disaster recovery strategies to ensure data integrity and availability.
- Collaborate with cross-functional teams, including data scientists, data engineers, and business analysts, to understand their requirements and provide technical solutions.
- Data Governance and Quality Management: Create and enforce data governance standards, ensuring robust data quality and compliance through tools such as Databricks Unity Catalog, Collibra, and Snowflake Polaris.
- Enforce data governance, data quality, and enterprise standards supporting a robust production environment.
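To make the Terraform/IaC bullet above concrete, here is a minimal sketch of the kind of repeatable, code-driven cluster provisioning the role calls for. The posting names Terraform (typically via the databricks Terraform provider); the sketch below uses the Databricks SDK for Python purely as an illustration, and the cluster name, node type, and runtime version are placeholder assumptions rather than client specifics.

```python
# Illustration only: programmatic Databricks cluster provisioning.
# The role itself specifies Terraform; this Python-SDK equivalent just
# shows the repeatable, code-driven provisioning the bullet describes.
# All names and sizes below are placeholder assumptions.
from databricks.sdk import WorkspaceClient

# Picks up DATABRICKS_HOST / DATABRICKS_TOKEN (or Azure CLI auth) from the environment.
w = WorkspaceClient()

# Create a small auto-terminating cluster and wait until it is running.
cluster = w.clusters.create(
    cluster_name="dataops-demo",        # placeholder name
    spark_version="14.3.x-scala2.12",   # assumed LTS runtime
    node_type_id="Standard_DS3_v2",     # assumed Azure VM size
    num_workers=2,
    autotermination_minutes=30,         # basic cost control
).result()

print(f"Cluster {cluster.cluster_id} reached state {cluster.state}")
```

In Terraform, the same intent would live in a versioned databricks_cluster resource definition, so environments can be reviewed, diffed, and re-created consistently.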
Required Experience:
- Experience in Data Platform Engineering: Proven track record in architecting and delivering cloud-native data solutions on Azure using Terraform infrastructure as code.
- Proficiency in Azure Databricks and Snowflake: Strong skills in data warehousing and lakehouse technologies, with hands-on experience in Azure Databricks, Delta Lake, and Snowflake.
- Tooling Knowledge: Experience with version control (GitHub), CI/CD pipelines (Azure DevOps, GitHub Actions), data orchestration, and dashboarding tools.
For a quick interview and submission, please email me ALL of the following details:
1. First and last name as it appears on your passport:
2. Anything we should know about you for presentation (this is our chance to showcase why this consultant is amazing compared to their competition):
3. Reason you are looking for a change (detailed explanation or don't bother):
4. Communication skills / C-level interaction (1-10):
5. Leadership skills / presence (1-10):
6. Hourly rate, all-inclusive (1099 or C2C only at this time):
7. US work status:
8. Resume in MS Word:
9. Education and pertinent certs (degree, year, university):
10. Availability to start (onsite preferred but not required):
11. Email and phone number:
12. LinkedIn profile (must have pic):
13. Are you Teams/video interview ready (Y/N):
14. Current location (city & state):
Employment Type: Full Time
Vacancy: 1