What are the responsibilities and job description for the Databricks Architect position at Artech, LLC?
Job Details
**This role requires someone local to Portland with a strong background in data modeling and extensive experience in Databricks and Hackolade.**
We are seeking a Databricks Architect with over 12 years of experience in data management and engineering, particularly in migrating from Snowflake to Databricks. The ideal candidate will possess a deep understanding of Databricks, big data technologies, and cloud environments. This role involves designing scalable data solutions, optimizing data workflows, and collaborating with cross-functional teams to enhance data accessibility and usability.
Key Responsibilities:
- Design and implement scalable data architectures using Databricks and related technologies.
- Lead and manage the migration of data and workloads from Snowflake to Databricks.
- Apply strong data modeling expertise, including hands-on use of Hackolade (required).
- Collaborate with data engineers, analysts, and business stakeholders to understand data requirements and deliver solutions that meet organizational goals.
- Optimize data pipelines and workflows for performance, reliability, and cost-effectiveness.
- Ensure data governance, quality, and security practices are embedded in all data processes.
- Conduct performance tuning and troubleshooting of Databricks applications.
- Stay current with industry trends and emerging technologies to continuously improve data management practices.
- Provide mentorship and guidance to junior team members.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- 12+ years of experience in data management and engineering, with a strong focus on Databricks.
- Proven experience in migrating data and workloads from Snowflake to Databricks.
- Proficiency in big data technologies such as Apache Spark, Hadoop, and Kafka.
- Experience with cloud platforms (e.g., AWS, Azure, Google Cloud).
- Strong programming skills in languages such as Python, Scala, or SQL.
- Knowledge of data warehousing concepts and data modeling techniques.
- Excellent problem-solving skills and the ability to work collaboratively in a fast-paced environment.
- Strong communication skills, both written and verbal.
Preferred Qualifications:
- Databricks certification or relevant certifications in cloud technologies.
- Experience with machine learning frameworks and analytics tools.
- Familiarity with DevOps practices and CI/CD pipelines.