What are the responsibilities and job description for the Data Architect (Databricks) position at Hecta Data LLC DBA Vilwaa?
Job Details
Job Title: Data Architect (Databricks)
Location: Minneapolis, MN
Job Description:
We are seeking an experienced Databricks Architect to join our team in Minneapolis, MN. The ideal candidate will have a strong background in Databricks, cloud platforms (AWS/Azure), and large-scale data transformation projects. This role requires expertise in data modeling, ERP systems, and migrating legacy tools (e.g., Teradata) to the cloud. The candidate should possess excellent communication skills, confidence, and a proven track record of delivering complex data solutions.
Key Responsibilities:
- Architectural Leadership:
- Design and implement scalable, high-performance data solutions using Databricks on cloud platforms (AWS/Azure).
- Lead large-scale data transformation and migration projects, ensuring best practices and architectural standards.
- Cloud Expertise:
- Architect and optimize data pipelines, data lakes, and warehouses on AWS or Azure.
- Migrate legacy systems (e.g., Teradata) to modern cloud-based platforms.
- Data Modeling & Integration:
- Develop and implement data models to support business requirements.
- Integrate Databricks with ERP systems and other enterprise applications.
- Collaboration & Communication:
- Work closely with stakeholders to understand business needs and translate them into technical solutions.
- Provide technical guidance and mentorship to team members.
- Innovation & Optimization:
- Identify opportunities to improve data processes, performance, and cost efficiency.
- Stay updated with the latest trends in Databricks, cloud technologies, and data architecture.
Required Skills & Qualifications:
- 15 years of experience in data architecture, engineering, or a related field.
- Proven experience in large-scale data transformation and migration projects.
- Hands-on experience with Databricks, AWS, and/or Azure.
- Prior experience working with ERP systems and data modeling.
- Experience migrating legacy tools (e.g., Teradata) to the cloud.
- Expertise in Databricks and cloud platforms (AWS/Azure).
- Strong knowledge of SQL, Python, and data integration tools.
- Familiarity with data lakehouse architecture and modern data stack.
- Excellent communication and interpersonal skills.
- Confidence in presenting technical solutions to stakeholders.
- Strong problem-solving and analytical thinking.