What are the responsibilities and job description for the Cloud Data Architect position at LatentView Analytics?
At LatentView Analytics, we are shaping the future of data-driven decision-making. As a leading global analytics and decision sciences provider, we deliver innovative solutions that empower businesses to harness the power of data and drive digital transformation.
Job Description
We are seeking an experienced Cloud Data Architect with strong Databricks expertise to join our growing data engineering team. The ideal candidate will have a strong background in designing, implementing, and managing cloud-based data solutions using Databricks and associated technologies. Key responsibilities include:
Data Pipeline Development
Design and maintain scalable and optimized data pipelines using Apache Spark, Databricks, and other cloud-based tools to ensure efficient data flow across systems.
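In practice, a pipeline of this kind is typically written in PySpark on Databricks. The sketch below is illustrative only; the paths, column names, and Delta output location are assumptions for the example, not details from this posting.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-pipeline").getOrCreate()

# Ingest raw JSON landed in cloud storage (placeholder path).
raw = spark.read.json("/mnt/raw/events/")

# Basic cleansing and enrichment before publishing a curated dataset.
curated = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_date", F.to_date("event_ts"))
       .filter(F.col("event_type").isNotNull())
)

# Publish as a Delta table, partitioned by date for efficient downstream reads
# (assumes Delta Lake is available on the cluster, as it is on Databricks).
(curated.write
        .format("delta")
        .mode("overwrite")
        .partitionBy("event_date")
        .save("/mnt/curated/events/"))
```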
Cloud Infrastructure
Work with cloud platforms (Azure, AWS) to design and implement cloud-native solutions for data storage, processing, and analytics, leveraging Databricks to drive business insights.
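As a rough illustration of what "cloud-native" means here, Databricks reads directly from each provider's object storage. The storage account, container, and bucket names below are placeholders, and the sketch assumes storage credentials are already configured on the cluster.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Azure Data Lake Storage Gen2 is addressed with abfss:// URIs on Databricks;
# AWS S3 with s3:// paths. Names below are placeholders.
azure_df = spark.read.parquet(
    "abfss://landing@examplestorageacct.dfs.core.windows.net/sales/"
)
aws_df = spark.read.parquet("s3://example-data-lake/sales/")

# The same downstream transformations can consume either DataFrame,
# which keeps pipeline logic portable across clouds.
```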
Collaboration with Data Scientists/Analysts
Work closely with data scientists, analysts, and business stakeholders to translate business requirements into data solutions and deliver meaningful insights.
Optimization
Continuously optimize Spark and Databricks workflows for performance and cost efficiency.
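Typical optimization work combines Spark configuration with query-level tuning. The sketch below shows a few common techniques (Adaptive Query Execution, broadcast joins, selective caching); the table paths and join key are assumptions for illustration.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Let Adaptive Query Execution coalesce shuffle partitions and pick join
# strategies at runtime (enabled by default in recent Spark releases).
spark.conf.set("spark.sql.adaptive.enabled", "true")

facts = spark.read.format("delta").load("/mnt/curated/events/")         # placeholder path
customers = spark.read.format("delta").load("/mnt/curated/customers/")  # placeholder path

# Broadcast the small dimension table to avoid shuffling the large fact table.
joined = facts.join(F.broadcast(customers), "customer_id")

# Cache only when the result is reused, and release the memory afterwards
# to avoid paying for idle cluster resources.
joined.cache()
daily = joined.groupBy("event_date").count()
daily.write.format("delta").mode("overwrite").save("/mnt/curated/daily_counts/")
joined.unpersist()
```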
Required Skills and Qualifications
- Hands-on experience with Databricks (Apache Spark) for large-scale data processing and analytics.
- Strong experience working with cloud platforms such as Azure or AWS.
- Proficiency in SQL, Python, Scala, or Java for data processing and automation.
- Familiarity with data storage solutions such as Delta Lake, data lakes, Azure Data Lake Storage, or AWS S3 (see the Delta Lake sketch after this list).
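To give a concrete sense of the Delta Lake skills listed above, the sketch below shows an upsert with MERGE and a time-travel read. Paths and the join key are placeholders assumed for the example.

```python
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

# Incremental changes staged by an upstream job (placeholder path and key).
updates = spark.read.format("delta").load("/mnt/staging/customer_updates/")

# MERGE keeps the curated Delta table current without rewriting it wholesale.
target = DeltaTable.forPath(spark, "/mnt/curated/customers/")
(target.alias("t")
       .merge(updates.alias("u"), "t.customer_id = u.customer_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())

# Time travel: query an earlier snapshot of the table for audits or debugging.
snapshot = (spark.read.format("delta")
                 .option("versionAsOf", 0)
                 .load("/mnt/curated/customers/"))
```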