What are the responsibilities and job description for the Databricks Consultant position at Saven Technologies?
Job Details
Databricks Consultant
Location: Baltimore, MD
Duration: 12 months (through Dec 2025)
Job Summary:
Sogeti is looking for a skilled and experienced Data Consultant to join our team. The ideal candidate will have a strong background in Databricks, PySpark, and Python, with hands-on experience designing and building ETL pipelines, working with big data frameworks, and supporting data analytics and reporting needs. This role involves close collaboration with business stakeholders, data engineers, and analysts to ensure data is properly ingested, transformed, and made accessible for insights.
________________________________________
Key Responsibilities:
- Design, develop, and optimize scalable ETL pipelines using Databricks, PySpark, and Python.
- Work with structured and unstructured data sources to enable data ingestion, transformation, and storage.
- Develop and maintain data lake and data warehouse environments.
- Collaborate with data analysts and BI developers to ensure seamless data availability for reporting and dashboards.
- Monitor data quality, reliability, and performance across the platform.
- Support data governance and metadata management initiatives.
- Work with stakeholders to gather requirements and translate them into technical specifications.
- Optimize data workflows for performance and cost-efficiency in cloud environments.
- Provide troubleshooting and support for data-related issues and outages.
________________________________________
Required Skills & Qualifications:
- 5-7 years of hands-on experience in data engineering or data consulting roles.
- Strong experience with Databricks, PySpark, and Python for data processing and transformation.
- Experience with ETL tools such as Azure Data Factory, Informatica, or Talend.
- Solid understanding of data modeling, data warehousing, and cloud data platforms (Azure, AWS, or Google Cloud Platform).
- Familiarity with SQL and ability to write complex queries for analysis and transformation.
- Exposure to reporting tools like Power BI, Tableau, or Looker (not expected to be a report developer, but should understand reporting needs).
- Strong problem-solving skills, with the ability to work independently and in teams.
- Excellent communication and stakeholder management abilities.
________________________________________
Preferred Qualifications:
- Experience with Delta Lake, Lakehouse architecture, or Apache Spark tuning.
- Understanding of data governance and compliance frameworks (GDPR, HIPAA, etc.).
- Exposure to CI/CD pipelines and version control tools (e.g., Git).
- Familiarity with REST APIs and data integration frameworks.
Salary: $45 - $50