What are the responsibilities and job description for the Databricks Tech Lead position at Trident Consulting?
Trident Consulting is seeking a "Databricks Tech Lead" for one of our clients in Hartford, CT (onsite), a global leader in business and technology services.
Job Title: Databricks Tech Lead
Job Location: Hartford, CT (Onsite – locals only)
Job Type: Contract
Job Summary:
We are seeking an experienced Architect with 10 to 14 years of experience and knowledge of Amazon DynamoDB, Amazon SQS, AWS Lambda, AWS S3, Python, AWS services, Data Factory, Databricks CLI, Databricks Workflows, Databricks Delta Lake, Databricks SQL, and Databricks. The candidate should have strong performance optimization skills to improve efficiency and reduce cost, strong communication skills, and experience leading a team of five or more.
Experience: 10 - 14 years
Required Skills: Amazon DynamoDB, Amazon SQS, AWS Lambda, AWS S3, Python, AWS Services, Data Factory, Databricks CLI, Databricks Workflows, Databricks Delta Lake, Databricks SQL, Databricks
Certifications Required: AWS Certified Solutions Architect, Databricks Certified Data Engineer Associate
Responsibilities:
- At least 10 years of experience in data management projects and a minimum of 5 years of professional experience with Databricks
- Lead the design, development, and maintenance of Databricks-based data platforms and pipelines
- Gather technical requirements and develop functional specifications.
- Design, develop, and maintain scalable data pipelines and ETL processes using AWS Databricks, Data Factory, and other AWS services.
- Implement and optimize Spark jobs, data transformations, and data processing workflows in Databricks.
- Leverage DevOps and CI/CD best practices to automate the deployment and management of data pipelines and infrastructure.
- Must have excellent coding skills in either Python or Scala, preferably Python.
- Work with the architect and lead engineers for solutions to meet functional and non-functional requirements.
- Must have implemented at least 2 projects end-to-end in Databricks.
- Experience with MDM implementations is an asset.
- Must have experience with Databricks components such as Delta Lake, Databricks Connect, and the Databricks API 2.0.
- Experience with Databricks workflow orchestration.
- Must be well-versed with the Databricks Lakehouse concept and its implementation in enterprise environments.
- Must have a good understanding of how to design and build complex data pipelines.
- Must have good knowledge of Data structures & algorithms.
- Must have strong performance optimization skills to improve efficiency and reduce cost.
- Must have worked on both Batch and streaming data pipelines.
- Must have worked on AWS and its most common services, such as S3, Lambda, Cosmos DB/DynamoDB, SQS, and cloud databases.
- Must have strong communication skills and experience leading a team of five or more.
About Trident:
Trident Consulting is a premier IT staffing firm providing high-impact workforce solutions to Fortune 500 and mid-market clients. Since 2005, we’ve specialized in sourcing elite technology and engineering talent for contract, direct hire, and managed services roles. Our expertise spans cloud, AI/ML, cybersecurity, and data analytics, supported by a 3M candidate database and a 78% fill ratio. With a highly engaged leadership team and a reputation for delivering hard-to-fill, niche talent, we help organizations build agile, high-performing teams that drive innovation and business success. Learn more: tridentconsultinginc.com.
Some of our recent awards include:
Trailblazer Women Award 2025 by Consulate General of India in San Francisco
Ranked as the #1 Women-Owned Business Enterprise in the large category by ITServe.
Received the TechServe Excellence award.
Consistently ranked in the Inc. 5000 list of fastest-growing private companies in America
Recognized in the SF Business Times as one of the Largest Bay Area BIPOC/Minority-Owned Businesses in 2022.