
Cloud Data Engineer with Databricks

Diaconia LLC
Gaithersburg, MD | Full Time
POSTED ON 4/21/2025
AVAILABLE BEFORE 6/21/2025
Description: Diaconia is looking for a talented Cloud Data Engineer to join our amazing team! If you're looking to join a company that truly appreciates you and your talents, look no further! At Diaconia, we are committed to serving and caring for our colleagues, our clients, and our community. Our team is made up of talented individuals who appreciate having the opportunity to contribute their knowledge and experience to further the growth and development of our industry. Our ideal candidates embrace diverse thinking, enjoy partnering with others, and are seeking to make a difference!
We are currently searching for a new, full-time member of our team for the position of: Cloud Data Engineer with Databricks


The Cloud Data Engineer with Databricks experience (U.S. Citizenship required per Federal requirements for our Federal client) is responsible for managing, developing, and maintaining data infrastructure and pipelines, with a specific focus on Databricks, to support the agency's data-driven initiatives and regulatory responsibilities. The client is a U.S. government agency that regulates and oversees financial products and services to protect consumers.


Job Summary: The Cloud Data Engineer with Databricks is responsible for designing, building, and maintaining data infrastructure and ETL pipelines using Databricks and cloud-based technologies. The role plays a crucial part in enabling data-driven decision-making and ensuring data accuracy and accessibility for regulatory purposes.


Key Responsibilities:

  1. Collaborate & contribute to the architecture, design, development, and maintenance of large-scale data & analytics platforms, system integrations, data pipelines, data models & API integrations.
  2. Prototype emerging business use cases to validate technology approaches and propose potential solutions.
  3. Data Pipeline Development: Design, develop, and maintain data pipelines using Databricks, Apache Spark, and other cloud-based technologies to ingest, transform, and load data from various financial institutions and sources.
  4. Data Transformation: Implement data transformation processes to ensure data quality, integrity, and consistency, meeting regulatory standards. Create transformation path for data to migrate from on-prem pipelines and sources to AWS.
  5. Data Integration: Integrate data from diverse sources, including financial databases, APIs, regulatory reporting systems, and internal data stores, into the CFPB's data ecosystem.
  6. Data Modeling: Develop and optimize data models for regulatory analysis, reporting, and compliance, following data warehousing and data lake principles.
  7. Performance Optimization: Monitor and optimize data pipelines for efficiency, scalability, and cost-effectiveness while ensuring data privacy and security.
  8. Data Governance: Ensure data governance and regulatory compliance, maintaining data lineage and documentation for audits and reporting purposes.
  9. Collaboration: Collaborate with cross-functional teams, including data analysts, legal experts, and regulatory specialists, to understand data requirements and provide data support for regulatory investigations.
  10. Documentation: Maintain comprehensive documentation for data pipelines, code, and infrastructure configurations, adhering to regulatory compliance standards.
  11. Troubleshooting: Identify and resolve data-related issues, errors, and anomalies to ensure data reliability and compliance with regulatory requirements.
  12. Continuous Learning: Stay updated with regulatory changes, industry trends, cloud technologies, and Databricks advancements to implement best practices and improvements in data engineering.
Disclaimer "The responsibilities and duties outlined in this job description are intended to describe the general nature and level of work performed by employees within this role. However, they are not exhaustive and may be subject to change or modification at any time to meet the evolving needs of the organizationRequirements:

Minimum Qualifications:

  • U.S. Citizenship as per Federal Requirements
  • Bachelor's or higher degree in computer science, data engineering, or a related field.
  • The Databricks Certified Data Engineer Professional certification is required.
  • Minimum of 3 years of experience in the following:
    • Strong understanding of data lake, lakehouse, and data warehousing architectures in a cloud-based environment.
    • Hands-on experience with Databricks, including data ingestion, transformation, and analysis.
    • Proficiency in Python for data manipulation, scripting, and automation.
    • In-depth knowledge of AWS services relevant to data engineering such as Amazon S3, EC2, Database Migration Service (DMS), DataSync, EKS, CLI, RDS, Lambda, etc.
    • Understanding of data integration patterns and technologies.
    • Proficiency designing and building flexible and scalable ETL processes and data pipelines using Python and/or PySpark and SQL.
    • Proficiency in data pipeline automation and workflow management tools like Apache Airflow or AWS Step Functions.
    • Knowledge of data quality management and data governance principles.
    • Strong problem-solving and troubleshooting skills related to data management challenges.
    • Experience managing code in GitHub or other similar tools.
    • Experience leveraging Postgres in a parallel processing environment.
    • Hands-on experience migrating from an on-premises data platform to a modern cloud environment (e.g., AWS, Azure, GCP).
    • Excellent problem-solving and communication skills.
    • Strong attention to detail and the ability to work independently and collaboratively.

Preferred Skills:

  • Experience with financial data or regulatory data management.
  • Experience working in Agile or DevSecOps environments and using related tools for collaboration and version control.
  • Knowledge of regulatory frameworks in the financial industry.
  • Familiarity with DevOps and CI/CD practices.
  • Experience with machine learning and AI technologies.

Clearance requirements:

  • Must be able to obtain and maintain a Public Trust clearance.
  • Must be a verifiable U.S. Citizen for this Federal support position.

A Cloud Data Engineer with Databricks plays a vital role in ensuring data accuracy, integrity, and compliance with regulatory standards, supporting the agency's mission to protect consumers in the financial sector. The role demands expertise in Databricks, cloud technologies, and a deep understanding of data engineering principles within a regulatory context.


Applicant selected will be subject to a government security investigation and must meet eligibility requirements for access to classified information. Diaconia is an Equal Opportunity Employer, Minorities/Females/Veterans/Disabled. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, or national origin.
