What are the responsibilities and job description for the Data Engineer position at Focus Financial Partners?
Position Summary
We are seeking a highly skilled Data Engineer to design, develop, and optimize scalable data pipelines and infrastructure within our cutting-edge data platforms. In this role, you will play a key part in ensuring seamless data workflows, enforcing rigorous data quality standards, and optimizing infrastructure for peak performance and cost efficiency. The ideal candidate brings deep expertise in modern data engineering practices, automation, and industry-leading data governance, contributing to a data-driven culture that powers strategic decision-making.
This role will be hybrid in St. Louis, MO.
Primary Responsibilities
- Design, develop, and maintain scalable ELT pipelines to support efficient data processing (an illustrative orchestration sketch follows this list).
- Develop and implement data models and transformation logic to enable analytics, reporting, and seamless data integration.
- Optimize Snowflake warehouse performance, including query tuning, resource management, and cost efficiency.
- Collaborate with data analysts, data scientists, and business teams to understand data requirements and deliver effective solutions.
- Ensure data reliability, consistency, and integrity through robust monitoring, testing, and quality assurance processes.
- Implement data governance best practices, including metadata management, access controls, and security compliance.
- Develop and maintain technical documentation for data architecture, transformations, and operational workflows.
- Support real-time and batch data processing by implementing appropriate pipeline strategies.
- Automate data ingestion and transformation processes to improve efficiency and streamline workflows.
- Stay current with industry best practices and emerging technologies to drive continuous improvement in data engineering.
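For context, the sketch below shows, in broad strokes, the kind of ELT orchestration these responsibilities describe: a minimal Airflow DAG with hypothetical extract, load, and transform steps. It is an illustrative example only; the DAG name, schedule, and placeholder data are assumptions and do not represent the company's actual pipelines.

```python
# Illustrative only: a minimal Airflow DAG sketching a daily ELT flow.
# Task logic and data are hypothetical placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["elt"])
def daily_elt_pipeline():
    @task
    def extract() -> list[dict]:
        # Pull raw records from a source system (placeholder data here).
        return [{"id": 1, "amount": 125.50}, {"id": 2, "amount": 80.00}]

    @task
    def load(records: list[dict]) -> int:
        # Land raw records in a warehouse staging table (stubbed out here).
        print(f"Loaded {len(records)} records into staging")
        return len(records)

    @task
    def transform(row_count: int) -> None:
        # Trigger downstream transformations (e.g. a dbt run) once data has landed.
        print(f"Transforming {row_count} staged rows")

    transform(load(extract()))


daily_elt_pipeline()
```

In practice, the load step would typically write to a warehouse such as Snowflake and the transform step would invoke a modeling tool such as dbt, with monitoring and data-quality checks wired into each stage.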
Qualifications
- Expertise in modern cloud data platforms such as Snowflake or Databricks.
- Strong experience with Airflow or other modern workflow orchestration tools.
- Hands-on experience with dbt for data modeling, transformation, and version-controlled deployments.
- Proficiency in SQL and Python for developing scalable and modular data processing frameworks (see the brief sketch after this list).
- Deep understanding of data modeling techniques to enhance storage efficiency and query performance.
- Familiarity with data pipeline monitoring tools and best practices for troubleshooting and improving workflows.
- Knowledge of AWS, GCP, or Azure and their associated data services.
- Strong understanding of data governance, security protocols, and compliance frameworks.
- Experience with streaming solutions such as Kafka or Kinesis is a plus.
- Bachelor’s or Master’s degree in Computer Science or a related field, or equivalent relevant experience.
- 5 years of experience in data engineering preferred, with a focus on modern data stack technologies.
- Demonstrated experience in designing, implementing, and maintaining reusable engineering frameworks for team-wide adoption.
- Proven experience designing and implementing scalable ETL/ELT frameworks.
- Strong track record of collaborating with cross-functional teams to optimize data workflows and quality.
- Excellent problem-solving, debugging, and troubleshooting skills.
- Passion for automation, performance optimization, and data-driven decision-making.
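As a loose illustration of the "modular data processing frameworks" point above, the snippet below sketches a tiny pipeline of small, reusable transform functions in plain Python. The record fields and transform steps are hypothetical; a real framework would add configuration, logging, and tests.

```python
# Illustrative only: small, composable transforms applied in sequence.
# Field names and transform logic are hypothetical placeholders.
from typing import Callable, Iterable

Record = dict
Transform = Callable[[Record], Record]


def normalize_amount(record: Record) -> Record:
    # Coerce the amount field to a float so downstream steps can rely on it.
    return {**record, "amount": float(record["amount"])}


def add_processing_flag(record: Record) -> Record:
    # Tag each record so later stages can tell it has been transformed.
    return {**record, "processed": True}


def run_pipeline(records: Iterable[Record], steps: list[Transform]) -> list[Record]:
    # Apply each transform in order to every record.
    out = []
    for record in records:
        for step in steps:
            record = step(record)
        out.append(record)
    return out


if __name__ == "__main__":
    raw = [{"id": 1, "amount": "125.50"}, {"id": 2, "amount": "80"}]
    print(run_pipeline(raw, [normalize_amount, add_processing_flag]))
```

Keeping each transform a small pure function makes steps easy to unit test and reuse across pipelines, which is the spirit of the "reusable engineering frameworks" requirement above.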
Focus is a leading partnership of fiduciary wealth management and related financial services firms. Focus provides access to best practices, greater resources, and continuity planning for its affiliated advisory firms, which serve individuals, families, employers, and institutions with comprehensive financial services. Focus firms and their clients benefit from the solutions, synergies, scale, economics, and best practices offered by Focus to achieve their business objectives. For more information about Focus, please visit www.focusfinancialpartners.com.
The annualized base pay range for this role is expected to be between $110,000 and $140,000. Actual base pay could vary based on factors including but not limited to experience, subject matter expertise, geographic location where work will be performed, and the applicant's skill set. The base pay is just one component of the total compensation package for employees. Other rewards may include an annual cash bonus and a comprehensive benefits package.
Salary: $110,000 - $140,000