What are the responsibilities and job description for the DataOps Consultant position at Onebridge?
Onebridge is a consulting firm headquartered in Indianapolis, with clients across the United States and beyond. We have an exciting opportunity for a highly skilled DataOps Consultant to join an innovative and dynamic group of professionals at a company rated among the top “Best Places to Work” in Indianapolis since 2015.
Employment Type: Full Time or Contract
Location: Indianapolis, IN - Hybrid
Industry: IT & Services
DataOps Consultant | About You
As a DataOps Consultant, you are responsible for designing and implementing scalable data solutions that bridge technical and business needs. You bring both strategic thinking and hands-on expertise to automate, optimize, and maintain robust data workflows. Comfortable collaborating across teams, you work closely with engineers, analysts, and stakeholders to deliver high-quality data systems. Your deep knowledge of cloud platforms, orchestration tools, and data processing frameworks ensures both reliability and performance. You're proactive, detail-oriented, and driven by continuous improvement. Above all, you thrive on solving complex data challenges and enabling smarter, data-driven decisions.
DataOps Consultant | Day-to-Day
- Design, build, and optimize scalable data pipelines and architectures to support critical business needs.
- Collaborate with Data Engineers, Scientists, Analysts, and IT stakeholders to deliver end-to-end data solutions.
- Monitor and troubleshoot data workflows to ensure high availability, performance, and reliability across cloud environments.
- Automate data ingestion, transformation, testing, and deployment processes to streamline operations.
- Implement data quality checks, validation rules, and monitoring to improve system accuracy and trust.
- Contribute to architecture decisions, tool selection, and process improvements across the data platform.
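To give a flavor of the data-quality work listed above, here is a minimal Python sketch of an automated validation step of the kind this role might build into a pipeline. The record format, field names, and null-ratio threshold are illustrative assumptions, not a description of Onebridge's or any client's actual stack:

```python
# Hypothetical sketch of a batch data-quality check; field names and
# thresholds are illustrative assumptions, not a specific client's rules.

def run_quality_checks(rows, required_fields, max_null_ratio=0.1):
    """Validate a batch of dict records; return (passed, report)."""
    report = {"row_count": len(rows), "failures": []}
    if not rows:
        report["failures"].append("empty batch")
        return False, report
    for field in required_fields:
        # Count missing or empty values for each required field.
        nulls = sum(1 for r in rows if r.get(field) in (None, ""))
        ratio = nulls / len(rows)
        if ratio > max_null_ratio:
            report["failures"].append(
                f"{field}: null ratio {ratio:.0%} exceeds {max_null_ratio:.0%}"
            )
    return not report["failures"], report
```

In practice a check like this would run as a task inside an orchestrator (e.g., an Airflow DAG), failing the pipeline run and alerting the team when a batch does not meet its validation rules.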
DataOps Consultant | Skills & Experience
- 5 years of experience in DataOps, Data Engineering, or related roles, with a track record of delivering scalable, production-grade systems.
- Hands-on experience with cloud platforms (AWS, Azure, or GCP) for data storage, processing, and analytics.
- Proficiency in programming (e.g., Python, Java, or Scala) and infrastructure-as-code tools like Terraform or CloudFormation.
- Experience with orchestration platforms such as Apache Airflow, dbt Cloud, or Azure Data Factory.
- Knowledge of big data technologies (e.g., Spark, Kafka, or Hadoop) for real-time and batch processing.
- Familiarity with CI/CD, version control (e.g., Git), and DevOps practices for data pipeline automation.
A Best Place to Work in Indiana since 2015