What are the responsibilities and job description for the Data Integration Engineer position at Moore DM Group?
Description
As the Data Integration Engineer, you will play a pivotal role in building and optimizing global data integration processes. You will work closely with our data ingestion platform, writing and maintaining Scala code that feeds ETL (Extract, Transform, Load) pipelines, ensuring data consistency, and developing scalable solutions for integrating diverse datasets from multiple sources. Your work will help ensure high data quality, integrity, and accessibility for the entire enterprise across multiple use cases.
Moore is a data-driven constituent experience management (CXM) company achieving accelerated growth for clients through integrated supporter experiences across all platforms, channels and devices. We are an innovation-led company and the largest marketing, data and fundraising company in North America serving the purpose-driven industry, with clients across the education, association, political and commercial sectors.
Check out www.WeAreMoore.com for more information.
Your Impact:
- Design and implement data integration solutions by writing and optimizing Scala code to support scalable data ingestion and transformation pipelines.
- Develop automated processes that ensure the consistency of incoming data across multiple sources and its alignment with universal data standards.
- Implement automated scripts and processes in Scala to cleanse and standardize incoming data to maintain consistency across datasets.
- Continuously enhance the performance and scalability of data ingestion processes to handle growing data volume and new data types.
- Ensure accurate data transformation from raw input to structured formats suitable for downstream applications.
- Collaborate with cross-functional team members such as data architects, data scientists, data analysts, and operations teams to integrate new data sources and build efficient data conversion processes.
- Create comprehensive documentation of data integration solutions, best practices, and coding standards for both technical and non-technical audiences.
- Participate in Agile workflows involving large development efforts.
- Design and develop automated checks to ensure the completeness, accuracy, and validity of ingested data, identifying anomalies or inconsistencies as early as possible.
- Establish key performance indicators (KPIs) for data quality and implement processes to track and report these metrics regularly.
- Create robust validation frameworks that automatically detect and flag issues such as missing, duplicated, or incorrect data before it enters downstream systems, and implement fallback mechanisms for error handling to maintain data integrity (see the sketch after this list).
- Maintain a log of data quality issues, how they were addressed, and lessons learned to help improve future integration processes and ensure transparency across the organization.
- Conduct thorough testing and quality checks on new data sources before they are integrated into the data pipeline, ensuring alignment with internal data quality requirements.
- Collaborate with other teams to investigate and resolve data discrepancies between source systems and integrated data, ensuring any issues are quickly resolved.
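To give a feel for the validation work described above, here is a minimal Scala sketch of an automated check that flags missing, duplicated, or invalid records before they reach downstream systems. It is an illustration only: the `Donor` record, its fields, and the rules are hypothetical, since the posting does not describe Moore's actual schemas, platform, or tooling.

```scala
// Hypothetical record type; Moore's real schemas are not described in the posting.
final case class Donor(id: String, email: String, amount: BigDecimal)

sealed trait Issue { def recordId: String; def message: String }
final case class MissingField(recordId: String, message: String) extends Issue
final case class Duplicate(recordId: String, message: String)    extends Issue
final case class InvalidValue(recordId: String, message: String) extends Issue

object DonorValidator {

  // Flag records with blank required fields.
  def checkRequiredFields(records: Seq[Donor]): Seq[Issue] =
    records.collect {
      case d if d.id.trim.isEmpty    => MissingField(d.id, "id is blank")
      case d if d.email.trim.isEmpty => MissingField(d.id, "email is blank")
    }

  // Flag duplicate ids so each record is ingested only once.
  def checkDuplicates(records: Seq[Donor]): Seq[Issue] =
    records
      .groupBy(_.id)
      .collect { case (id, group) if group.size > 1 =>
        Duplicate(id, s"id appears ${group.size} times")
      }
      .toSeq

  // Flag obviously invalid values, e.g. negative gift amounts.
  def checkValues(records: Seq[Donor]): Seq[Issue] =
    records.collect {
      case d if d.amount < BigDecimal(0) => InvalidValue(d.id, s"negative amount ${d.amount}")
    }

  // Run all checks; an empty result means the batch may proceed downstream.
  def validate(records: Seq[Donor]): Seq[Issue] =
    checkRequiredFields(records) ++ checkDuplicates(records) ++ checkValues(records)
}

object Example extends App {
  val batch = Seq(
    Donor("1", "a@example.org", BigDecimal(25)),
    Donor("1", "a@example.org", BigDecimal(25)),  // duplicate id
    Donor("2", "",              BigDecimal(-10))  // blank email, negative amount
  )

  DonorValidator.validate(batch).foreach(issue => println(s"${issue.recordId}: ${issue.message}"))
}
```

In practice, checks like these would be wired into the ingestion pipeline so that flagged records are quarantined or routed through a fallback path rather than silently dropped.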
Your Profile:
How We'll Support You: