What are the responsibilities and job description for the ETL Data Solutions Engineer position at Anchor Point Technology Resources?
Job Description
We are looking for a highly skilled and motivated Senior Data/BI Engineer with expertise in AWS cloud technologies to support automation, orchestration, and ETL processes across the entire data lifecycle, including transformation, management, retention, security, performance tuning, and SLA management. The Data/BI Engineer will collaborate with data owners, technical teams, and business units to identify key metrics and insights and to develop reporting and visualization solutions. Responsibilities include requirements gathering, system design, data process development, and data delivery and visualization to meet business needs.
Key Responsibilities:
• Design and develop scalable cloud data pipelines using AWS technologies, such as Glue, for extracting, transforming, and loading (ETL) data from various sources into data lakes or warehouses (see the illustrative sketch after this list).
• Create and maintain data models to support efficient querying and reporting, ensuring data integrity and consistency.
• Integrate data from multiple sources, including relational databases, APIs, and unstructured data, to provide a unified view for analysis.
• Develop interactive dashboards and reports in Microsoft Power BI to present data insights to stakeholders and facilitate informed decision-making.
• Optimize ETL processes and Power BI reports for performance, ensuring data availability and responsiveness.
• Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs.
• Demonstrate proficiency in SQL and scripting languages such as Python or Scala.
• Build data visualizations to effectively illustrate trends, patterns, and outliers.
• Monitor and refine data pipelines and visualization tools to meet SLAs and delivery expectations.
• Provide timely updates to customers, business units, and staff regarding projects and tasks.
• Conduct proactive data reviews to ensure the quality and accuracy of reporting solutions.
• Ensure data processing and storage operations are conducted efficiently and in compliance with security guidelines.
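To give a concrete sense of the pipeline work described in the first responsibility above, the following is a minimal sketch of an AWS Glue ETL job script in Python (PySpark). It assumes a source table already registered in the Glue Data Catalog and an S3 data lake target; the database, table, column, and path names are illustrative assumptions, not details from this posting.

# Minimal AWS Glue ETL job sketch (runs inside a Glue job environment).
# The catalog database "claims_raw", table "claims", column mappings, and
# S3 output path are hypothetical placeholders.
import sys
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract: read the source table registered in the Glue Data Catalog.
source = glue_context.create_dynamic_frame.from_catalog(
    database="claims_raw", table_name="claims"
)

# Transform: keep and rename only the columns the reporting layer needs.
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[
        ("claim_id", "string", "claim_id", "string"),
        ("member_id", "string", "member_id", "string"),
        ("paid_amount", "double", "paid_amount", "double"),
    ],
)

# Load: write Parquet to the data lake for downstream querying and reporting.
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-data-lake/claims/curated/"},
    format="parquet",
)

job.commit()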
Preferred Skills, Capabilities and Experiences:
• Over 5 years of experience in designing, developing, testing, and implementing enterprise ETL solutions, as well as process automation and orchestration.
• Proficiency with cloud data processing platforms and technologies, including AWS data pipeline toolsets.
• Expertise in data process orchestration and the design and development of near-real-time and batch data pipelines.
• Skilled in programming languages such as Python, Scala, SQL, R, and T-SQL.
• Ability to analyze, troubleshoot, and optimize SQL queries, with the capability to recommend improvements.
• Experience in analyzing and monitoring data processing resources and implementing proactive alerts and notifications based on SLAs (see the monitoring sketch after this list).
• Relevant certifications in data engineering/development platforms and related technologies.
• Experience in the healthcare claims processing industry with an understanding of data security and privacy considerations.
• Strong critical thinking and problem-solving skills.
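As an illustration of the SLA monitoring and alerting item above, here is a minimal Python sketch that publishes a pipeline run's duration as a custom CloudWatch metric via boto3 and configures an alarm that notifies an SNS topic when runtime exceeds an SLA threshold. The namespace, metric, pipeline, and topic names are hypothetical and chosen only for illustration.

# Minimal SLA-alerting sketch using boto3 and CloudWatch. The namespace
# "DataPipelines", metric "RunDurationSeconds", pipeline name, threshold,
# and SNS topic ARN are hypothetical placeholders.
import boto3

cloudwatch = boto3.client("cloudwatch")

def report_run_duration(pipeline_name: str, duration_seconds: float) -> None:
    # Publish one pipeline run's duration as a custom metric data point.
    cloudwatch.put_metric_data(
        Namespace="DataPipelines",
        MetricData=[{
            "MetricName": "RunDurationSeconds",
            "Dimensions": [{"Name": "PipelineName", "Value": pipeline_name}],
            "Value": duration_seconds,
            "Unit": "Seconds",
        }],
    )

def ensure_sla_alarm(pipeline_name: str, sla_seconds: float, sns_topic_arn: str) -> None:
    # Create or update an alarm that fires when a run breaches its SLA.
    cloudwatch.put_metric_alarm(
        AlarmName=f"{pipeline_name}-sla-breach",
        Namespace="DataPipelines",
        MetricName="RunDurationSeconds",
        Dimensions=[{"Name": "PipelineName", "Value": pipeline_name}],
        Statistic="Maximum",
        Period=300,
        EvaluationPeriods=1,
        Threshold=sla_seconds,
        ComparisonOperator="GreaterThanThreshold",
        AlarmActions=[sns_topic_arn],
    )

# Example usage (hypothetical values):
# report_run_duration("claims-nightly-etl", 1830.0)
# ensure_sla_alarm("claims-nightly-etl", 3600.0,
#                  "arn:aws:sns:us-east-1:123456789012:data-alerts")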