What are the responsibilities and job description for the Data Engineer position at Cordova?
Position Title : Sr. Data Engineer - Analytics
Department : Data
Location(s) : Omaha, NE
Purpose :
We are hiring a Sr. Data Engineer in our Enterprise Technology Group. The selected candidate will design, develop, test, and maintain the scripts and jobs required to extract, transform, clean, and move data and metadata for loading into a data warehouse, data mart, or operational data store. The role also mentors less experienced designers and serves as the technical lead on mid-sized data projects.
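For context on the kind of work described above, an extract-transform-load (ETL) job typically pulls rows from a source system, cleans and deduplicates them, and writes them into a warehouse target table. The following is a minimal, hypothetical Python sketch of that pattern; the table name, columns, and sample data are illustrative only (SQLite stands in for a real warehouse such as Snowflake):

```python
import csv
import sqlite3
from io import StringIO

# Hypothetical source data; a real job would pull from an operational system.
SOURCE_CSV = """order_id,customer,amount
1,Acme, 125.50
2,Globex,80.00
2,Globex,80.00
"""

def extract(raw: str) -> list[dict]:
    """Read raw rows from the (stand-in) source system."""
    return list(csv.DictReader(StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Clean: trim whitespace, cast types, drop duplicate order_ids."""
    seen, cleaned = set(), []
    for r in rows:
        oid = int(r["order_id"])
        if oid in seen:
            continue  # skip duplicate source rows
        seen.add(oid)
        cleaned.append((oid, r["customer"].strip(), float(r["amount"])))
    return cleaned

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Load cleaned rows into the warehouse target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS fact_orders "
        "(order_id INTEGER PRIMARY KEY, customer TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)", rows)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(transform(extract(SOURCE_CSV)), conn)
    print(conn.execute("SELECT COUNT(*) FROM fact_orders").fetchone())  # (2,)
```

In production, each of these stages would typically be a separately scheduled, monitored, and restartable job step rather than a single script.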
Responsibilities :
Facilitate Agile Ceremonies : Lead and facilitate Scrum events such as Sprint Planning, Daily Scrums, Sprint Reviews, and Sprint Retrospectives.
Remove Impediments : Proactively identify and address any obstacles or challenges that hinder the team's progress, escalating issues as needed.
Coach and Mentor : Guide the team and organization in adopting and improving Agile practices, fostering a culture of self-organization and accountability.
Promote Collaboration : Encourage effective communication and collaboration within the team and with stakeholders.
Enforce Scrum Principles : Ensure the team adheres to Scrum values, principles, and practices.
Support the Product Owner : Assist the Product Owner in maintaining the product backlog and maximizing value.
Track Progress : Monitor team performance and progress, using metrics to identify areas for improvement.
Continuous Improvement : Facilitate retrospectives and implement action plans to continuously improve team processes and effectiveness.
Stakeholder Management : Communicate effectively with stakeholders, providing transparency and managing expectations.
Experience, Education, Skills :
Bachelor’s degree in Computer Science, MIS, or a related field.
8 years of IT experience, including at least 6 years of progressive experience in data warehousing, relational database management systems, and multidimensional database management systems.
6 years of SQL experience.
4 years of experience using Python or shell scripting.
2 years of experience with Snowflake and DBT.
2 years of experience developing or deploying data solutions in a public cloud such as AWS, Google Cloud, or Azure (required).
Demonstrated data engineering experience, including 8 years using ETL tools (e.g., Informatica, DataStage, or DBT).
Demonstrated knowledge of data warehouse design and of data population techniques for target structures (star schemas, snowflake schemas, etc.); a brief illustration follows this list.
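As a point of reference for the last requirement above: a star schema centers a fact table on foreign keys into denormalized dimension tables, while a snowflake schema further normalizes the dimensions. The sketch below, in Python with embedded SQL DDL, uses illustrative table and column names (not taken from the posting); SQLite again stands in for the warehouse:

```python
import sqlite3

# Illustrative star schema: one fact table referencing two dimensions.
# In a snowflake schema, dim_product would be normalized further
# (e.g., category split out into its own dim_category table).
DDL = """
CREATE TABLE dim_date (
    date_key   INTEGER PRIMARY KEY,   -- e.g., 20240131
    full_date  TEXT,
    month      INTEGER,
    year       INTEGER
);
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    name        TEXT,
    category    TEXT                  -- denormalized in a star schema
);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity    INTEGER,
    revenue     REAL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)

# Typical population order: dimensions first, then facts keyed to them.
conn.execute("INSERT INTO dim_date VALUES (20240131, '2024-01-31', 1, 2024)")
conn.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
conn.execute("INSERT INTO fact_sales VALUES (20240131, 1, 3, 29.97)")

# Analytical queries join the fact table out to its dimensions.
row = conn.execute(
    "SELECT d.year, p.category, SUM(f.revenue) "
    "FROM fact_sales f "
    "JOIN dim_date d ON f.date_key = d.date_key "
    "JOIN dim_product p ON f.product_key = p.product_key "
    "GROUP BY d.year, p.category"
).fetchone()
print(row)  # (2024, 'Hardware', 29.97)
```

The load order shown (dimensions before facts) is the standard population technique the requirement refers to, since fact rows depend on dimension keys already being present.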