What are the responsibilities and job description for the Senior Data Engineer position at Steneral Consulting?
Hybrid work schedule - 2-3 days on site in Milwaukee.
Must be either local or willing to relocate prior to start; locals preferred.
A valid LinkedIn profile is required.
Key Technical Skills
- Snowflake (2 years)
- Data warehousing architecture (8 years)
- Strong SQL (8 years)
- Strong Python (5 years)
- Power BI (1 year)
- Experience with AI/ML/genAI (2 years)
- DevOps/DataOps experience (1 year)
The Role
The Senior Data Engineer will focus on quality engineering best practices to meet and exceed internal and external client expectations. In this position, you will analyze, design, develop, test and document solutions supporting data integration, performance tuning, and data modeling to drive organization growth objectives. The Senior Data Engineer will define the standards for data architecture, platform architecture, and data quality and governance. This role is responsible for ensuring that the function is aligned with the overall CPI organization and continuously works to meet critical service levels in access, delivery and security.
What You Get To Do Everyday
- Co-architect next-gen cloud data analytics platform.
- Increase operating efficiency and adapt to new requirements.
- Monitor and maintain the health of solutions generated.
- Support and enhance our data-ops practices.
- Provide task breakdowns, identify dependencies, and provide effort estimates.
- Model data warehouse entities in Erwin.
- Build data transformation pipelines with dbt (data build tool).
- Evaluate the latest technological trends and develop proof-of-concept prototypes that align with CPI opportunities.
- Develop positive relationships with clients, stakeholders, and internal teams.
- Understand business goals, drivers, context, and processes to suggest technology solutions that improve CPI.
- Work collaboratively on creative solutions with engineers, product managers, and analysts in an agile-like environment.
- Perform design and code reviews.
- Perform other position-related duties as assigned.
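The dbt responsibility above can be sketched as a single model file. The model and source names (`fct_orders`, `stg_orders`, `stg_order_items`) are hypothetical; `ref()` is dbt's standard macro for declaring dependencies between models:

```sql
-- models/marts/fct_orders.sql (hypothetical model and file names)
-- Aggregates staged order items into one row per order.
select
    o.order_id,
    o.customer_id,
    o.order_date,
    sum(i.line_amount) as order_total
from {{ ref('stg_orders') }} as o
join {{ ref('stg_order_items') }} as i
    on i.order_id = o.order_id
group by 1, 2, 3
```

Running `dbt run` compiles the `ref()` calls into fully qualified table names and materializes the model in the target warehouse (e.g., Snowflake).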
Qualifications
- Bachelor’s degree in computer engineering, computer science, data science, or related field
- Seven years or more of experience working with data modeling, architecture and engineering
- Two years or more of experience designing and implementing data warehouses in Snowflake
- Experience with all core software development activities, including requirements gathering, design, construction, and testing
- Experience performing data transformation using DBT
- Experience working with DQ products such as Monte Carlo, BigEye, or Great Expectations
- Experience with Azure DevOps (Repos, Pipelines, Boards, Wiki, Test Plans)
- Experience with formal software development methodologies including Software Development Life Cycle (SDLC), Agile or SCRUM
- Experience building high-performance and highly reliable data pipelines
- Knowledge of data warehouse design patterns (star schema, data vault)
- Experience building dashboards with business intelligence tools
- Knowledge of DataOps
- Experience with cloud-based compute, storage, integration, and security patterns
- Knowledge and understanding of RESTful APIs
- Knowledge of current data engineering trends, best practices, and standards
- Knowledge of SQL and Python
- Ability to work in a collaborative environment
- Ability to facilitate evaluation of technologies and achieve consensus on technical standards and solutions among a diverse group of information technology professionals
- Ability to work in an organization driven by continuous improvement or with an equivalent focus on process improvement
- Ability to manage multiple competing priorities and attain the best possible outcomes for the organization
- Excellent verbal and written communication and effective listening skills
- Experience in delivering an end-to-end data analytics platform using modern data stack components
- Experience working with artificial intelligence (AI) and machine learning (ML)
- SnowPro Advanced Certification
- dbt Analytics Engineering Certification
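As a rough illustration of the data-quality qualification above: the checks that products like Monte Carlo, BigEye, and Great Expectations formalize reduce to assertions over a dataset. This is a minimal plain-Python sketch with hypothetical column names and rules, not the API of any of the named tools:

```python
# Minimal data-quality sketch; column names and thresholds are hypothetical.

def expect_no_nulls(rows, column):
    """True when every row has a non-null value in `column`."""
    return all(row.get(column) is not None for row in rows)

def expect_values_between(rows, column, low, high):
    """True when every value in `column` falls within [low, high]."""
    return all(low <= row[column] <= high for row in rows)

orders = [
    {"order_id": 1, "amount": 25.0},
    {"order_id": 2, "amount": 110.5},
]

checks = {
    "order_id not null": expect_no_nulls(orders, "order_id"),
    "amount in range": expect_values_between(orders, "amount", 0, 10_000),
}
print(checks)  # both checks pass on this sample
```

Real DQ products add scheduling, anomaly detection, and alerting on top of this kind of rule, but the underlying expectations look much like these two functions.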