What are the responsibilities and job description for the Data Engineer III position at Pekin Insurance?
You spend at least a third of your day at your job. You might as well spend it doing something you really love while working with a team you really enjoy being with, right? That’s the kind of atmosphere we offer at Pekin Insurance—fun, fast-paced, gratifying, supportive, and collaborative.
Of course, it’s not all fun and games. Insurance is a serious business, and we pride ourselves on making people’s lives whole again after a major disaster or even a fender bender. It’s that sense of helping people that makes our team want to do our best every day.
If you want to be excited about starting your workday and are ready to make a real difference in people’s lives, this could be the right spot for you.
POSITION OVERVIEW
The Data Engineer is responsible for designing and implementing the data and analysis infrastructure, as well as determining the appropriate data management systems for analysis. The Data Engineer builds, maintains, and optimizes data pipelines, and moves these data pipelines effectively into production for key data consumers. The Data Engineer also provides data expertise when building and testing stories, features, and components, and participates in the development and management of application programming interfaces (APIs) that access key data sources. The Data Engineer works to guarantee compliance with data governance and data security requirements while creating, improving, and operationalizing integrated and reusable data pipelines to enable faster data access, integrated data reuse, and improved time-to-solution for data initiatives.
The Data Engineer contributes to the development of the team backlog and architectural runway, management of work in process (WIP) levels, and support of engineering aspects of program and solution Kanban. The Data Engineer may also participate in program increment planning, pre- and post-planning, system and solution demos, and inspect-and-adapt events. In addition to these typical responsibilities, the Data Engineer may be expected to serve as part of the Agile Development Team.
ESSENTIAL JOB FUNCTIONS
- Participates and plays an active role in all Agile Team activities and is accountable for regularly producing product increments that effectively contribute to solution features and/or components
- Works closely with product teams to define product requirements
- Performs physical design and develops/evaluates product requirements related to data
- Builds and maintains highly complex data management systems that combine core data sources into data warehouses or other accessible structures
- Manages data pipelines consisting of a series of stages through which data flows
- Drives automation through effective metadata management
- Learns and uses modern data preparation, integration, and artificial intelligence (AI)-enabled metadata management tools and techniques
- Performs data conversions, imports, and exports of data within and between internal and external software systems
- Ensures the development of programs to optimally extract, transform, and load data between complex data sources
- Creates data transformation processes (extract, transform, load (ETL), SQL stored procedures, etc.) to support the most complex business systems and operational data flows of critical importance to the organization (a minimal ETL sketch follows this list)
- Contributes to the design and management of APIs
- Designs, implements, and reviews processes to ensure data integrity and standardization
- Updates the data dictionary
- Assists in maintaining the quality of the metadata repository by adding, modifying, and deleting data
- Recommends and implements data reliability, efficiency, and quality improvements
- Ensures the collected data is within required quality standards
- Resolves highly complex conflicts between models, ensuring that data models are consistent with the enterprise model (e.g., entity names, relationships, and definitions)
- Reviews documentation of new and existing models, solutions, and implementations such as Data Mapping, Technical Specifications, Production Support, Data Dictionaries, Test Cases, etc.
- Ensures the troubleshooting, diagnoses, documentation, and resolution of escalated support problems
- Ensures support of innovative efforts by driving creativity, acting with agility, and thinking outside current boundaries
- Evaluates services provided by vendors and recommends changes
- Uses technology to implement automation and orchestration
- Performs other duties as assigned
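To make the data transformation responsibility above concrete, here is a minimal ETL sketch using only Python's standard library. The table names, columns, and standardization rules (claims_raw, claims_clean, upper-cased state codes) are hypothetical illustrations, not Pekin Insurance's actual systems or tooling.

```python
# Minimal ETL sketch: extract raw claim rows, standardize them, load them
# into a target table. All names here are hypothetical examples.
import sqlite3

def run_etl(source_db: str, target_db: str) -> int:
    """Extract, transform, and load claim rows; return the number loaded."""
    src = sqlite3.connect(source_db)
    tgt = sqlite3.connect(target_db)

    # Extract: pull raw rows from the source system.
    rows = src.execute(
        "SELECT claim_id, claim_amount, claim_state FROM claims_raw"
    ).fetchall()

    # Transform: coerce amounts to float, trim and upper-case state codes,
    # and drop rows that are missing an identifier.
    cleaned = [
        (claim_id, float(amount or 0.0), (state or "").strip().upper())
        for claim_id, amount, state in rows
        if claim_id is not None
    ]

    # Load: write the standardized rows into the target table.
    tgt.execute(
        "CREATE TABLE IF NOT EXISTS claims_clean "
        "(claim_id INTEGER PRIMARY KEY, claim_amount REAL, claim_state TEXT)"
    )
    tgt.executemany(
        "INSERT OR REPLACE INTO claims_clean VALUES (?, ?, ?)", cleaned
    )
    tgt.commit()
    src.close()
    tgt.close()
    return len(cleaned)
```

In practice, a step like this would run inside an orchestrated, monitored pipeline with logging and data-quality checks rather than as a standalone script.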
EDUCATION & EXPERIENCE
Required
- Bachelor’s degree in Computer Science, Computer Engineering, or a related discipline (preferred), or equivalent relevant experience
Preferred or Specialized
- Experience in an agile environment strongly preferred
- Relevant certification is strongly preferred
CERTIFICATIONS & LICENSES
N/A
KNOWLEDGE, SKILLS & ABILITIES
In-depth ability to:
- Learn and use advanced analytics tools for object-oriented/object function scripting
- Work across multiple deployment environments including cloud and on-premises, and multiple operating systems
- Work with popular data discovery, analytics, and business intelligence (BI) software tools for semantic-layer-based data discovery
- Collaborate with both business and IT stakeholders
- Use judgment to form conclusions that may challenge conventional wisdom
- Consistently apply original thinking to produce new ideas and innovate
Skills Required:
- Typically requires 5-7 years of experience in coding for data management, data warehousing, or other data environments, including but not limited to SAS, SQL, ETL, Python, Snowflake, and Workday Prism Analytics
- Strong written and oral communication skills required.
Good to Have Skills:
- 8 years of experience in coding for data management, data warehousing, or other data environments, including but not limited to SAS, SQL, ETL, Python, Snowflake, and Workday Prism Analytics
- 3-7 years of experience as a developer with top-quadrant business intelligence and advanced analytics tools
- 1-3 years of experience working in the insurance industry
In-depth knowledge of:
- Data management disciplines including data integration, modeling, optimization, and data quality, and/or other areas directly relevant to data engineering responsibilities and tasks
- Popular database programming languages for relational and non-relational databases
- Representational state transfer (RESTful) API design (see the sketch after this list)
- Agile methodologies, and the application of DevOps principles to data pipelines to improve the communication, integration, reuse, and automation of data flows
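As a rough illustration of the RESTful API design knowledge referenced above, the sketch below exposes a single read-only data endpoint. Flask, the warehouse.db file, and the policies table are assumptions chosen for brevity; this posting does not name a specific framework or data source.

```python
# Illustrative RESTful read endpoint over a (hypothetical) warehouse extract.
import sqlite3
from flask import Flask, abort, jsonify

app = Flask(__name__)
DB_PATH = "warehouse.db"  # hypothetical local extract of a data warehouse

@app.route("/policies/<int:policy_id>", methods=["GET"])
def get_policy(policy_id: int):
    """Return one policy record as JSON, or 404 if it does not exist."""
    conn = sqlite3.connect(DB_PATH)
    conn.row_factory = sqlite3.Row
    row = conn.execute(
        "SELECT policy_id, holder_name, premium FROM policies WHERE policy_id = ?",
        (policy_id,),
    ).fetchone()
    conn.close()
    if row is None:
        abort(404)
    return jsonify(dict(row))

if __name__ == "__main__":
    app.run(debug=True)
```

A request such as GET /policies/42 would return the matching record as JSON, or a 404 response if no such policy exists.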
Salary Range:
- $98,000 - $124,000 per year
- This range is based on the expected level of experience and skills for this position. Final compensation will depend on individual qualifications.
- This position is bonus eligible
Benefits:
- Health, Dental and Vision Insurance
- Generous 401(k) with company match
- Paid Time Off (PTO) with Paid Holidays
- Flexible/Hybrid Work Schedule
- Paid Volunteer Program
For more information about the benefits we offer, please visit our Careers Page.