What are the responsibilities and job description for the 529501194R2 Informatica Administrator and Data Engineer position at NextGen Information Services, Inc.?
HHSC IT is continuing to develop an HHS data integration hub with the goal of accomplishing the following:
- Implementation and configuration of the infrastructure for the data integration hub
- Design, development, and implementation (DD&I) of the data integration hub using an agile methodology for all standard SDLC phases, including but not limited to:
- Validation of performance metric requirements
- Creation of Epics / User Stories / Tasks
- Automation of data acquisition from a variety of data sources (see the sketch after this list)
- Development of complex SQL scripts
- Testing - integration, load and stress
- Deployment / publication internally and externally
- Operations support and enhancement of the data integration hub
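For illustration only, the sketch below shows the kind of scripted data acquisition and SQL validation the DD&I bullets describe, using only the Python standard library. The file name, table name, and columns are hypothetical and are not part of the posting; production work on the hub would be implemented with Informatica/IICS rather than ad hoc scripts.

```python
"""Illustrative only: a minimal data-acquisition and validation sketch.
File names, table names, and columns are hypothetical assumptions."""
import csv
import sqlite3
from pathlib import Path

SOURCE_FILE = Path("extracts/members_daily.csv")  # hypothetical flat-file extract
STAGING_DB = Path("staging.db")                    # hypothetical staging database


def load_flat_file(source: Path, db_path: Path) -> int:
    """Load a delimited extract into a staging table and return the row count."""
    conn = sqlite3.connect(db_path)
    try:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS stg_members ("
            "member_id TEXT, eff_date TEXT, program_code TEXT)"
        )
        with source.open(newline="") as handle:
            reader = csv.DictReader(handle)
            rows = [(r["member_id"], r["eff_date"], r["program_code"]) for r in reader]
        conn.executemany("INSERT INTO stg_members VALUES (?, ?, ?)", rows)
        conn.commit()
        # Basic data-quality check: flag rows missing the required key.
        missing = conn.execute(
            "SELECT COUNT(*) FROM stg_members "
            "WHERE member_id IS NULL OR member_id = ''"
        ).fetchone()[0]
        if missing:
            print(f"WARNING: {missing} rows loaded without a member_id")
        return len(rows)
    finally:
        conn.close()


if __name__ == "__main__":
    print(f"Loaded {load_flat_file(SOURCE_FILE, STAGING_DB)} rows into staging")
```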
This development effort will use an agile methodology based upon the approach currently in use at HHSC for the Texas Integrated Eligibility Redesign System (TIERS). As a member of the agile development team, the worker's responsibilities may include:
- ETL administration with a focus on data warehousing and business intelligence solutions.
- Experience with Informatica IICS administration, including installation, upgrades, and maintenance of the Informatica IICS environment.
- Knowledge of PowerExchange Change Data Capture (CDC) on both PowerCenter and IICS.
- Cloud experience and knowledge of Continuous Integration and Continuous Deployment (CI/CD) tools such as GitHub and Jenkins.
- Ability to think innovatively and automate administrative tasks using Python and shell scripting (a minimal sketch follows this list).
- Excellent analytical skills to triage and resolve production issues and outages.
- Proven expertise in designing, developing, and deploying ETL pipelines using industry-standard tools such as Informatica and IICS.
- Familiarity with a wide range of data sources: relational databases (e.g., Oracle, SQL Server, MySQL, Snowflake), flat files, and cloud platforms (e.g., AWS S3, Azure Blob Storage).
- Experience in data quality, validation, and data migration projects.
- Linux/Unix experience.
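As a hedged illustration of the Python and shell scripting automation mentioned above, the sketch below summarizes error lines across a directory of engine log files so an administrator can triage quickly. The log directory path and severity keywords are assumptions, not details from the posting, and would need to match the actual environment.

```python
"""Illustrative only: a minimal administrative-automation sketch in Python.
The log directory and error keywords are assumptions about the environment."""
import re
from collections import Counter
from pathlib import Path

LOG_DIR = Path("/opt/informatica/logs")           # hypothetical log location
ERROR_PATTERN = re.compile(r"\b(ERROR|FATAL)\b")  # assumed severity keywords


def summarize_errors(log_dir: Path) -> Counter:
    """Count ERROR/FATAL lines per log file for a quick triage report."""
    counts: Counter = Counter()
    for log_file in sorted(log_dir.glob("*.log")):
        with log_file.open(errors="replace") as handle:
            counts[log_file.name] = sum(
                1 for line in handle if ERROR_PATTERN.search(line)
            )
    return counts


if __name__ == "__main__":
    for name, count in summarize_errors(LOG_DIR).most_common():
        if count:
            print(f"{name}: {count} error lines")
```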
II. CANDIDATE SKILLS AND QUALIFICATIONS
Minimum Requirements:
Candidates that do not meet or exceed the minimum stated requirements (skills/experience) will be displayed to customers but may not be chosen for this opportunity.
Required:
- ETL Administrator with a focus on data warehousing and business intelligence solutions
- Experience with Informatica IICS administration, including installation, upgrades, and maintenance of the Informatica IICS environment
- Knowledge of PowerExchange Change Data Capture (CDC) on both PowerCenter and IICS
- Cloud experience and knowledge of Continuous Integration and Continuous Deployment (CI/CD) tools such as GitHub and Jenkins
- Ability to think innovatively and automate administrative tasks using Python and shell scripting
- Excellent analytical skills to triage and resolve production issues and outages
- Proven expertise in designing, developing, and deploying ETL pipelines using industry-standard tools such as Informatica and IICS
- Familiarity with a wide range of data sources: relational databases (e.g., Oracle, SQL Server, MySQL, Snowflake), flat files, and cloud platforms (e.g., AWS S3, Azure Blob Storage)
- Experience in data quality, validation, and data migration projects
- Linux/Unix experience
- Technical writing and diagramming skills, including proficiency with modeling and mapping tools (e.g., Visio, Erwin), the Microsoft Office Suite (Word, Excel, and PowerPoint), and MS Project
- Experience on an agile sprint team
- Experience with JIRA software
- Experience working with multiple teams concurrently, with the ability to prioritize and complete work on time and with high quality
- Knowledge of relational databases and data warehousing, including platforms such as Oracle, Snowflake, SQL Server, and MySQL
- Proficiency in SQL, Python, and Bash
- Familiarity with cloud ecosystems such as AWS, Azure, or GCP and their respective data services
- Data Quality & Modeling: knowledge of data quality frameworks and data modeling techniques

Preferred:
- Proven ability to write well-designed, testable, efficient code using best software development practices
- Understanding of security principles and how they apply to healthcare data
- Experience with state-of-the-art software components for a performance metrics data visualization or business intelligence environment
- Excellent oral and written communication skills
- Ability to effectively manage multiple responsibilities, prioritize conflicting assignments, and switch quickly between assignments as required
- Bachelor's degree in Computer Science, Information Systems, or Business, or equivalent experience
- Prior experience in the healthcare industry
- Prior experience with an HHS agency
- Prior experience working with PII or PHI data
- Prior experience with Azure