What are the responsibilities and job description for the Lead Data Engineer with State client experience Local to Sacramento CA position at Trail Blazer Consulting LLC?
Job Details
Lead Data Engineer
Sacramento, CA (Day 1 On-Site - The client will only consider candidates local to Sacramento, CA, or within a 1-hour driving distance)
2 Year Contract
Phone then Video Interview
Job Description:
We are looking for candidates with Unemployment Insurance (labor force) data experience on projects valued over $50 million, along with data conversion/migration project experience. Please do not submit candidates without this experience; these skills/experiences are mandatory.
1 A minimum of four (4) years of experience developing and monitoring data pipelines, ETL/ELT (Extract Transform Load/Extract Load Transform).
2 A minimum of five (5) years of experience developing in SQL and one (1) year of experience translating between different versions of SQL.
3 A minimum of five (5) years of experience developing with ETL tools, including SSIS, SAS DI Studio, AWS Glue, and AWS Lambda.
4 A minimum of five (5) years of experience developing REST APIs, including experience with JSON, C#, and Python.
5 A minimum of three (3) years of experience in database design, database development, and database security.
6 A minimum of three (3) years of experience working as a lead on data conversion and migration efforts involving legacy data formats and legacy data repositories, such as IBM Mainframe.
7 A minimum of three (3) years of experience with Microsoft Visual Studio and Git code repositories.
8 A minimum of three (3) years of experience with cloud technologies in AWS (e.g., Glue and Lambda) and Snowflake.
9 A minimum of five (5) years of experience working on large-scale modernization projects involving data conversion and migration efforts over $50 million in budget, designing and creating raw, integration, and presentation layers.
10 A minimum of three (3) years of experience working with Unemployment Insurance and Disability Insurance data.
11 A minimum of three (3) years of experience designing and optimizing data warehouses and data lakes.
12 A minimum of three (3) years of experience with Terraform, Kubernetes, Docker, and CI/CD pipelines.
13 A minimum of three (3) years of experience in automating workflows and data pipelines.
14 A minimum of three (3) years of experience identifying bugs, bottlenecks, and poor performance, and optimizing ETL/ELT pipelines.
15 A minimum of three (3) years of experience working with JSON, CSV, and Parquet files, APIs, and SQL.
16 Bachelor's Degree in computer science, engineering, information systems, math, or a technology-related field. Additional qualifying experience may be substituted for the required education on a year-for-year basis.
Salary: $50 - $60