What are the responsibilities and job description for the Urgent Need Cloud ETL Developer - Local to Richmond, Virginia position at Vinsys Information Technology Inc?
Hope you're doing well. We have an open position for a Cloud ETL Developer. Please see the details below and let me know your interest. If interested, please share a copy of your resume to hr@vinsysinfo.com along with your salary / rate expectations and the best time to reach you.
Role : Cloud ETL Developer
Client : VDOT State of VA
Location : Richmond, VA (Hybrid)
Job ID : 752580
Interview Mode : Both Phone and In Person
- Local Richmond, VA candidates ONLY, due to the onsite requirement
- This position requires onsite work 3 days a week, with 2 days remote
Job Description : Sr. ETL Developer
The Virginia Department of Transportation (VDOT) Information Technology Division (ITD) is seeking a Master Data Analyst with demonstrated experience in data analytics to work as a key member of the Enterprise Data Asset team. This analyst will support teams working in Agile (Sprint) to analyze datasets to be made available in a cloud-based data management platform that will help the agency produce master data with data governance.
Responsibilities include analyzing source systems, which contain a spatial component, for candidate datasets; documenting business processes and the data lifecycle; developing data requirements, user stories, and acceptance criteria; and developing testing strategies. The developer will build ETL processes to extract business and spatial data and load them into a data warehousing environment, design and test the performance of the system, and consult with various teams to understand the agency's data storage needs and develop data warehousing options. The role requires deep knowledge of coding languages such as Python, Java, XML, and SQL, and the developer should be well-versed in warehousing architecture techniques such as MOLAP, ROLAP, ODS, DM, and EDW.
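For illustration only (not part of the client's posting) : in its simplest form, the ETL work described above might look like the following Python sketch, which extracts a business dataset with a spatial component from a source database and loads it into a warehouse staging table. All connection strings, table names, and column names below are hypothetical placeholders, not actual VDOT systems.

import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection strings; a real pipeline would read these from
# configuration or a secrets store.
SOURCE_URL = "mssql+pyodbc://user:pass@source-server/SourceDB?driver=ODBC+Driver+17+for+SQL+Server"
TARGET_URL = "mssql+pyodbc://user:pass@dw-server/DataHub?driver=ODBC+Driver+17+for+SQL+Server"

def run_etl() -> None:
    source = create_engine(SOURCE_URL)
    target = create_engine(TARGET_URL)

    # Extract: read candidate records (latitude/longitude as the spatial
    # component) from a placeholder source table.
    df = pd.read_sql(
        "SELECT asset_id, route_name, latitude, longitude, updated_at FROM dbo.RoadAssets",
        source,
    )

    # Transform: standardize the business key, drop rows missing the spatial
    # component, and stamp the load time for basic lineage.
    df["route_name"] = df["route_name"].str.strip().str.upper()
    df = df.dropna(subset=["latitude", "longitude"])
    df["load_ts"] = pd.Timestamp.now(tz="UTC")

    # Load: append to a staging table in the warehouse for downstream modeling.
    df.to_sql("stg_road_assets", target, schema="staging", if_exists="append", index=False)

if __name__ == "__main__":
    run_etl()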
VDOT is a fast-paced organization with very high standards for work quality and efficiency. The person in this position is expected to handle multiple projects and remain flexible and productive despite changing priorities and processes. Ongoing improvement and efficiency are part of our culture, and each team member is expected to contribute proactively to process improvements.
Responsibilities :
Work with the Project team members and business stakeholders to understand business processes and pain points
Develop expertise in source system datasets and data lifecycle
Profile source data which may contain a spatial component; review source data and compare content and structure to dataset requirements; identify conflicts and determine recommendations for resolution
Conduct entity resolution to identify matching, merging, and semantic conflicts
Elicit, record, and manage metadata
Diagram current processes and proposed modifications using process flows, context diagrams and data flow diagrams
Decompose requirements into Epics and Features and create clear and concise user stories that are easy for technical staff to understand and implement
Utilize progressive elaboration; map stories to data models and architectures to be used by internal staff to facilitate master data management
Identify and group related user stories into themes, document dependencies and associated business processes
Discover and document requirements and user stories with a focus on improving both business and technical processing
Assist Product Owner in maintaining the product backlog
Create conceptual prototypes and mock-ups
Collaborate with staff, vendors, consultants, and contractors as they are engaged on tasks to formulate, detail and test potential and implemented solutions
Perform Quality Analyst functions such as defining test objectives, test plans and test cases, and executing test cases
Coordinate and facilitate User Acceptance Testing with the business and ensure Project Managers / Scrum Masters are informed of progress
Design and develop systems for the maintenance of the Data Asset Program (Data Hub), ETL processes (including ETL for spatial data), and business intelligence.
Develop new data engineering processes that leverage a new cloud architecture, and extend or migrate existing data pipelines to that architecture as needed.
Design and support the data warehouse database and table schemas for new and existing data sources for the data hub and warehouse; design and develop Data Marts.
Work closely with data analysts, data scientists, and other data consumers within the business to gather requirements and populate data hub and data warehouse table structures optimized for reporting.
Partner with the Data Modeler and Data Architect to refine the business's data requirements that must be met for building and maintaining Data Assets.
Qualifications Required :
Preferred Skills :
Computer Skills / MS Office / Software :
Skill | Required / Desired | Amount of Experience
Designs and develops systems for the maintenance of the Data Asset Program, ETL processes, and business intelligence. | Required | Years
Designs and supports the DW database and table schemas for new and existing data sources for the data hub and warehouse; design and development of Data Marts. | Required | Years
Works closely with data analysts, data scientists, and other data consumers within the business to gather and populate data hub and data warehouse table structures. | Required | Years
Advanced understanding of data integrations; strong knowledge of database architectures; strong understanding of ingesting spatial data. | Required | Years
Ability to negotiate and resolve conflicts; ability to effectively prioritize and handle multiple tasks and projects. | Required | Years
Excellent computer skills and high proficiency in the use of MS Word, PowerPoint, MS Excel, MS Project, MS Visio, and MS Team Foundation Server. | Required | Years
Experience with key data warehousing architectures including Kimball and Inmon, and broad experience designing solutions using a broad set of da | Required | Years
Expertise in Data Factory v2, Data Lake Store, Data Lake Analytics, Azure Analysis Services, Azure Synapse. | Required | Years
IBM DataStage, Erwin, SQL Server (SSIS, SSRS, SSAS), Oracle, T-SQL, Azure SQL Database, Azure SQL Data Warehouse. | Required | Years
Operating system environments (Windows, Unix, etc.); scripting experience with Windows and / or Python, Linux shell scripting. | Required | Years
Experience in Azure cloud engineering. | Required | Years
Question 1
Commonwealth of Virginia security policies prohibit the use of offshore IT contractors. Do you attest to the fact that your candidate will physically reside within the US for the duration of the assignment?
Question 2
Please list candidate's email address.
Question 3
In what city and state does your candidate PERMANENTLY reside?
Question 4
Is the candidate available to work in Richmond, VA at least 3 days / wk? This is REQUIRED.
Question 5
How soon after an offer can your candidate start?