What are the responsibilities and job description for the Cloud ETL Developer position at LingaTech?
Location: Richmond, VA - local candidates only
Position Type: Hybrid, 3 days per week onsite
Contract Length: 5 months
Position Overview:
This role requires a highly skilled data professional with 10 years of experience in data analysis, ETL processes, and cloud-based data management. The position involves developing and maintaining enterprise data assets, working collaboratively in an Agile environment, and leveraging advanced technologies like Azure Databricks, SQL Server, and ESRI ArcGIS to support the agency's data governance and reporting needs.
Duties:
- Work with the Project team members and business stakeholders to understand business processes and pain points
- Develop expertise in source system datasets and data lifecycle
- Profile source data which may contain a spatial component; review source data and compare content and structure to dataset requirements; identify conflicts and determine recommendations for resolution
- Conduct entity resolution to identify matching, merging, and semantic conflicts
- Elicit, record, and manage metadata
- Diagram current processes and proposed modifications using process flows, context diagrams and data flow diagrams
- Decompose requirements into Epics and Features and create clear and concise user stories that are easy to understand and implement by technical staff
- Utilize progressive elaboration; map stories to data models and architectures to be used by internal staff to facilitate master data management
- Identify and group related user stories into themes, document dependencies and associated business processes
- Discover and document requirements and user stories with a focus on improving both business and technical processing
- Assist Product Owner in maintaining the product backlog
- Create conceptual prototypes and mock-ups
- Collaborate with staff, vendors, consultants, and contractors as they are engaged on tasks to formulate, detail and test potential and implemented solutions
- Perform Quality Analyst functions such as defining test objectives, test plans and test cases, and executing test cases
- Coordinate and facilitate User Acceptance Testing with the business, and keep Project Managers/Scrum Masters informed of progress
- Design and develop systems for the maintenance of the Data Asset Program (Data Hub), ETL processes (including ETL processes for spatial data), and business intelligence
- Develop new data engineering processes that leverage a new cloud architecture, and extend or migrate existing data pipelines to that architecture as needed
- Design and support the data warehouse database and table schemas for new and existing data sources for the data hub and warehouse; design and develop Data Marts
- Work closely with data analysts, data scientists, and other data consumers within the business to gather and populate data hub and data warehouse table structures optimized for reporting
- Partner with the Data Modeler and Data Architect to refine the business's data requirements, which must be met for building and maintaining Data Assets
Required Skills:
- 10 years of experience designing and developing systems for the Data Asset Program, ETL processes, and business intelligence maintenance
- 10 years of experience designing and supporting data warehouse databases and table schemas for new and existing data sources, including the design and development of Data Marts
- 10 years of data integration experience working closely with data analysts, data scientists, and other data consumers to gather and populate data hub and data warehouse table structures
- 10 years of hands-on experience with, and an advanced understanding of, data integrations
- 10 years of specialized experience with database architectures, including a strong understanding of ingesting spatial data
- 10 years of professional experience negotiating and resolving conflicts
- 10 years of experience effectively prioritizing and handling multiple tasks and projects
- 10 years of consistent usage of, and high proficiency in, MS Word, PowerPoint, MS Excel, MS Project, MS Visio, and MS Team Foundation Server
- 10 years of broad experience with key data warehousing architectures, including Kimball and Inmon, and designing solutions using diverse data stores
- 10 years of implementation experience with Data Factory v2, Data Lake Store, Data Lake Analytics, Azure Analysis Services, and Azure Synapse
- 10 years of comprehensive knowledge of IBM DataStage, Erwin, SQL Server (SSIS, SSRS, SSAS), Oracle, T-SQL, Azure SQL Database, and Azure SQL Data Warehouse
- 10 years of scripting experience across operating system environments (Windows, Unix, etc.), including Windows and/or Python and Linux shell scripting
- 10 years of experience in Azure cloud engineering