What are the responsibilities and job description for the Azure Data Engineer position at H & R COMPUTER CONSULTING SERVICES?
Benefits:
- Company parties
- Competitive salary
- Free food & snacks
Role: Azure Data Engineer
Location: Washington, DC
Client: World Bank (W2 only)
Job Summary
The Technical Specialist will be responsible for overseeing Azure Data Factory (ADF), Azure Databricks, SQL, and Python processes. They will play a key role in designing, developing, and maintaining data pipelines and ETL processes, leveraging the Azure cloud platform and related technologies to ensure optimized data transformation and storage.
Key Responsibilities
1. Design, develop, and implement data pipelines using Azure Data Factory (ADF) to move data between various sources and data warehouses.
2. Utilize Azure Databricks for data transformations and analytics to derive valuable insights from data (see the sketch after this list).
3. Write and optimize SQL queries to extract, manipulate, and analyze data from multiple sources.
4. Develop and maintain Python scripts for data processing and automation tasks.
5. Collaborate with cross-functional teams to understand data requirements and ensure data solutions meet business needs.
6. Monitor data pipeline performance and troubleshoot issues to ensure data accuracy and reliability.
7. Stay current with industry trends and best practices related to Azure Data Factory, Azure Databricks, SQL, and Python.
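To make responsibilities 1 and 2 concrete, here is a minimal sketch of a Databricks (PySpark) transformation of the kind an ADF pipeline might trigger. The storage paths, table layout, and column names are hypothetical placeholders, not details from this role.
```python
# Minimal PySpark sketch of a Databricks transformation step.
# All paths and column names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-rollup").getOrCreate()

# Read raw records landed by an upstream ADF copy activity (assumed Delta path).
raw = spark.read.format("delta").load("/mnt/raw/transactions")

# Clean and aggregate: drop malformed rows, then roll up amounts per day.
daily = (
    raw.filter(F.col("amount").isNotNull())
       .withColumn("txn_date", F.to_date("txn_ts"))
       .groupBy("txn_date")
       .agg(
           F.sum("amount").alias("total_amount"),
           F.countDistinct("txn_id").alias("txn_count"),
       )
)

# Write the curated result for downstream reporting (assumed Delta path).
daily.write.format("delta").mode("overwrite").save("/mnt/curated/daily_totals")
```
In practice the read and write locations would typically arrive as pipeline parameters passed in by ADF, which keeps the notebook reusable across environments.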
Skill Set Requirements:
- Experience working in the asset management business domain.
- Familiarity with change management tools such as Azure DevOps.
- Familiarity with audit requirements for financial applications.
- Knowledge of Unix shell scripts.
- Demonstrated proficiency in writing effective test cases.
- Database Management: Familiarity with database management systems such as SQL, Oracle, and Postgres; may be required for roles involving data feeds or integrations.
- Programming Languages: Basic knowledge of languages like Python or R, plus Bash/Linux shell scripting.
- Azure Cloud Services: Understanding of core Azure services and solutions, including Azure Database for PostgreSQL, Azure Key Vault, Azure Active Directory (AD), Azure DevOps, and Kubernetes containers.
- API Development: Understanding APIs (Application Programming Interfaces) lets you interact with external market data sources programmatically, pulling data into your applications (see the sketch after this list).
- Work with Large Datasets: Python libraries make it easier to handle and manipulate large volumes of market data efficiently.
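As an illustration of the last two points, here is a minimal Python sketch that pulls market data from a REST API and loads it into pandas for manipulation. The endpoint URL, query parameters, and response shape are all assumptions made for the example, not a real vendor's API.
```python
# Illustrative only: fetch market data from a hypothetical REST endpoint
# and load it into pandas for manipulation.
import requests
import pandas as pd

resp = requests.get(
    "https://api.example.com/v1/prices",        # hypothetical endpoint
    params={"symbol": "ACME", "range": "30d"},  # hypothetical parameters
    timeout=10,
)
resp.raise_for_status()

# Assumed response shape: {"prices": [{"date": ..., "close": ...}, ...]}
df = pd.DataFrame(resp.json()["prices"])
df["date"] = pd.to_datetime(df["date"])

# Typical large-dataset manipulations: sort, index by date, resample.
df = df.sort_values("date").set_index("date")
print(df["close"].resample("W").mean())         # weekly average close
```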
Salary: $55 - $60