What are the responsibilities and job description for the Cloud Data Engineer position at Agility Partners?
Agility Partners is seeking an experienced Azure Cloud/Data Engineer to join our growing Data & Analytics team with our client. In this role, you will design and deliver end-to-end cloud-based data solutions, playing a critical role in enabling data-driven decision-making across the organization.
This position offers a unique opportunity to work across modern data platforms in a dynamic, agile environment—leveraging tools like Azure Data Factory, Synapse Analytics, and PySpark, while helping the team transition to DevOps practices and CI/CD automation.
Key Responsibilities
- Design, build, and deploy scalable, cloud-based data solutions on Microsoft Azure
- Develop data ingestion, transformation, and provisioning pipelines for enterprise-scale analytics
- Implement DevOps best practices, including CI/CD pipelines, GitHub-based version control, and infrastructure as code
- Ensure data quality with validation, testing, and reusable exception-handling mechanisms
- Automate, orchestrate, and monitor complex data workflows and pipelines
- Collaborate in agile teams to deliver creative and business-driven solutions
- Lead or contribute to the adoption of DevOps processes and continuous improvement initiatives
- Mentor junior developers and promote cross-team knowledge sharing
- Maintain compliance with applicable regulatory and security frameworks
Required Qualifications
- Bachelor’s degree in Computer Science, Engineering, or a related field (or 4 years of equivalent experience)
- 5 years of experience in data engineering or related analytics roles
- Proficiency in SQL, Python, and PySpark
- Hands-on experience with Azure Data Factory, Azure Synapse Analytics, and Microsoft Fabric
- Familiarity with Azure Monitor, Application Insights, and Azure Cost Management
- Experience with DevOps practices and version control (e.g., GitHub, CI/CD pipelines)
- Strong understanding of data lake/data warehouse architecture and best practices
- Knowledge of analytics tools such as Power BI or Tableau
Preferred Skills
- Experience with Kafka, Event Hubs, or other messaging/event frameworks
- Knowledge of containerization and orchestration tools (e.g., Docker, Kubernetes)
- Background in regulated industries (energy, utilities, etc.) is a plus
This opportunity begins as a hybrid, onsite contract and, based on performance, will transition to a remote full-time position.
Pay rate: $55 - $65 per hour