What are the responsibilities and job description for the Mid/Senior Level Data Engineer position at Fifth Dimension LLC?
Our noted client is hiring for "Mid/Senior Level Data Engineer" roles at its Hyderabad (India) location, for candidates with 6-8 years of experience working extensively with Python, Astronomer, Apache Airflow, SQL, and Kafka.
Interested candidates with matching profiles are expected to be available to join within 60 days of confirmation. Please send your response along with your resume to fifthdimenllc@gmail.com, or reach out to Parag Walimbe.
Responsibilities:
- Design, build, and maintain robust, scalable data pipelines using Apache Airflow and Astronomer.
- Collaborate with automation engineers and business stakeholders to develop and manage complex business workflows.
- Monitor the performance of Airflow DAGs and optimize for efficiency, reliability, and scalability.
- Write clean, maintainable, and efficient Python code for data processing, transformation, and analysis tasks.
- Establish and promote best practices for workflow orchestration, data pipeline design, and coding standards within the team.
- Implement testing strategies for data pipelines and workflows, ensuring high availability and minimizing errors.
- Create and maintain comprehensive documentation for data pipelines, workflows, and architectural designs to support team knowledge sharing and onboarding.
- Provide mentorship and guidance to junior developers and team members on best practices, tools, and data engineering concepts.
- Stay current with emerging data engineering technologies and platforms, and advocate for tool adoption and process enhancements.
Required:
- Expert knowledge of Apache Airflow, including DAG creation, scheduling, and debugging.
- Proficiency with Astronomer for deploying and managing Airflow applications.
- Strong programming skills in Python, with experience in developing data processing applications and libraries.
- Familiarity with Azure cloud platform and services related to data processing and storage.
- Good understanding of distributed systems and experience building real-time integrations with Kafka.
- Experience with version control (e.g. Git) and CI/CD practices.
- Strong analytical and troubleshooting skills, with the ability to work independently as well as part of a collaborative team.
- Excellent communication skills, with the ability to articulate technical concepts to both technical and non-technical stakeholders.
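The Python and SQL skills listed above often combine in day-to-day pipeline work. The sketch below is purely illustrative, using the standard-library sqlite3 module to stand in for whatever database the client actually runs; the function and table names are made up.

```python
# Illustrative Python + SQL transformation step.
# sqlite3 stands in for the (unspecified) production database.
import sqlite3


def top_values(rows: list[tuple[str, int]], limit: int = 2) -> list[tuple[str, int]]:
    """Load rows into an in-memory table and return the largest values via SQL."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE events (name TEXT, value INTEGER)")
    conn.executemany("INSERT INTO events VALUES (?, ?)", rows)
    cur = conn.execute(
        "SELECT name, value FROM events ORDER BY value DESC LIMIT ?", (limit,)
    )
    result = cur.fetchall()
    conn.close()
    return result


# Example: top_values([("a", 1), ("b", 5), ("c", 3)]) -> [("b", 5), ("c", 3)]
```

In an Airflow pipeline, a function like this would typically be wrapped in a task rather than called directly.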
Preferred:
- 6 years of professional software development experience
- 4 years of Python, DAG, and Airflow development
- 2 years of cloud experience, preferably with the Azure cloud platform
- 2 years of experience working in an Agile development environment using Continuous Integration & Continuous Delivery (CI/CD) and Test-Driven Development (TDD)
- Knowledge of containerization technologies (e.g. Docker) and orchestration tools (e.g. Kubernetes).
- Strong experience with monitoring, observability, and building scalable services.
- Understanding of various database solutions (SQL, NoSQL).
- Knowledge of Azure and GCP
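The TDD expectation above usually means writing a test for each pipeline transform before (or alongside) the transform itself. A minimal sketch, with entirely hypothetical function names:

```python
# Illustrative TDD-style unit test for a small pipeline transform.
# normalize() and its behavior are invented for this example.
def normalize(record: dict) -> dict:
    """Trim whitespace from keys and lower-case them so downstream
    tasks see a stable column schema."""
    return {key.strip().lower(): value for key, value in record.items()}


def test_normalize():
    assert normalize({" ID ": 1, "Name": "x"}) == {"id": 1, "name": "x"}


test_normalize()
```

In practice such tests would live in a test suite run by the CI/CD pipeline (for example with pytest) rather than being invoked inline.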