What are the responsibilities and job description for the Python ETL Developer (Airflow & Data Mesh) position at Jabil?
Job Family: General Information Technology
Job Profile Title: Development Operations Engineer / P03
Location/Division Specific Information – Lexington, KY
How will you make an impact?
We are seeking a highly skilled Python ETL Developer to join our data engineering team. The ideal candidate will have extensive experience with ETL pipeline development using Apache Airflow and proficiency in implementing Data Mesh architecture. You will work with complex data sets, ensuring reliable data extraction, transformation, and loading processes. The role requires a strong background in Linux, SQL, PostgreSQL, and Kubernetes.
What will you do?
- Design, develop, and maintain ETL pipelines using Python and Apache Airflow.
- Implement Data Mesh architecture to support decentralized data ownership and domain-driven design.
- Automate data workflows, monitor data pipeline performance, and troubleshoot issues.
- Optimize SQL queries and data transformation processes to improve performance and reliability.
- Deploy, monitor, and manage data services on Kubernetes clusters.
- Collaborate with data scientists, analysts, and other engineers to integrate data solutions.
- Perform data quality checks and validation processes.
- Document ETL processes and data flow diagrams.
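The pipeline responsibilities above can be sketched in plain Python. This is a minimal, illustrative example, not an actual Jabil codebase: the record fields (`order_id`, `amount`) and the in-memory "warehouse" are assumptions, and in a real deployment each function would typically run as an Airflow task.

```python
# Minimal extract/transform/load sketch of the kind of pipeline described above.
# Field names and the in-memory target are illustrative assumptions; in practice
# each step would be wrapped in an Airflow task (e.g. PythonOperator or @task).

def extract(raw_rows):
    """Extract: parse raw CSV-like lines into records."""
    return [dict(zip(("order_id", "amount"), line.split(","))) for line in raw_rows]

def transform(records):
    """Transform: cast types and drop invalid rows (a simple data quality check)."""
    cleaned = []
    for rec in records:
        try:
            cleaned.append({"order_id": int(rec["order_id"]),
                            "amount": float(rec["amount"])})
        except (KeyError, ValueError):
            continue  # reject rows that fail validation
    return cleaned

def load(records, target):
    """Load: append validated records to the target store."""
    target.extend(records)
    return len(records)

warehouse = []
loaded = load(transform(extract(["1,19.99", "2,not-a-number", "3,5.00"])), warehouse)
print(loaded)  # row "2" fails the quality check and is rejected, so 2 rows load
```

In Airflow, each of these steps would become a task in a DAG, with the scheduler handling retries, monitoring, and dependencies between them.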
How will you get here?
- Bachelor’s degree in engineering or a related field is required.
- 5-8 years of related experience is required.
- Proven experience as a Python ETL Developer with Apache Airflow.
- Strong understanding of Data Mesh principles and best practices.
- Proficiency with SQL and PostgreSQL, including complex query optimization.
- Experience working with Kubernetes for container orchestration and deployment.
- Solid Linux command-line and shell-scripting skills.
- Familiarity with CI/CD tools and modern data engineering practices.
- Excellent problem-solving and troubleshooting abilities.
- Strong communication and collaboration skills.
- PostgreSQL index optimization
- SQL query construction
- Kubernetes, including the creation and design of:
- Deployments
- Stateful Sets
- Persistent Volumes
- Services
- Ingresses
- ConfigMaps
- Secrets
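As a self-contained illustration of the index-optimization skill listed above, here is a sketch using Python's built-in `sqlite3` module. The table and column names are invented for the example; in PostgreSQL the same technique applies via `CREATE INDEX` and `EXPLAIN ANALYZE`.

```python
import sqlite3

# Illustrative index-optimization sketch using the stdlib sqlite3 module.
# Table/column names are made up; PostgreSQL would use EXPLAIN ANALYZE instead.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(1000)])

query = "SELECT total FROM orders WHERE customer_id = ?"

def plan(sql):
    # EXPLAIN QUERY PLAN reports whether SQLite scans the table or uses an index.
    return " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql, (42,)))

before = plan(query)   # full table scan (no index on customer_id yet)
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(query)    # now an index search: "... USING INDEX idx_orders_customer ..."
print("USING INDEX" in after)
```

Checking the query plan before and after adding an index is the core loop of this kind of optimization work, whatever the database engine.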
- Experience with cloud platforms such as AWS, GCP, or Azure.
- Familiarity with data warehousing solutions and big data technologies (e.g., Spark, Hadoop).
- Experience with monitoring and observability tools (e.g., Prometheus, Grafana).
- Familiarity with version control using Git and CI/CD practices.
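To make the Kubernetes portion of the role concrete: a Deployment such as those listed under the Kubernetes skills can be expressed as the structure you would serialize to YAML or submit through the Kubernetes API. Everything below (names, image, replica count) is a generic, hypothetical example, not a Jabil-specific configuration.

```python
# Sketch of a minimal Kubernetes Deployment manifest built as a Python dict.
# All names, the image, and the replica count are hypothetical placeholders;
# in practice this would be serialized to YAML or sent via a client library.
deployment = {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "metadata": {"name": "etl-worker"},
    "spec": {
        "replicas": 2,
        "selector": {"matchLabels": {"app": "etl-worker"}},
        "template": {
            "metadata": {"labels": {"app": "etl-worker"}},
            "spec": {
                "containers": [{
                    "name": "etl-worker",
                    "image": "example.registry/etl-worker:1.0",  # placeholder image
                    "env": [{"name": "DB_HOST", "value": "postgres-service"}],
                }],
            },
        },
    },
}

# The pod template labels must match the selector, or the API server rejects it.
assert (deployment["spec"]["selector"]["matchLabels"]
        == deployment["spec"]["template"]["metadata"]["labels"])
```

StatefulSets, Services, ConfigMaps, and Secrets follow the same manifest pattern with different `kind` and `spec` fields.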