What are the responsibilities and job description for the ETL Developer position at Swoon?
Swoon is partnering with a leading global airline to find a skilled ETL Developer for an exciting initial 8-month contract opportunity. This hybrid role is based in Chicago, IL and offers the chance to work with a high-performing team modernizing data pipelines and delivering enterprise-scale solutions that support critical business operations. With a strong potential for extension or conversion based on performance, this is a prime opportunity to join a globally recognized organization at the forefront of innovation in data engineering.
As an ETL Developer, you’ll be instrumental in designing, building, and optimizing complex data workflows using Python or PySpark, AWS Glue, and a suite of modern cloud technologies. You’ll collaborate with cross-functional teams to support scalable big data environments, develop efficient ETL solutions, and drive operational excellence through automation and observability.
If you're passionate about data engineering, enjoy working in dynamic environments, and bring expertise in AWS, SQL development, and big data processing, apply today to join a collaborative, future-focused team making a real impact in the aviation industry.
Here are the details:
Location: Chicago, IL (must be local to Chicago; the team currently goes into the office every Tue/Wed on alternating weeks, though this may change in the future). No travel at this time.
Duration: Initial 8-month contract (through 1/30/2026), with high potential to extend or convert based on performance
Pay Rate: $65-70/hr W2 (W2 only)
Job #: 15271
Top 5 Skill Sets:
- Python or PySpark
- Complex SQL development, debugging, and optimization
- AWS: Glue, Step Functions (see the Glue job sketch after this list)
- Knowledge of the inner workings of databases, such as AWS RDS MySQL
- Big Data Processing
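To give these skills concrete shape, below is a minimal sketch of the kind of AWS Glue job, written in PySpark, that this role would build and optimize. It is illustrative only: the catalog database, table, and S3 bucket names are hypothetical, and a production job would add error handling, job bookmarks, and schema management.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard Glue job bootstrap; JOB_NAME is supplied by the Glue runtime
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract: read a source table registered in the Glue Data Catalog
# ("flights_db" and "raw_bookings" are hypothetical names)
source = glue_context.create_dynamic_frame.from_catalog(
    database="flights_db", table_name="raw_bookings"
)

# Transform: drop rows missing the key and keep only the columns needed downstream
bookings = (
    source.toDF()
    .dropna(subset=["booking_id"])
    .select("booking_id", "origin", "destination", "fare")
)

# Load: write partitioned Parquet back to S3 (bucket name is hypothetical)
bookings.write.mode("overwrite").partitionBy("origin").parquet(
    "s3://example-curated-bucket/bookings/"
)

job.commit()
```

In a pipeline like the one described here, a job of this shape would typically run as one state in an AWS Step Functions workflow that sequences the extract, transform, and load steps and handles retries.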
Nice-to-have skills or certifications:
- Experience leading a reasonably sized ETL team
- Experience with Apache Iceberg (see the brief sketch after this list)
- Experience with observability tools such as Dynatrace or DataDog
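For the Apache Iceberg item above, here is a brief sketch of what working with Iceberg tables from Spark SQL can look like. It assumes a SparkSession (`spark`) already configured with an Iceberg catalog, hypothetically named `glue_catalog`; that configuration is not shown, and all table names are invented.

```python
# Create an Iceberg table (catalog, schema, and table names are hypothetical)
spark.sql("""
    CREATE TABLE IF NOT EXISTS glue_catalog.flights_db.bookings (
        booking_id BIGINT,
        origin     STRING,
        fare       DOUBLE
    ) USING iceberg
""")

# Iceberg keeps per-table snapshot history, exposed through metadata tables;
# this lists each snapshot and when it was committed
spark.sql(
    "SELECT snapshot_id, committed_at "
    "FROM glue_catalog.flights_db.bookings.snapshots"
).show()
```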
Job Summary:
An ETL developer designs, builds, tests, and maintains systems that extract, transform, and load data from multiple source systems.
Primary Responsibilities:
- Lead, design, implement, deploy, and optimize backend ETL services.
- Support a massive-scale enterprise data solution using AWS data and analytics services.
- Analyze and interpret complex data and related systems, and provide efficient technical solutions.
- Support the ETL schedule and maintain compliance with it.
- Develop and maintain standards for ETL code and an effective project life cycle for all ETL processes.
- Coordinate with cross-functional teams (architects, platform engineers, other developers, and product owners) to build data processing procedures.
- Perform root-cause analysis on production issues, routinely monitor databases, and support ETL environments.
- Help create functional specifications and technical designs, working with business process area owners.
- Implement industry best-practice code and configuration for production and non-production environments in a highly automated setting.
- Provide technical advice, effort estimates, and impact analysis.
- Provide timely project status and issue reporting to management.
Qualifications:
- 6 years of experience using ETL tools for data cleansing, data profiling, transformation, and scheduling of various workflows.
- Expert-level proficiency in writing, debugging, and optimizing SQL (see the sketch after this list).
- 3-4 years of programming experience with Python or PySpark/Glue (required).
- Knowledge of common design patterns, models, and architectures used in big data processing.
- 3-4 years' experience with AWS services such as Glue, S3, Redshift, Lambda, Step Functions, RDS Aurora/MySQL, Apache Iceberg, CloudWatch, SNS, SQS, EventBridge.
- Capable of troubleshooting common database issues; familiarity with observability tools.
- Self-starter, responsible, professional and accountable.
- A ‘finisher’, seeing a project or task through to completion despite challenges.
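To illustrate the SQL development qualification above, here is a minimal, self-contained PySpark sketch (the table and its values are invented for the example) showing a typical analytic query alongside `explain()`, a common first step when debugging or optimizing a slow query.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sql-optimization-sketch").getOrCreate()

# Tiny in-memory stand-in for a bookings table (values are invented)
spark.createDataFrame(
    [(1, "ORD", 250.0), (2, "ORD", 310.0), (3, "SFO", 410.0)],
    ["booking_id", "origin", "fare"],
).createOrReplaceTempView("bookings")

# A typical analytic query: average fare per origin, highest first
query = """
    SELECT origin, AVG(fare) AS avg_fare
    FROM bookings
    GROUP BY origin
    ORDER BY avg_fare DESC
"""

result = spark.sql(query)
result.explain()  # prints the optimized physical plan for tuning/debugging
result.show()
```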