What are the responsibilities and job description for the Python/Spark Data Engineers position at InterSources Inc?
Job Title: Python/Spark Data Engineers
Job Location: Wilmington, DE (Initially Remote)
Job Type: Contract
Duration: 12+ Months
Project: Retail Banking Data Transformation Team. The team currently runs on legacy Ab Initio and Teradata systems and has just migrated to AWS. The goal is to modernize all data processing to Python, Spark, Scala, microservices, etc., enable data analysts, and build products in a data ecosystem (metrics, dashboards, etc.). This new platform will bring all products into one place so anyone in the Retail Bank can view and manipulate data.
Requirements
- Looking for data engineers to migrate the legacy on-prem platform and modernize it in the cloud. A large volume of Ab Initio data needs to be transformed and moved to AWS.
- Python (must have), Scala, Spark, AWS (ECS, EC2, EMR, Lambda, etc.).
- Candidates without Spark but with solid Python are acceptable; candidates without Python but with solid Java and PySpark will also be considered.
- CI/CD pipeline with Jenkins
- 5+ years' experience with Data Engineering
- Good understanding of data and data analysis, how data flows, how data is joined, etc.
- Experience with As-Is and To-Be (current-state and target-state) analysis
- At least 3 years of experience developing data pipelines for data ingestion or transformation using Java, Scala, or Python (see the sketch after this list)
- At least 2 years of experience with Big Data file formats (Parquet, Avro, ORC, etc.)
- At least 3 years of experience developing applications with monitoring, build tools, version control, unit testing, TDD, and change management to support DevOps
- At least 3 years of experience with SQL and shell scripting
- At least 2 years of experience with software design, with an understanding of cross-system usage and impact
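
As a rough illustration of the kind of ingestion/transformation pipeline described above, the following PySpark sketch reads a hypothetical CSV extract from the legacy platform, joins and aggregates it, and writes partitioned Parquet to S3. The bucket names, paths, and column names are made-up examples, not part of the posting.

```python
# Minimal PySpark sketch of a legacy-extract-to-Parquet pipeline.
# All paths and column names below are hypothetical examples.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("legacy-extract-to-parquet").getOrCreate()

# Hypothetical CSV extracts produced from the legacy (Ab Initio / Teradata) platform.
accounts = spark.read.csv(
    "s3://example-bucket/legacy-extracts/accounts/", header=True, inferSchema=True
)
transactions = spark.read.csv(
    "s3://example-bucket/legacy-extracts/transactions/", header=True, inferSchema=True
)

# Simple join and aggregation to illustrate how data flows and is joined.
daily_totals = (
    transactions.join(accounts, on="account_id", how="inner")
    .groupBy("account_id", "txn_date")
    .agg(F.sum("amount").alias("daily_amount"))
)

# Write the result as partitioned Parquet, one of the file formats listed above.
daily_totals.write.mode("overwrite").partitionBy("txn_date").parquet(
    "s3://example-bucket/curated/daily_totals/"
)
```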
Founded in 2007, InterSources Inc is a Small Business Enterprise (SBE), Minority Business Enterprise (MBE), and Women-Owned Small Business (WOSB) certified company specializing in IT consulting, IT staffing solutions, and software solutions. We have received various awards, including "Fastest Growing IT Consulting and Software Company" and "Excellence in Technology Services."