What are the responsibilities and job description for the DATA Engineer with Spark & Scala position at E-Solutions?
Role: Data Engineer with Spark & Scala
Location: Bloomfield, CT (Day-1 onsite)
Mandatory skills: Spark, Scala, Databricks, AWS Cloud
Responsibilities:
- 8 years of experience as a Data or Software Engineer developing production code
- Proven track record of delivering solid, scalable, and innovative solutions
- Strong understanding of data infrastructure and related technologies
- Experience with data modelling, data warehousing, and building end-to-end ETL pipelines
- Proficiency in Scala
- Deep understanding of and experience with Apache Spark and its related APIs (DataFrame/Dataset)
- Working knowledge of different data storage formats (Parquet, Avro, JSON)
- Advanced SQL experience
- Experience with the Databricks platform
- Advanced experience in one or more scripting languages (shell, YAML, etc.)
- CI/CD experience with Terraform, GitLab CI, and Docker
- Experience with search technologies such as OpenSearch
- Experience with AWS cloud technologies (S3, DynamoDB, RDS, Lambda)
- Experience with streaming and event-based system design