What are the responsibilities and job description for the Data Engineer (Entry Level) position at Wavicle Data Solutions?
A BIT ABOUT WAVICLE
Wavicle Data Solutions leverages Cloud, Data & Analytics technologies to deliver complex business & digital transformation solutions to our clients. As a Minority Business Enterprise (MBE) with a 40% women workforce, Wavicle fosters a diverse & equitable environment where innovative professionals come together as a team and enable our clients to realize their goals in their transformation journey. Our team members collaborate by infusing their creative problem-solving skills, agile ways of working & tech know-how to drive value for our clients.
At Wavicle, a Top Workplace award winner, you’ll find a challenging and rewarding work environment where our 500 team members based in the US, India, and Canada work from 42 cities in a remote/hybrid, digitally connected way. We offer a competitive benefits package that includes healthcare, retirement, life insurance, short/long-term disability, unlimited paid time off, short-term incentive plans (annual bonus) and long-term incentive plans.
WHY WAVICLE?
Watch this video to learn more: https://vimeo.com/654661550
THE OPPORTUNITY
Wavicle is hiring a Data Engineer with strong hands-on experience building data pipelines using emerging technologies.
RESPONSIBILITIES
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of sources, using technologies such as Hadoop, Spark, and AWS Lambda.
- Work on AWS Cloud data integration with Apache Spark, EMR, Glue, Kafka, Kinesis, and Lambda in the S3, Redshift, RDS, and MongoDB/DynamoDB ecosystems.
- Apply strong hands-on Python development skills, especially PySpark in an AWS Cloud environment.
- Design, develop, test, deploy, maintain, and improve data integration pipelines.
- Develop pipeline objects using Apache Spark / PySpark / Python or Scala.
- Design and develop data pipeline architectures using Hadoop, Spark and related AWS Services.
- Load-test and performance-test data pipelines built using the above-mentioned technologies.
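To give a concrete sense of the pipeline work described above, here is a minimal, illustrative extract-transform-load (ETL) sketch. It uses plain Python for portability; in this role the same pattern would typically run as PySpark jobs on AWS EMR or Glue. All names and data here are hypothetical, not from the posting.

```python
# Illustrative only: a toy ETL pipeline sketching the extract, transform,
# and load stages the role describes. Real pipelines would read from S3/Kinesis
# and write to Redshift or DynamoDB; here a list and a dict stand in for those.

def extract(rows):
    """Pull raw records from a source (here, an in-memory list)."""
    return list(rows)

def transform(rows):
    """Clean and reshape: drop records missing an id or city, normalize casing."""
    return [
        {"id": r["id"], "city": r["city"].strip().title()}
        for r in rows
        if r.get("id") is not None and r.get("city")
    ]

def load(rows, sink):
    """Write transformed records to a target store (here, a dict keyed by id)."""
    for r in rows:
        sink[r["id"]] = r["city"]
    return sink

raw = [
    {"id": 1, "city": " chicago "},
    {"id": 2, "city": "coimbatore"},
    {"id": None, "city": "toronto"},  # dropped by transform: missing id
]
warehouse = load(transform(extract(raw)), {})
print(warehouse)  # → {1: 'Chicago', 2: 'Coimbatore'}
```

In PySpark the same shape appears as a chain of DataFrame reads, transformations, and writes; the separation into small, testable stages is what makes the load and performance testing mentioned above practical.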
QUALIFICATIONS
- Bachelor's or Master's degree in Computer Science or a related field is required.
- 2-5 years of hands-on professional work experience with cloud computing platforms (AWS, Azure, GCP).
- Hands-on experience using Python is required.
- Proficiency in SQL and NoSQL databases.
- Working experience implementing ETL pipelines using AWS services such as Glue, Lambda, EMR, Athena, S3, SNS, Kinesis, Data Pipeline, PySpark, etc. is required.
- Experience with data processing frameworks and tools.
- Hands-on professional work experience using emerging technologies (Snowflake, Matillion, Talend, ThoughtSpot and/or Databricks) is highly desirable.
- Knowledge or experience in data modeling and data architecture principles.
- Strong problem-solving and troubleshooting skills with the ability to exercise mature judgment.
- At this time, we are looking for individuals interested in full-time, salaried employment (no contractors please).
BENEFITS
- Health Care Plan (Medical, Dental & Vision)
- Retirement Plan (401k, IRA)
- Life Insurance (Basic, Voluntary & AD&D)
- Unlimited Paid Time Off (Vacation, Sick & Public Holidays)
- Short Term & Long Term Disability
- Employee Assistance Program
- Training & Development
- Work From Home
- Bonus Program
EQUAL OPPORTUNITY EMPLOYER
Wavicle is an Equal Opportunity Employer and committed to creating an inclusive environment for all employees. We welcome and encourage diversity in the workplace regardless of race, color, religion, national origin, gender, pregnancy, sexual orientation, gender identity, age, physical or mental disability, genetic information or veteran status.