What are the responsibilities and job description for the AWS Data engineer (W2) position at Brahma Consulting Group?
Job Details
Job Title: AWS Data Engineer
Location: Columbus, OH (3 days onsite, 2 days remote)
Position: Contract
Visa: W2
Job Description:
We are seeking a skilled and motivated AWS Data Engineer with expertise in Python, Spark, and PySpark to join our team. The ideal candidate has strong technical abilities and is comfortable working in a dynamic environment. This role involves migrating data from a legacy platform to a new cloud-based platform using AWS technologies and building data pipelines in PySpark.
Responsibilities:
- Develop and maintain data pipelines utilizing PySpark (70%) and Java (30%).
- Collaborate with team members to ensure smooth data migration from legacy platforms to AWS-based solutions.
- Write efficient, scalable, and maintainable code for processing large datasets.
- Work with SQL to query and transform data as part of the data pipeline.
- Lead AWS migration efforts, ensuring best practices and standards are followed.
- Participate in the development of a modern Data Lake solution within AWS.
- Utilize AWS services for data processing and management (e.g., S3, Lambda, EC2, RDS, and Redshift).
- Collaborate with other data engineers, architects, and stakeholders to design and implement robust data solutions.
- Provide technical leadership and mentorship to other team members.
Must-Have Skills:
- Expertise in Python, Spark, and PySpark (day-to-day development is roughly 70% PySpark and 30% Java).
- Strong proficiency in SQL for querying and transforming large datasets.
- Deep knowledge of AWS technologies, including S3, Lambda, EC2, RDS, Redshift, and more.
- Experience with AWS migration projects.
- Experience working with Data Lakes in AWS.
- Ability to create and maintain efficient data pipelines in a cloud environment.
Preferred Skills:
- Experience with AWS Databricks (If mentioned on resume, candidates should be able to discuss their experience in detail).
- Familiarity with the migration process from legacy platforms to cloud-based environments.
Key Requirements:
- Strong problem-solving and debugging skills.
- Ability to work independently and within a team.
- Excellent communication skills to collaborate with stakeholders and other team members.