What are the responsibilities and job description for the Sr. Data Engineer (GCP) position at Apolis?
Role: Sr. Data Engineer (GCP)
Duration: 6 Months
Location: Bentonville, AR
Rate: $50/hr on W2 OR $65/hr on C2C
Must-have skills:
Data: 12 years
GCP: 5 years
PySpark: 5 years
Scala: 5 years
Domain Experience (if any): Retail preferred but not mandatory
Job Description
Total IT Experience: 12 years of experience in IT
GCP Experience:
• 5 years of recent GCP experience
• Experience building data pipelines in GCP
• GCP Dataproc, GCS, and BigQuery experience
• 5 years of hands-on experience with developing data warehouse solutions and data products.
• 5 years of hands-on experience developing a distributed data processing platform with Hadoop, Hive, or Spark, plus Airflow or another workflow orchestration solution, is required.
• 5 years of hands-on experience modeling and designing schemas for data lakes or RDBMS platforms.
• Experience with programming languages: Python, Java, Scala, etc.
• Experience with scripting languages: Perl, Shell, etc.
• Practical experience working with, processing, and managing large data sets (multi-TB/PB scale).
• Exposure to test-driven development and automated testing frameworks.
• Background in Scrum/Agile development methodologies.
• Capable of delivering on multiple competing priorities with little supervision.
• Excellent verbal and written communication skills.
• Bachelor's degree in Computer Science or equivalent experience.
The most successful candidates will also have experience in the following:
• Gitflow
• Atlassian products: Bitbucket, Jira, Confluence, etc.
• Continuous Integration tools such as Bamboo, Jenkins, or TFS