What are the responsibilities and job description for the Google Cloud Platform Data engineer position at Cloud Bigdata?
Job Details
Visa: OPT, L2
5 years of recent Google Cloud Platform experience
Experience building data pipelines in Google Cloud Platform, including Dataproc, GCS, and BigQuery
12 years of hands-on experience developing data warehouse solutions and data products.
6 years of hands-on experience developing a distributed data processing platform with Hadoop, Hive, or Spark, and with Airflow or another workflow orchestration solution, is required.
5 years of hands-on experience modeling and designing schemas for data lakes or RDBMS platforms.
Experience with programming languages: Python, Java, Scala, etc.
Experience with scripting languages: Perl, Shell, etc.
Experience working with, processing, and managing large data sets (multi-TB/PB scale).
Exposure to test-driven development and automated testing frameworks.
Background in Scrum/Agile development methodologies.
Capable of delivering on multiple competing priorities with little supervision.
Excellent verbal and written communication skills.
Bachelor's degree in computer science or equivalent experience.
The most successful candidates will also have experience in the following:
Gitflow
Atlassian products: Bitbucket, JIRA, Confluence, etc.
Continuous integration tools such as Bamboo, Jenkins, or TFS