What are the responsibilities and job description for the Data Engineer with PySpark, Scala and GCP position at Novia Infotech?
Must-Have Skills
4 years of recent GCP experience
4 years of PySpark coding
4 years of SQL
5 years of hands-on experience with Hadoop, Hive or Spark, and Airflow or another workflow orchestration solution
4 years of hands-on experience designing schemas for data lakes or RDBMS platforms
Experience with programming languages: Python, Java, Scala, etc.
Experience with scripting languages: Perl, Shell, etc.