What are the responsibilities and job description for the GCP Data Engineer position at KTek Resourcing?
Note: Only candidates local to Dallas will be considered.
Summary:
You will create, deliver, and support custom data products and help enhance and expand the team's capabilities. You will analyze and manipulate large datasets, activating data assets to support enabling platforms and enterprise analytics. As a Google Cloud Data Engineer, you will be responsible for designing data transformation and modernization solutions on Google Cloud Platform using GCP services.
Responsibilities:
- Build data systems and pipelines on Google Cloud Platform (GCP) using Dataproc, Dataflow, Data Fusion, BigQuery, and Pub/Sub
- Implement schedules, workflows, and tasks in Cloud Composer / Apache Airflow (see the sketch after this list)
- Create and manage data storage solutions using GCP services such as BigQuery, Cloud Storage, and Cloud SQL
- Monitor and troubleshoot data pipelines and storage solutions using Google Cloud's operations suite (formerly Stackdriver), including Cloud Monitoring and Cloud Logging
- Develop efficient ETL / ELT pipelines and orchestration using Dataprep and Cloud Composer
- Develop and maintain data ingestion and transformation processes using PySpark (Apache Spark) and Dataflow
- Automate data processing tasks using scripting languages such as Python or Bash
- Ensure data security and compliance with industry standards by configuring IAM roles, service accounts, and access policies
- Automate cloud deployments and infrastructure management using Infrastructure as Code (IaC) tools such as Terraform or Google Cloud Deployment Manager
- Participate in code reviews, contribute to development best practices, and use developer-assist tools to create robust, fail-safe data pipelines
- Collaborate with Product Owners, Scrum Masters, and Data Analysts to deliver user stories and tasks and ensure deployment of pipelines
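
To give a sense of the Cloud Composer / Airflow work referenced above, here is a minimal sketch of a daily DAG that loads files from Cloud Storage into BigQuery. All bucket, dataset, table, and task names are hypothetical placeholders, and the exact operator availability depends on the Composer image and provider package versions in use; this is an illustration, not part of the formal job description.

```python
# Minimal illustrative Cloud Composer / Airflow DAG: load daily GCS files into BigQuery.
# Bucket, dataset, and table names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="daily_gcs_to_bigquery_load",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="example-landing-bucket",                              # hypothetical bucket
        source_objects=["events/{{ ds }}/*.json"],                    # templated daily partition
        destination_project_dataset_table="example_dataset.events",   # hypothetical table
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_APPEND",
        autodetect=True,
    )
```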
Experience required: