What are the responsibilities and job description for the Data Platform Engineer position at TekValue IT Solutions?
Data Platform Engineer
Hybrid - Phoenix, AZ
Long Term
Required Skills:
- More than nine years of experience in Data Engineering with an emphasis on Data Warehousing and Data Analytics.
- At least four years of experience with public clouds, with a strong understanding of the considerations for large-scale design and operationalization of data warehouses, data lakes, and analytics platforms; GCP experience is strongly preferred over other clouds.
- Hands-on experience with data cataloging and metadata management tools such as Collibra, Dataplex, and Alation. At least two years of experience with data lakes, data governance, and metadata management is required.
- Experience monitoring usage data for billing and SLA tracking, for example extracting application telemetry, structuring it, and sending it to an appropriate reporting tool such as Kafka or Splunk (see the telemetry sketch after this list).
- Experience designing and building scalable data pipelines covering extraction, transformation, and loading (see the ETL sketch after this list).
- At least five years of experience with Python, including working knowledge of notebooks.
- At least three years of hands-on experience with Kafka, Pub/Sub, Docker, and Kubernetes.
- Ideally, at least two years of hands-on experience on GCP data implementation projects (Dataflow, Dataproc, Cloud Composer, BigQuery, Cloud Storage, GKE, Airflow, etc.; see the BigQuery sketch after this list).
- Strong understanding of relational and dimensional data modeling (see the star schema sketch after this list).
- Experience with DevOps and CI/CD-related technologies.
- Excellent written and verbal communication skills, including experience with technical documentation and the ability to communicate with senior business managers and executives.
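
Illustrative Sketches:

A minimal sketch of the telemetry flow described above, assuming a reachable Kafka broker and the kafka-python client; the broker address, topic name, and event fields are hypothetical placeholders, not part of the posting.

```python
import json
import time

from kafka import KafkaProducer  # pip install kafka-python

# Hypothetical broker address and topic name, for illustration only.
BROKER = "localhost:9092"
TOPIC = "app-usage-telemetry"

producer = KafkaProducer(
    bootstrap_servers=BROKER,
    value_serializer=lambda record: json.dumps(record).encode("utf-8"),
)

def emit_usage_event(tenant_id: str, api_calls: int, latency_ms: float) -> None:
    """Structure one raw telemetry sample and publish it for downstream
    billing and SLA reporting (e.g., a Splunk or warehouse consumer)."""
    event = {
        "tenant_id": tenant_id,
        "api_calls": api_calls,
        "latency_ms": latency_ms,
        "emitted_at": time.time(),
    }
    producer.send(TOPIC, value=event)

emit_usage_event("tenant-42", api_calls=180, latency_ms=37.5)
producer.flush()  # block until the event is actually delivered
```

A downstream consumer, such as a Splunk forwarder or a warehouse sink, would read the topic and aggregate the structured events into billing and SLA reports.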
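A self-contained ETL sketch using only the Python standard library; the CSV sample and the orders table are invented for illustration. A production pipeline would swap each stage for real sources and sinks (for example, Cloud Storage and BigQuery).

```python
import csv
import io
import sqlite3

# Toy source data standing in for a real extract (file, API, or DB dump).
RAW_CSV = """order_id,amount,currency
1001,25.00,usd
1002,13.50,USD
"""

def extract(source: str) -> list[dict]:
    """Extract: parse raw CSV rows into dictionaries."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: cast types and normalize the currency code."""
    return [
        (int(r["order_id"]), float(r["amount"]), r["currency"].upper())
        for r in rows
    ]

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Load: write the cleaned rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER, amount REAL, currency TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
print(conn.execute("SELECT * FROM orders").fetchall())
```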
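A sketch of a GCP load-and-query flow with the google-cloud-bigquery client; it assumes application-default credentials are configured, and the project, dataset, bucket, and table names are placeholders.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

# Placeholder project; assumes GCP application-default credentials.
client = bigquery.Client(project="my-analytics-project")

# Load newline-delimited JSON from Cloud Storage into a BigQuery table.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,  # infer the schema from the data
)
load_job = client.load_table_from_uri(
    "gs://my-bucket/events/*.json",              # placeholder bucket
    "my-analytics-project.analytics.events",     # placeholder table
    job_config=job_config,
)
load_job.result()  # wait for the load job to finish

# Query the loaded table for a simple daily roll-up.
query = """
    SELECT DATE(event_ts) AS day, COUNT(*) AS events
    FROM `my-analytics-project.analytics.events`
    GROUP BY day
    ORDER BY day
"""
for row in client.query(query).result():
    print(row.day, row.events)
```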
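Finally, a minimal star schema illustrating dimensional modeling: one fact table holding additive measures, keyed to dimension tables by surrogate keys. All table and column names are hypothetical, and SQLite stands in for the warehouse.

```python
import sqlite3

# A minimal star schema: one fact table joined to two dimension tables.
DDL = """
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,      -- surrogate key, e.g. 20240131
    full_date TEXT,
    month INTEGER,
    year INTEGER
);
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,  -- surrogate key
    customer_id TEXT,                  -- natural/business key
    segment TEXT
);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    quantity INTEGER,                  -- additive measures live on the fact
    amount REAL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)
print([r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")])
```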