What are the responsibilities and job description for the GCP Data Engineer with DBA expertise - USC/GC/H4EAD/GC-EAD position at DRC Systems?
Job Description:
We are seeking a highly skilled GCP Data Engineer with DBA expertise to design, implement, and maintain data solutions on Google Cloud Platform (GCP). The ideal candidate will play a key role in managing and optimizing a Data Lake powered by BigQuery and AlloyDB. You will ensure the system's performance, scalability, and reliability while collaborating with cross-functional teams to support data-driven business decisions.
Data Engineering:
Design, develop, and maintain a Data Lake architecture using GCP services, particularly BigQuery and AlloyDB. Build robust data pipelines for ETL/ELT processes using tools like Dataflow, Cloud Composer, or other GCP services. Integrate data from various sources (structured and unstructured) into the Data Lake, ensuring consistency and reliability. Optimize BigQuery data models and queries for high performance and scalability.
Database Administration:
Administer and maintain AlloyDB instances, ensuring high availability, security, and performance. Perform backup, recovery, and disaster recovery planning for AlloyDB and BigQuery. Monitor and optimize database performance, query execution, and resource utilization. Manage schema design, indexing, partitioning, and clustering to enhance database efficiency. Apply database governance and compliance best practices to ensure data security and regulatory adherence.
Data Governance and Management:
Implement data governance policies, including role-based access controls, data masking, and encryption. Manage metadata, data cataloging, and data lineage tracking to support audit and compliance requirements. Conduct regular health checks on the Data Lake to ensure data quality and integrity.
Collaboration and Reporting:
Collaborate with data scientists, analysts, and application developers to define data requirements and deliver solutions. Design and deploy real-time and batch data analytics solutions using GCP's BigQuery and Looker. Provide documentation and training to teams on using Data Lake features effectively.
Automation and Monitoring:
Automate routine database and data pipeline tasks. Set up monitoring and alerting for BigQuery and AlloyDB using Cloud Monitoring and Cloud Logging. Resolve incidents and troubleshoot performance issues in real time.