What are the responsibilities and job description for the Associate Director, GCP Data & AI Technologist position at Brillio?
Role Brief – We are looking for a seasoned GCP Data & AI Technologist at the Associate Director level to lead and drive engineering programs across teams, collaborate with senior stakeholders, and help shape our data strategy. This role requires deep technical expertise in Google Cloud Platform (GCP), data engineering, and AI/ML, along with a strong understanding of data governance, observability, and data quality. Telecom domain experience is preferred but not mandatory.
As a key leader in our Data & AI organization, you will work closely with cross-functional teams to define and implement data ingestion, data modelling, and governance best practices while ensuring high performance, scalability, and security.
Key Responsibilities:
- Strategic Leadership: Drive engineering programs and initiatives that enable data-driven business creation and transformation.
- Stakeholder Engagement: Act as a liaison between technical teams and senior leadership, translating business needs into scalable GCP-based data solutions.
- Consulting & Advisory: Provide expert guidance on data strategies, cloud modernization, and AI/ML adoption, ensuring alignment with business goals. Guide engineering teams on best practices and tool selection.
- Data Ingestion & Integration: Lead the design and implementation of efficient, scalable data ingestion pipelines using BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Apache Kafka (a minimal pipeline sketch follows this list).
- Data Modelling & Architecture: Define and implement best practices for dimensional modelling, real-time and batch processing, data lakehouse architectures, and data mesh principles.
- Data Quality & Observability: Establish frameworks for data validation, reconciliation, anomaly detection, and monitoring to ensure data reliability and trustworthiness (a simple validation sketch appears at the end of this description).
- AI & ML Enablement: Collaborate with Data Science teams to operationalize AI/ML models, ensuring efficient deployment and monitoring with Vertex AI, BigQuery ML, and TensorFlow (a deployment sketch also follows this list).
- Performance Optimization: Optimize cost, performance, and scalability of data workloads across GCP services.
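To make the ingestion responsibility concrete, here is a minimal Apache Beam (Python) sketch of the kind of streaming pipeline this role would oversee: read events from Pub/Sub, parse them, and append them to BigQuery (typically executed on Dataflow). The project, subscription, table, and schema names are illustrative placeholders, not actual Brillio or client systems.

```python
# Minimal streaming ingestion sketch: Pub/Sub -> parse -> BigQuery.
# All resource names below are illustrative placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # streaming=True is required for an unbounded Pub/Sub source;
    # on GCP this would typically run with the DataflowRunner.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED)
        )


if __name__ == "__main__":
    run()
```

A production pipeline would add dead-letter handling, schema evolution, and monitoring, but the overall shape is the same.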
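Similarly, a hedged sketch of the ML-enablement work using the Vertex AI Python SDK: uploading a trained model to the model registry and deploying it to a managed endpoint for online prediction. The project, region, bucket, display name, and serving container image are assumptions for illustration only.

```python
# Hedged sketch: registering and deploying a trained model on Vertex AI.
# Project, region, bucket, and image URIs are illustrative placeholders.
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")

# Upload a model artifact (e.g., a TensorFlow SavedModel) to the registry;
# the serving image shown is an example prebuilt TF prediction container.
model = aiplatform.Model.upload(
    display_name="churn-model",
    artifact_uri="gs://my-bucket/models/churn/",
    serving_container_image_uri=(
        "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-12:latest"),
)

# Deploy to a managed endpoint for online predictions.
endpoint = model.deploy(machine_type="n1-standard-4",
                        min_replica_count=1, max_replica_count=2)

# Online prediction against the deployed endpoint (feature vector is dummy data).
prediction = endpoint.predict(instances=[[0.1, 0.4, 0.9]])
print(prediction.predictions)
```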
Required Qualifications:
- 13 years of experience in data engineering, data architecture, and AI/ML implementation, including at least 5 years working on GCP-based solutions.
- Expertise in BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Composer (Airflow), and Data Catalog.
- Strong command of SQL, Python, and Scala, and experience with Spark-based data processing frameworks.
- Hands-on experience with data lakes, data warehouses, and modern cloud-native architectures.
- Deep knowledge of data governance, lineage, security, and compliance frameworks.
- Experience with observability tools such as Datadog, Monte Carlo, or GCP-native monitoring solutions.
- Excellent communication skills, with the ability to engage and influence senior business and technology stakeholders.
- Telecom domain experience is a strong plus but not mandatory.
- GCP certifications (e.g., Professional Data Engineer, Machine Learning Engineer, or Cloud Architect) preferred.
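As a small illustration of the data quality and observability expectations above, the following sketch uses the google-cloud-bigquery client to compute basic reliability metrics for a table: row count, null rate on a key column, and staleness. The project, table, and column names (event_id, event_ts) are hypothetical, and the alert thresholds are arbitrary.

```python
# Hedged sketch of a basic BigQuery data quality check.
# Table and column names are hypothetical placeholders.
from google.cloud import bigquery


def check_table_quality(client: bigquery.Client, table: str) -> dict:
    """Return row count, null rate on a key column, and hours since
    the most recent record landed."""
    query = f"""
        SELECT
          COUNT(*) AS row_count,
          SAFE_DIVIDE(COUNTIF(event_id IS NULL), COUNT(*)) AS null_rate,
          TIMESTAMP_DIFF(CURRENT_TIMESTAMP(), MAX(event_ts), HOUR) AS hours_stale
        FROM `{table}`
    """
    row = next(iter(client.query(query).result()))
    return {"row_count": row.row_count,
            "null_rate": row.null_rate,
            "hours_stale": row.hours_stale}


if __name__ == "__main__":
    client = bigquery.Client(project="my-project")  # placeholder project
    metrics = check_table_quality(client, "my-project.analytics.events")
    # Flag stale or unusually null-heavy data; thresholds are illustrative.
    # (Metrics come back NULL/None for an empty table, hence the `or 0` guards.)
    if (metrics["hours_stale"] or 0) > 24 or (metrics["null_rate"] or 0) > 0.01:
        print(f"Data quality alert: {metrics}")
```

In practice a check like this would be scheduled (e.g., via Cloud Composer) and wired into alerting, with tools such as Monte Carlo or Datadog layered on for broader coverage.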