What are the responsibilities and job description for the Data Solutions Architect (ETL) position at CAPGEMINI SINGAPORE PTE. LTD.?
We are seeking an experienced Data Architect who will play a crucial role in designing and implementing robust data architectures, leading ETL processes, and ensuring data quality, accuracy, and integrity. The ideal candidate will have a strong background in Big Data technologies and will be adept at developing efficient data pipelines and maintaining data warehouse structures.
Key Responsibilities
- Data Architecture & ETL Development: Design and implement robust data architectures and lead ETL processes, ensuring high data quality, accuracy, and integrity.
- Big Data Technologies: Utilize Apache Spark, Hadoop, and Kafka to process and analyze large datasets, optimizing pipelines for efficiency and speed (an illustrative sketch follows this list).
- Automated Data Pipelines: Develop and maintain automated data pipelines, streamlining data flow across various systems with effective scheduling and monitoring mechanisms.
- Data Security & Compliance: Implement data security measures, ensuring compliance with regulatory requirements and collaborating with security teams to uphold privacy standards.
- Scalable Software Architectures: Design and implement highly scalable software architectures that efficiently accommodate increasing user loads, demonstrating a keen understanding of scalability principles.
- Data Warehousing: Manage and enhance data warehouse structures, including schema design, indexing strategies, and accommodating evolving business needs.
- Team Leadership: Lead and mentor a team of data engineers, overseeing the design, development, and maintenance of large-scale data pipelines.
- Cross-Functional Collaboration: Work closely with cross-functional teams to translate business needs into technical requirements, delivering solutions that drive business value.
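To give a concrete sense of the pipeline work described above, here is a minimal sketch of a batch ETL step in PySpark. The paths, column names, and quality rules are hypothetical placeholders chosen for illustration only, not details taken from the role itself.

```python
# Illustrative only: a minimal batch ETL step of the kind this role designs and
# oversees. All paths and column names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_orders_etl").getOrCreate()

# Extract: read raw order events from a (hypothetical) landing zone.
raw = spark.read.parquet("s3://example-bucket/raw/orders/")

# Transform: basic data-quality enforcement and aggregation.
clean = (
    raw.dropDuplicates(["order_id"])        # remove duplicate events
       .filter(F.col("amount") > 0)         # drop invalid amounts
       .withColumn("order_date", F.to_date("order_ts"))
)
daily_totals = clean.groupBy("order_date").agg(
    F.sum("amount").alias("total_amount"),
    F.countDistinct("order_id").alias("order_count"),
)

# Load: write partitioned output into the (hypothetical) warehouse layer.
daily_totals.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/warehouse/daily_order_totals/"
)

spark.stop()
```

In practice, a job like this would be scheduled and monitored by an orchestration tool and validated against agreed data-quality rules before loading into the warehouse, which is the kind of end-to-end ownership the responsibilities above describe.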
Core Competencies
Technical Skills