What are the responsibilities and job description for the Managing Consultant Data Engineer position at Excella?
Overview
As a Managing Consultant Data Engineer, you will lead the design and implementation of highly scalable, innovative, and robust data architectures. You will serve as a technical authority, providing strategic guidance and mentoring to teams while shaping and delivering enterprise-level data solutions. Your role will involve close collaboration with senior stakeholders, translating complex business requirements into actionable technical strategies that align with organizational goals.
Responsibilities
- Architect Enterprise-Grade Solutions: Design and deliver advanced data architectures, including data lakes, warehouses, and marts, leveraging cutting-edge cloud technologies and tools.
- Lead Strategic Initiatives: Define technical roadmaps, evaluate emerging technologies, and provide strategic leadership in modernizing data infrastructure for clients.
- Deliver Complex Data Pipelines: Oversee the development of high-performance data pipelines, enabling both batch and real-time data processing on platforms like AWS, Azure, or GCP.
- Foster Innovation: Evaluate and implement innovative technologies, frameworks, and tools, ensuring alignment with organizational goals and scalability requirements.
- Mentor and Guide Teams: Provide leadership, coaching, and technical mentorship to development teams, fostering a culture of excellence and continuous improvement.
- Stakeholder Engagement: Work closely with executives and business leaders to understand goals, communicate technical strategies, and ensure alignment of data solutions with business outcomes.
- Champion Agile Best Practices: Lead technical delivery using Agile frameworks, integrating DevOps practices such as CI/CD, test automation, and infrastructure as code.
- Data Governance: Establish and promote robust data governance practices, ensuring compliance, security, and quality across all data initiatives.
- Risk Assessment: Identify and mitigate risks associated with third-party tools, platforms, and evolving industry standards.
Qualifications
- Extensive Experience: 12 years of professional experience in data engineering or similar technical roles, with demonstrated success in designing and implementing large-scale data solutions.
- Advanced Technical Expertise:
  - Deep knowledge of data lake and data warehouse architectures, including cloud-native solutions on AWS, Azure, and GCP.
  - Expertise in modern data modeling approaches, including schema-less designs, Apache Iceberg and Delta Lake table formats, and automated ETL/ELT workflows with real-time processing.
  - Proven proficiency in programming languages such as Python, Java, and Scala.
  - Extensive experience with streaming technologies, including Kafka, Spark Streaming, and Flink.
  - Strong expertise in metadata management, data cataloging, and search frameworks to enhance data discoverability.
  - Proficiency with in-memory and embedded analytical databases (e.g., Redis, DuckDB) and graph databases (e.g., Neo4j, Amazon Neptune), along with experience developing Data-as-a-Service (DaaS) platforms that enable scalable, API-driven data access.
- Leadership Skills:
  - Track record of leading cross-functional teams and mentoring engineers.
  - Strong ability to drive consensus among diverse stakeholders, balancing technical and business priorities.
- Problem-Solving and Strategic Vision:
  - Exceptional ability to distill complex business problems into actionable technical solutions.
  - Strategic thinker who can evaluate emerging trends and technologies to shape long-term solutions.
- Communication Excellence:
  - Exceptional written and oral communication skills, with the ability to present complex concepts clearly to both technical and non-technical audiences.
  - Experience facilitating discussions and delivering impactful presentations.
Preferred Skills
- Expertise in AI/ML workflows and integration with data platforms.
- Experience with data security, privacy regulations, and governance frameworks (e.g., GDPR, HIPAA).
- Familiarity with serverless computing and infrastructure-as-code frameworks (e.g., Terraform, CloudFormation).
- Experience designing self-service analytics and data visualization platforms.
- Understanding of DevOps Research and Assessment (DORA) and the capabilities within the DORA capability catalog is a plus.