What are the responsibilities and job description for the Associate Partner-Kafka/Flink SME position at Ness Digital Engineering?
Associate Partner – Kafka/Flink Subject Matter Expert (SME)
This leadership role requires deep technical expertise and the ability to guide the strategy, architecture, and execution of complex streaming data solutions.
Key Responsibilities
- Leadership and Strategy:
  - Lead the strategic direction for implementing Kafka and Flink-based solutions in the organization.
  - Provide thought leadership on industry best practices, design patterns, and emerging trends in streaming data systems.
  - Collaborate with business and technical stakeholders to align data streaming strategies with organizational goals.
- Technical Expertise:
  - Design and architect complex data streaming systems using Apache Kafka and Apache Flink, ensuring scalability, performance, and reliability.
  - Develop and implement solutions to optimize data flow, reduce latency, and enhance real-time data processing capabilities.
  - Troubleshoot and resolve complex technical issues related to Kafka and Flink environments.
  - Guide the technical teams in implementing best practices for stream processing, data modeling, and fault tolerance.
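The stream-processing work described above revolves around concepts such as event-time windowing and keyed aggregation. As a minimal illustration only (plain Python rather than actual Flink APIs; the event data and window size are hypothetical), a tumbling-window count over keyed events might be sketched like this:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_ms):
    """Group (timestamp_ms, key) events into fixed-size tumbling windows
    and count occurrences per key in each window.

    This mimics, in plain Python, the shape of what a Flink tumbling
    event-time window aggregation does at scale.
    """
    # window_start -> key -> count
    counts = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = (ts // window_ms) * window_ms  # align to window boundary
        counts[window_start][key] += 1
    return {w: dict(per_key) for w, per_key in sorted(counts.items())}

# Hypothetical click events: (event time in ms, page key)
events = [(1000, "home"), (1500, "home"), (2500, "cart"), (3100, "home")]
print(tumbling_window_counts(events, window_ms=1000))
# -> {1000: {'home': 2}, 2000: {'cart': 1}, 3000: {'home': 1}}
```

In a real deployment the same logic would be expressed through Flink's windowing operators, with state, watermarks, and fault tolerance handled by the framework.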
- Solution Design and Development:
  - Lead the design, development, and deployment of Kafka and Flink pipelines for data integration, real-time analytics, and event-driven architectures.
  - Collaborate with engineering teams to implement data pipelines that meet business requirements and data governance standards.
  - Ensure that solutions are aligned with data security, privacy, and compliance requirements.
- Collaboration and Mentorship:
  - Act as a mentor and guide for engineers and developers, providing expertise and fostering a culture of knowledge sharing.
  - Collaborate with data scientists, analysts, and architects to design and optimize data processing workflows.
  - Conduct training sessions and workshops to build internal capabilities in Kafka, Flink, and stream processing technologies.
- Performance and Optimization:
  - Monitor and optimize the performance of Kafka and Flink clusters, ensuring high availability, fault tolerance, and scalability.
  - Work closely with DevOps teams to automate deployment pipelines, ensuring smooth production rollouts and continuous integration/continuous delivery (CI/CD).
  - Develop and enforce best practices around monitoring, logging, and alerting for Kafka/Flink-based systems.
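Monitoring best practices for Kafka typically start with consumer lag: the gap between a partition's log-end offset and the consumer group's committed offset. As a hedged sketch (the offset snapshots here are hypothetical; in production they would come from Kafka's admin and consumer APIs), the core calculation is simple:

```python
def consumer_lag(end_offsets, committed_offsets):
    """Compute per-partition consumer lag: how far the committed offset
    trails the log-end offset. A partition with no committed offset is
    treated as fully lagging."""
    lag = {}
    for partition, end in end_offsets.items():
        committed = committed_offsets.get(partition, 0)  # no commit yet -> full lag
        lag[partition] = max(end - committed, 0)
    return lag

# Hypothetical snapshot for a topic with partitions 0-2
end = {0: 1200, 1: 950, 2: 430}
committed = {0: 1200, 1: 900}
print(consumer_lag(end, committed))
# -> {0: 0, 1: 50, 2: 430}
```

Numbers like these would normally feed a metrics pipeline (e.g., exported to a dashboard with alerting thresholds) rather than be computed ad hoc.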
- Continuous Improvement:
  - Continuously assess the effectiveness of existing streaming data architectures and make recommendations for improvements.
  - Stay current with advancements in the streaming data ecosystem and evaluate new tools and technologies to enhance capabilities.
- Stakeholder Communication:
  - Present technical recommendations and solutions to senior leadership, ensuring alignment with broader business goals.
  - Lead regular meetings with internal teams, including IT, data engineering, and product teams, to drive progress on Kafka and Flink initiatives.
Qualifications
- Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
- At least 8 years of experience in data engineering, software engineering, or a related technical field, including at least 4 years specializing in Kafka and Flink.
- Strong expertise in designing, building, and managing distributed data streaming systems using Apache Kafka and Apache Flink.
- Extensive experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and containerization technologies (e.g., Kubernetes, Docker).
- Deep understanding of stream processing, event-driven architectures, and messaging systems.
- Proven track record in leading complex projects, particularly in high-performance, real-time data processing environments.
- Proficiency in Java, Scala, Python, or similar programming languages commonly used in the Kafka/Flink ecosystem.
- Strong understanding of data modeling, data quality, and governance in streaming systems.
- Experience with CI/CD tools, version control systems (Git), and automation frameworks.
- Strong problem-solving and troubleshooting skills in a distributed system environment.
Desired Skills
- Experience with related technologies such as Kafka Streams, Apache Pulsar, or Apache Beam.
- Knowledge of machine learning models deployed in streaming data pipelines.
- Familiarity with microservices architectures and cloud-native patterns.
- Leadership experience, including managing technical teams or guiding cross-functional project teams.
- Excellent communication skills and the ability to present complex technical concepts to both technical and non-technical stakeholders.
This position offers an exciting opportunity to influence the data architecture strategy of a growing organization while leveraging cutting-edge technologies in real-time data streaming.
Benefits
- Access to training courses, conferences, and certifications for self-development
- Skilled, team-oriented colleagues
- Additional vacation days based on tenure at Ness, plus floating holidays
- Medical insurance
- Attractive compensation scheme and referral bonuses