What are the responsibilities and job description for the Senior/Staff Backend Engineer - Kafka position at Oscilar?
Senior/Staff Backend Engineer - Kafka
Location
Remote (US/Canada/EU)
Background
We are looking for a Senior/Staff Backend Engineer with deep expertise in backend development. In this role, you will design, implement, and optimize services that leverage Apache Kafka to handle high-throughput, real-time data streams. You will also be responsible for scaling and maintaining data stores such as Postgres, Redis, DynamoDB, and ClickHouse, all within a cloud-based AWS infrastructure.
This is a senior technical leadership role in which you will collaborate across teams, mentor engineers, and drive the scalability, performance, and reliability of Oscilar’s backend systems.
About Oscilar
Oscilar is redefining risk decisioning and fraud prevention with scalable, secure, and high-performance technology. We empower global businesses by providing cutting-edge solutions designed to process and analyze real-time data streams with unparalleled reliability and speed.
What We Offer
- Opportunity to work on cutting-edge technology in the fintech and fraud prevention space.
- Collaborative environment with a team of brilliant engineers, data scientists, and security experts.
- Competitive salary and benefits package.
- Professional growth and learning opportunities.
- Flexible work arrangements, including remote work options.
Responsibilities
- Design, develop, and maintain scalable backend services using Java and AWS technologies.
- Lead the architecture, deployment, and optimization of Apache Kafka to support real-time data streaming across distributed systems.
- Build and manage Kafka topics, brokers, producers, and consumers, ensuring optimal performance and data consistency.
- Implement streaming solutions with Kafka Streams and Kafka Connect, focusing on high availability and low-latency processing.
- Collaborate with product, frontend, and data engineering teams to define technical requirements and deliver reliable, performant services.
- Design and maintain high-performance data storage solutions using Postgres, Redis, ClickHouse, and DynamoDB.
- Optimize database performance through schema design, indexing strategies, and resource partitioning.
- Implement best practices for infrastructure security, performance monitoring, and data integrity.
- Establish and maintain CI/CD pipelines for automated testing, deployment, and monitoring.
- Provide mentorship to junior engineers, conduct code reviews, and promote best practices in software development.
- Proactively identify and resolve performance bottlenecks and technical challenges in both streaming and database systems.
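One theme running through these responsibilities is keeping data consistent when Kafka delivers a record more than once (after a retry or consumer rebalance). A common answer is an idempotent handler; the sketch below is illustrative only, with hypothetical names, not code from any Oscilar system.

```java
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Sketch: an idempotent event handler for at-least-once Kafka delivery.
// A duplicate delivery of the same event ID leaves state unchanged.
public class IdempotentHandler {
    private final Set<String> seenEventIds = new HashSet<>();
    private long applied = 0;

    // Apply an event only if its ID has not been processed before.
    // Returns true when the event was applied, false for a duplicate.
    public boolean handle(String eventId) {
        if (!seenEventIds.add(eventId)) {
            return false; // duplicate delivery: skip, state is unchanged
        }
        applied++; // stand-in for the real side effect (e.g. a DB write)
        return true;
    }

    public long appliedCount() {
        return applied;
    }

    public static void main(String[] args) {
        IdempotentHandler h = new IdempotentHandler();
        // Simulate "evt-1" being redelivered after a consumer restart.
        for (String id : List.of("evt-1", "evt-2", "evt-1")) {
            h.handle(id);
        }
        System.out.println("applied " + h.appliedCount() + " of 3 deliveries");
    }
}
```

In production the seen-ID set would live in durable storage (e.g. a Postgres table or Redis key with a TTL) rather than in memory, so deduplication survives restarts.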
Technical Expertise
- Backend Development: 8 years of experience with Java in large-scale, distributed environments.
- Kafka Mastery: Extensive experience with Apache Kafka, including Kafka Streams, Kafka Connect, partitioning, replication, and consumer group management.
- Cloud Infrastructure: Strong experience with AWS services (e.g., MSK, EC2, RDS, DynamoDB, S3, Lambda).
- Distributed Systems: Solid understanding of distributed system design, messaging patterns, and eventual consistency.
- Performance Optimization: Proven ability to diagnose and resolve bottlenecks in streaming and database systems.
- Experience integrating Kafka with analytics solutions like ClickHouse.
- Knowledge of event-driven architecture and streaming patterns like CQRS and event sourcing.
- Hands-on experience with monitoring tools (e.g., Prometheus, Grafana, Kafka Manager).
- Experience automating infrastructure with tools like Terraform or CloudFormation.
- Proficiency with Postgres, Redis, ClickHouse, and DynamoDB. Experience with data modeling, query optimization, and high-transaction databases.
- Familiarity with encryption, role-based access control, and secure API development.
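The partitioning and consumer-group points above rest on one idea: records with the same key always map to the same partition, which preserves per-key ordering. The sketch below shows that mapping in simplified form (Kafka's default partitioner actually uses murmur2 hashing; this stand-in is for illustration only).

```java
// Sketch: key-based partition assignment, a simplified stand-in for
// Kafka's default partitioner (which hashes keys with murmur2).
public class KeyPartitioner {
    // Map a record key to one of numPartitions partitions. The mapping is
    // deterministic, so all records with the same key land on the same
    // partition and keep their relative order.
    public static int partitionFor(String key, int numPartitions) {
        if (numPartitions <= 0) {
            throw new IllegalArgumentException("numPartitions must be positive");
        }
        // Mask off the sign bit so the result is non-negative
        // (String.hashCode can return negative values).
        return (key.hashCode() & 0x7fffffff) % numPartitions;
    }

    public static void main(String[] args) {
        int p1 = partitionFor("user-42", 6);
        int p2 = partitionFor("user-42", 6);
        System.out.println("user-42 -> partition " + p1 + " (stable: " + (p1 == p2) + ")");
    }
}
```

One consequence worth knowing for interviews: changing the partition count changes this mapping, which is why repartitioning an existing keyed topic breaks per-key ordering guarantees.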
To apply, email your resumé to eng-careers@oscilar.com.