What are the responsibilities and job description for the Stream Infra Engineer position at Photon?
Greetings from Photon!!
Who are we?
Photon has emerged as one of the world’s largest and fastest-growing Digital Agencies. We work with 40% of the Fortune 100 on their Digital initiatives and are known for our ability to integrate Strategy Consulting, Creative Design, and Technology at scale. For a brief one-minute video about us, watch https://youtu.be/uJWBWQZEA6o.
Position: Streaming Infrastructure Engineer
Location: Mountain View, CA
Job Summary:
We are seeking an experienced Streaming Infrastructure Engineer (Senior Engineer, Analytics) to join our team. In this role, you will design, build, and maintain our real-time analytics infrastructure, working closely with our data scientists, product managers, and other engineers to develop and deploy scalable, efficient, and reliable data pipelines.
Responsibilities:
Design, build, and maintain large-scale streaming data pipelines using technologies such as Apache Beam, Apache Kafka, Amazon Kinesis, Google Cloud Pub/Sub, and Google Cloud Dataflow (a minimal pipeline sketch follows this list)
Develop and implement streaming data processing jobs in Java or Python (experience with both is a plus)
Work with data scientists and product managers to develop and deploy real-time analytics applications
Collaborate with other engineers to integrate streaming data pipelines with our data warehouse and data lake
Work with public cloud providers such as Google Cloud Platform (GCP), Amazon Web Services (AWS), or Microsoft Azure
Monitor and troubleshoot streaming pipelines to ensure high availability and performance
Implement DevOps principles and practices to ensure efficient and reliable deployment of streaming pipelines
Apply containerization technologies such as Docker and Kubernetes to package and deploy pipeline components
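To give candidates a concrete sense of the day-to-day work, here is a minimal sketch of the kind of streaming job this role builds: an Apache Beam (Python) pipeline that reads from a Google Cloud Pub/Sub subscription, counts events in one-minute windows, and appends the counts to BigQuery. The project, subscription, and table names are hypothetical placeholders, not Photon resources.

```python
# A minimal sketch, not a production pipeline. The GCP project,
# Pub/Sub subscription, and BigQuery table names are hypothetical.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(streaming=True)  # run in streaming mode
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/my-sub")
            | "DecodeUTF8" >> beam.Map(lambda msg: msg.decode("utf-8"))
            | "OneMinuteWindows" >> beam.WindowInto(beam.window.FixedWindows(60))
            | "CountPerWindow" >> beam.CombineGlobally(
                beam.combiners.CountCombineFn()).without_defaults()
            | "FormatRow" >> beam.Map(lambda n: {"event_count": n})
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.event_counts",
                schema="event_count:INTEGER",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
        )


if __name__ == "__main__":
    run()
```

The same pipeline runs locally for testing or on Google Cloud Dataflow by switching the pipeline runner, which is one reason Beam features prominently in this role.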
Requirements:
Bachelor's degree in Computer Science or a related field
5 years of experience in software engineering, with a focus on streaming data pipelines and analytics
Strong programming skills in Java or Python
Experience with a public cloud provider, with a focus on GCP
Strong experience with messaging/stream processing systems such as Apache Kafka, Amazon Kinesis, Google Cloud Pub/Sub, and Google Cloud Dataflow (a minimal Kafka consumer sketch appears after this list)
Experience with data warehousing and data lake technologies
Strong understanding of data modeling, data governance, and data security
Excellent problem-solving skills, with the ability to work independently and collaboratively
Excellent communication skills, with the ability to explain complex technical concepts to non-technical stakeholders
Experience with machine learning and data science technologies is a plus
Certification in a public cloud provider or a relevant technology is a plus
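For candidates gauging the expected level of Kafka familiarity, below is a minimal consumer sketch using the kafka-python client. The broker address, topic, and consumer group are hypothetical placeholders; real pipelines at this scale would add error handling, offset management, and monitoring.

```python
# A minimal sketch using the kafka-python client (pip install kafka-python).
# The broker address, topic, and consumer group below are hypothetical.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "events",                              # hypothetical topic
    bootstrap_servers=["localhost:9092"],  # hypothetical broker
    group_id="analytics-demo",             # hypothetical consumer group
    auto_offset_reset="earliest",          # start from the oldest message
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    # Each record exposes its topic, partition, offset, and decoded payload.
    print(message.topic, message.partition, message.offset, message.value)
```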