What are the responsibilities and job description for the Sr Data Engineer position at Kaizen Analytix?
Job Description
We are seeking a highly skilled Senior Data Engineer to support our project team in the development, enhancement, and productionalization of a Dynamic Pricing Solution.
This role will focus on turning an existing functional prototype into a “production quality” solution. This involves designing, developing, and optimizing a scalable data architecture, implementing best practices in SDLC and CI/CD, and facilitating the migration of the solution to AWS and Databricks.
The ideal candidate will possess extensive experience in data engineering, data modeling, cloud platforms, and big data technologies, with a strong background in software development and automation. Knowledge of and experience with Revenue Management and Pricing solutions, or other machine learning solutions, is preferred.
Key Responsibilities
- System Development & Architecture
- Provide broad support for the creation of a scalable, production-grade system and data model.
- Gather and document requirements for the Dynamic Pricing Solution in collaboration with the project team.
- Support the design of the Revenue Management System (RMS) solution architecture for the Dynamic Pricing Solution.
- Develop and refine the solution to ensure it aligns with production standards.
- Development
- Support the migration of the existing solution to AWS and Databricks.
- Refactor code to improve modularity, performance, and scalability.
- Conduct regression testing to ensure system stability and accuracy.
- Ensure the security and reliability of the data in the warehouse.
- Automation & Best Practices
- Automate data feeds and pipelines to improve efficiency and reliability (see the illustrative sketch after this list).
- Implement SDLC and CI/CD best practices in alignment with Project guidelines.
- Ensure compliance with data governance, security, and operational standards.
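For illustration only, below is a minimal PySpark sketch of the kind of automated data feed described above; the paths, table names, and columns are hypothetical and would depend on the actual Dynamic Pricing data model and Databricks environment.

```python
# Minimal sketch of an automated daily data feed in PySpark.
# All paths, table names, and columns are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("pricing_feed").getOrCreate()

# Read the latest raw booking extract (path is illustrative only)
raw = spark.read.parquet("s3://example-bucket/raw/bookings/")

# Basic cleansing and conforming to the pricing data model
clean = (
    raw.dropDuplicates(["booking_id"])
       .withColumn("booking_date", F.to_date("booking_ts"))
       .filter(F.col("price").isNotNull())
)

# Write to a Delta table that downstream pricing jobs consume
(clean.write
      .format("delta")
      .mode("overwrite")
      .partitionBy("booking_date")
      .saveAsTable("pricing.bookings_clean"))
```

In production, a job like this would typically be scheduled (for example, as a Databricks job) and wired into the CI/CD process described above.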
- Education & Experience:
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
- 5 years of experience in data engineering, big data technologies, and cloud-based platforms.
- Experience with Dynamic Pricing Solutions or Revenue Management Systems (RMS).
- Knowledge of machine learning pipelines and MLOps.
- Technical Skills:
- Strong programming skills in SQL, Python, and PySpark.
- Deep expertise in cloud technologies: AWS, Databricks, Snowflake.
- Experience with GitHub.
- Hands-on experience with ETL pipelines, data modeling, and automation.
- Proficiency in CI/CD, DevOps, and SDLC best practices (see the illustrative test sketch at the end of this description).
- Soft Skills:
- Strong communication and collaboration abilities to work cross-functionally with IT and business teams.
- Excellent problem-solving and analytical skills.
- Ability to manage multiple tasks and work in a fast-paced environment.
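As a purely illustrative complement to the regression testing and CI/CD items above, here is a minimal pytest-style sketch of a regression check for a pricing transformation; apply_discount and its expected values are hypothetical placeholders, not part of the actual solution.

```python
# Minimal pytest-style regression sketch for a pricing transformation.
# apply_discount and the expected outputs are hypothetical placeholders.
import pytest

def apply_discount(base_price: float, discount_pct: float) -> float:
    """Return the discounted price, rounded to cents."""
    return round(base_price * (1 - discount_pct / 100), 2)

@pytest.mark.parametrize(
    "base_price, discount_pct, expected",
    [
        (100.00, 0, 100.00),    # no discount
        (100.00, 15, 85.00),    # standard discount
        (200.00, 25, 150.00),   # quarter-off discount
    ],
)
def test_apply_discount(base_price, discount_pct, expected):
    assert apply_discount(base_price, discount_pct) == expected
```

Tests of this kind would normally run automatically in the CI/CD pipeline (for example, on each pull request) so that pricing logic changes are validated before release.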