What are the responsibilities and job description for the Data Engineer position at Synergy Interactive?
Data Engineer
Hybrid
1-2 days onsite per week in NYC
$180-200K + 10-15% (negotiable)
We are seeking a highly motivated and skilled Data Engineer to join our team and play a critical role in the development and maintenance of our Market and Credit Counterparty Risk Platform. In this role, you will be responsible for designing, building, and optimizing data pipelines to support critical risk calculations and reporting. You will work closely with business stakeholders to understand their requirements and translate them into robust, scalable data solutions.
Key Responsibilities:
- Design, develop, and maintain data pipelines using Python and AWS services (e.g., S3, EMR, Glue, Redshift/Snowflake); an illustrative sketch of this kind of work appears after this list.
- Implement and optimize data ingestion, transformation, and loading processes.
- Develop and maintain data quality checks and monitoring systems.
- Work with business stakeholders to understand their data needs and translate them into technical requirements.
- Collaborate with other engineers and data scientists to build and improve our risk platform.
- Participate in all phases of the software development lifecycle, including design, development, testing, and deployment.
- Investigate and resolve data quality issues and performance bottlenecks.
- Stay up-to-date with the latest data engineering technologies and best practices.
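For a sense of the day-to-day work, a minimal ingest-and-validate step of the kind described above might look roughly like the sketch below. The bucket, key, column names, and thresholds are hypothetical placeholders rather than part of the actual platform, and in practice a step like this would typically be scheduled by an orchestration tool such as Airflow.

```python
import io

import boto3
import pandas as pd

# Hypothetical locations -- the real platform's buckets and keys will differ.
SOURCE_BUCKET = "example-risk-raw"
SOURCE_KEY = "positions/2024-01-01/positions.csv"


def extract_positions(bucket: str, key: str) -> pd.DataFrame:
    """Download a raw positions file from S3 and load it into a DataFrame."""
    body = boto3.client("s3").get_object(Bucket=bucket, Key=key)["Body"].read()
    return pd.read_csv(io.BytesIO(body))


def check_quality(df: pd.DataFrame) -> list[str]:
    """Return a list of data-quality problems; an empty list means the batch passes."""
    if df.empty:
        return ["no rows ingested"]
    problems = []
    # Column names below (trade_id, notional) are illustrative only.
    if df["trade_id"].duplicated().any():
        problems.append("duplicate trade_id values")
    if df["notional"].isna().mean() > 0.01:  # tolerate at most 1% missing notionals
        problems.append("more than 1% of notionals are missing")
    return problems


if __name__ == "__main__":
    positions = extract_positions(SOURCE_BUCKET, SOURCE_KEY)
    issues = check_quality(positions)
    if issues:
        raise ValueError(f"data quality check failed: {issues}")
```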
Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven professional experience as a Data Engineer.
- Strong proficiency in Python.
- Experience with AWS services (e.g., S3, EMR, Glue, Redshift/Snowflake).
- Experience with a data orchestration tool (e.g., Airflow, Apache NiFi).
- Experience with data warehousing and data modeling concepts.
- Experience with Agile development methodologies.
- Excellent communication and collaboration skills.
- Strong problem-solving and analytical skills.
Preferred Qualifications:
- Experience with Market and Credit Counterparty Risk.
- Experience with large-scale, global data platforms.
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Experience with containerization technologies (e.g., Docker, Kubernetes).
- Experience with data quality and lineage tools.
Salary: $180,000 - $200,000