What are the responsibilities and job description for the Senior Data Engineer (Java + Python for Data Analytics) - Onsite @ NYC - Only Locals Needed position at Cyber Sphere LLC?
Job Details
Hi,
Hope you are doing well.
This is Srikar, a Senior Technical Recruiter. I have a position for a Senior Data Engineer (Java + Python for Data Analytics); please review the job details below and let me know if you're interested.
Title: Senior Data Engineer (Java + Python for Data Analytics)
Location: Onsite @ NYC - Only Locals
Duration: Long term
Job Description:
Role Description
Java and Python with at least 10 years of experience, plus engineering experience with Snowflake, Databricks, or similar platforms.
Senior Data Engineer (Java + Python for Data Analytics)
Job Summary:
We are looking for a Senior Data Engineer with expertise in Java and Python for data analytics, data engineering, and large-scale data processing.
The ideal candidate will develop, optimize, and maintain data pipelines, ETL/ELT processes, and analytics workflows using Snowflake, Databricks, or similar platforms.
Key Responsibilities:
Develop scalable data processing applications using Java and Python.
Design, build, and optimize ETL/ELT pipelines to support data analytics and reporting.
Work with Snowflake, Databricks, or similar data platforms to process and transform large datasets.
Implement real-time and batch data processing solutions.
Collaborate with data scientists, analysts, and software engineers to enable advanced analytics and AI-driven solutions.
Optimize database queries, ensure data quality, and improve data pipeline efficiency.
Deploy, monitor, and troubleshoot data workflows in a cloud-based environment (AWS, Azure, or Google Cloud Platform).
Ensure compliance with data security, governance, and best practices in data management.
Required Skills & Qualifications:
10 years of experience in software development and data engineering.
Strong programming skills in Java and Python for data processing and analytics.
Hands-on experience with Snowflake, Databricks, or similar cloud data platforms.
Expertise in SQL, ETL/ELT pipelines, and distributed data processing.
Experience with big data frameworks like Apache Spark, Hadoop, or Kafka.
Strong understanding of data modeling, data warehousing, and performance tuning.
Familiarity with cloud platforms (AWS, Azure, Google Cloud Platform) and CI/CD pipelines.
Excellent problem-solving skills and ability to work in a cross-functional team.