What are the responsibilities and job description for the Big Data Engineer (In-Person Interview at NY/AZ) position at Analytics Solutions?
Job Details
Note: Interviews are conducted onsite (in person) at the New York or Phoenix, Arizona location, and the client will not pay any travel expenses.
Role: Big Data Engineer (Passport Number Required)
Location: New York & Phoenix, Arizona (In Person Interview & Onsite Work)
Skills: Big Data; ETL - Big Data / Data Warehousing; Google Cloud Platform; Java
Contract C2C Job
Job Description:
We are looking for a highly skilled engineer with solid experience building real-time Big Data pipelines on Google Cloud Platform and REST APIs with Java frameworks.
The engineer will play a crucial role in designing, implementing, and optimizing data solutions to support our organization's data-driven initiatives.
This role requires expertise in data engineering, strong problem-solving abilities, and a collaborative mindset to work effectively with various stakeholders.
This role focuses on delivering innovative solutions that meet the needs of our business.
As an agile team, we work closely with our business partners to understand their requirements, and we strive to continuously improve as a team.
Technical Skills
- Core Data Engineering Skills: Proficiency with Google Cloud Platform's big data tools:
  - BigQuery: for data warehousing and SQL analytics.
  - Dataproc: for running Spark and Hadoop clusters.
  - Dataflow: for stream and batch data processing (high-level understanding).
  - Pub/Sub: for real-time messaging and event ingestion (high-level understanding).
  Expertise in building automated, scalable, and reliable pipelines using custom Python/Scala solutions or Cloud Data Functions.
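The streaming tools above (Pub/Sub feeding Dataflow) reduce, at their core, to grouping an unbounded event stream into bounded batches for processing. A minimal sketch of that micro-batching pattern in plain Java, with no GCP client libraries; the class name, batch size, and event strings are illustrative only:

```java
import java.util.ArrayList;
import java.util.List;

public class MicroBatcher {
    // Groups an ordered list of events into fixed-size batches,
    // the basic pattern behind windowed stream ingestion.
    public static List<List<String>> batch(List<String> events, int size) {
        List<List<String>> batches = new ArrayList<>();
        for (int i = 0; i < events.size(); i += size) {
            // subList is a view; copy it so each batch is independent
            batches.add(new ArrayList<>(events.subList(i, Math.min(i + size, events.size()))));
        }
        return batches;
    }

    public static void main(String[] args) {
        System.out.println(batch(List.of("e1", "e2", "e3", "e4", "e5"), 2));
        // [[e1, e2], [e3, e4], [e5]]
    }
}
```

In a real Dataflow pipeline this grouping is expressed with windowing and triggers rather than hand-rolled loops, but the underlying idea is the same.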
- Programming and Scripting: Strong coding skills in SQL and Java. Familiarity with APIs and SDKs for Google Cloud Platform services to build custom data solutions.
- Cloud Infrastructure: Understanding of Google Cloud Platform services such as Cloud Storage, Compute Engine, and Cloud Functions. Familiarity with Kubernetes (GKE) and containerization for deploying data pipelines (optional but good to have).
- DevOps and CI/CD: Experience setting up CI/CD pipelines using Cloud Build, GitHub Actions, or other tools. Familiarity with monitoring and logging tools such as Cloud Monitoring and Cloud Logging for production workflows.
- Backend Development (Spring Boot & Java):
  - Design and develop RESTful APIs and microservices using Spring Boot.
  - Implement business logic, security, authentication (JWT/OAuth), and database operations.
  - Work with databases such as MySQL, PostgreSQL, MongoDB, and Cloud SQL.
  - Optimize backend performance, scalability, and maintainability.
  - Implement unit testing and integration testing.
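As a rough illustration of the JWT handling mentioned above: a JWT is three Base64URL-encoded segments (header.payload.signature), and the payload can be inspected with the JDK alone. This sketch deliberately omits signature verification, which a real Spring Boot service would delegate to a library such as Spring Security; the class and claim names here are illustrative, not from any framework API:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class JwtPayload {
    // Extracts and decodes the payload (second segment) of a JWT.
    // NOTE: this does NOT verify the signature; production services
    // must validate tokens with a proper security library.
    public static String decodePayload(String jwt) {
        String[] parts = jwt.split("\\.");
        if (parts.length != 3) {
            throw new IllegalArgumentException("not a JWT: expected 3 segments");
        }
        return new String(Base64.getUrlDecoder().decode(parts[1]), StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        // Build a toy unsigned token purely for illustration.
        Base64.Encoder enc = Base64.getUrlEncoder().withoutPadding();
        String header = enc.encodeToString("{\"alg\":\"none\"}".getBytes(StandardCharsets.UTF_8));
        String payload = enc.encodeToString("{\"sub\":\"user-42\"}".getBytes(StandardCharsets.UTF_8));
        String token = header + "." + payload + ".sig";
        System.out.println(decodePayload(token)); // {"sub":"user-42"}
    }
}
```

A Spring Boot service would typically do this inside an authentication filter and reject any token whose signature or expiry check fails.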
Soft Skills
- Innovation and Problem-Solving: Ability to think creatively and design innovative solutions for complex data challenges. Experience prototyping and experimenting with cutting-edge Google Cloud Platform tools or third-party integrations. Strong analytical mindset to transform raw data into actionable insights.
- Collaboration and Teamwork: Ability to collaborate effectively with data analysts and business stakeholders.
Communication: Strong verbal and written communication skills to explain technical concepts to non-technical audiences.
- Adaptability and Continuous Learning: Openness to exploring new Google Cloud Platform features and rapidly adapting to change.