What are the responsibilities and job description for the Big Data Architect position at EROS Technologies Inc.?
Company Description
EROS Technologies was founded with a simple motive: to offer clients exactly what they want, how they want it, and when they want it. By leveraging its technological edge and right-sourcing advantage for its clients, EROS has in a short period of time grown into one of the most trusted strategic technology partners. Treating every client as the top priority, we customize our solutions and services to align with each client's unique needs.
Job Description
FYI,
Requirement ID: 3249 (Please reference this Req ID when you send us resumes)
Role: Big Data Architect
Location: NYC, NY
Start Date: Earliest available
Experience: 10+ years
Rate Range: Based on experience
Contract Duration: 6 months to 1 year (may be extended based on performance)
Number of Open Positions: 2
Responsibilities (including but not limited to):
• Develop proposals for the design and implementation of scalable big data solutions.
• Participate in customer workshops and present the proposed solution.
• Design, architect, and implement complex projects handling considerable data volumes (GB to PB scale).
• Provide deployment solutions based on customer needs, drawing on sound knowledge of clustered deployment architectures.
• Guide and partner with VPs and Directors on big data solutions.
• Design, implement, and deploy high-performance, custom applications at scale on Hadoop.
• Review and audit existing solutions, designs, and system architectures.
• Profile and troubleshoot existing solutions.
• Create technical documentation.
Desired Skills and Experience:
• Lead role with at least 2 years of implementation experience in any one Hadoop distribution and its ecosystem.
• Experience with big data technologies and frameworks including but not limited to Hadoop, MapReduce, Pig, Hive, HBase, Oozie, Mahout, Flume, ZooKeeper, MongoDB, and Cassandra.
• Knowledge of cluster sizing and infrastructure planning.
• MongoDB development experience is a must, including hands-on experience integrating RDBMS systems with Hadoop and MongoDB via stream processing with well-known tools such as Storm and Kafka.
• Expertise in Java/J2EE, with current hands-on experience.
• Working knowledge of Sqoop and Flume for data processing.
• In-depth knowledge of, and experience implementing, various Java/J2EE/EAI patterns using open source products.
• Should have worked on open source products and contributed back to them.
Please reach out to me if you need any further information.
Regards,
Sunita Jha
Talent Acquisition Executive
408-872-4112
Additional Information
All your information will be kept confidential according to EEO guidelines.