What are the responsibilities and job description for the Big Data/Hadoop Administrator position at Sonsoft Inc?
Company Description
Sonsoft, Inc. is a USA-based corporation duly organized under the laws of the State of Georgia. Sonsoft Inc. is growing at a steady pace, specializing in the fields of Software Development, Software Consultancy, and Information Technology Enabled Services.
Job Description
Required:
• Must have at least 7 years of experience as a Hadoop administrator.
• Must have experience building and administering an open-source Hadoop environment or product without a licensed distribution, warranty, or support.
• Must have experience implementing and administering a Hadoop environment using distributions such as Cloudera Express (preferred), Apache, or Hortonworks.
• Must have solid UNIX or Linux experience, preferably with Red Hat.
• Must have experience supporting high-availability and real-time streaming environments.
• General operational expertise, such as good troubleshooting skills and an understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networks.
• Experience with Hadoop ecosystem tools such as MapReduce, Hive, Pig, Spark, Kafka, Oozie, etc. is an advantage.
• Knowledge of troubleshooting core Java applications is a plus.
• Good to have knowledge of Puppet/Chef/Ansible.
• Good to have knowledge of or experience supporting MySQL and NoSQL databases such as Cassandra and HBase.
Responsibilities:
• Build the Hadoop platform and infrastructure from scratch.
• Align with development and architecture teams to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments.
• Create the security layer for the Hadoop environment.
• Work with data delivery teams to set up new Hadoop users, including access to HDFS, Hive, Pig, MapReduce, etc.
• Perform cluster maintenance, as well as creation and removal of nodes, using tools like Cloudera Manager Express or other open-source tools.
• Screen Hadoop cluster job performance and conduct capacity planning.
• Monitor Hadoop cluster connectivity and security.
• Manage and review Hadoop log files.
• Perform file system management and monitoring.
• Provide HDFS support and maintenance.
• Build a Hadoop infrastructure that guarantees high data quality and availability.
• Collaborate with system administrators to install operating system and Hadoop updates, patches, and version upgrades when required.
• Interact with business users, Enterprise Architects, and Technical Leads to gather requirements.
• Experience building or administering a Hadoop cluster with HDFS, Spark, Kafka, ZooKeeper, Impala, Hive, YARN, Hue, Oozie, etc.
• Experience setting up high-availability and disaster-recovery infrastructure across different data centers.
• Experience building or administering a real-time streaming environment.
Qualifications
Qualifications Basic:
• Bachelor's degree or foreign equivalent required from an accredited institution. Three years of progressive experience in the specialty will also be considered in lieu of every year of education.
Additional Information
U.S. citizens and those authorized to work in the U.S. are encouraged to apply. We are unable to sponsor at this time.
Note:
- This is a full-time job opportunity.
- Only US Citizens, Green Card holders, and GC-EAD, H4-EAD, L2-EAD, and TN Visa holders can apply.
- No OPT-EAD or H1B for this position.
- Please mention your email ID in your email or resume.