What are the responsibilities and job description for the Big Data/Hadoop Consultant position at Menzo Technologies?
Job Description
Big Data/Hadoop Consultant
Start Date: ASAP
Location: Chevy Chase, MD
Duration: 12-24 Month Contract
Mode of Interview: Phone and Skype
Required Skills
· 3 years of hands-on experience with the Hadoop ecosystem (HDFS, YARN, MapReduce, Oozie, and Hive)
· 1 year of hands-on experience with Spark Core and Spark SQL
· 5 years of hands-on programming experience in either core Java or Spark
· 3 years of hands-on experience with data warehousing, data marts, data/dimensional modeling, and ETL
· 1 year of hands-on experience with HBase, Cassandra, or another NoSQL database
· Understanding of distributed computing design patterns, algorithms, data structures, and security protocols
Desired Skills
· Understanding of Kafka and Spark Streaming
· Experience with at least one ETL tool such as Talend, Kettle, Informatica, or Ab Initio
· Exposure to Hadoop or NoSQL performance optimization and benchmarking using tools such as HiBench or YCSB
· Experience with performance monitoring tools such as Ganglia, Nagios, Splunk, or Dynatrace
· Experience with continuous build and test processes using tools such as Maven and Jenkins
· Hortonworks or Cloudera certification preferred but not mandatory
Additional Information
All your information will be kept confidential according to EEO guidelines.