What are the responsibilities and job description for the BigData Consultant position at Jobsbridge?
Company Description
Hello,
Greetings from Jobsbridge!
Jobsbridge, Inc. is a fast-growing, Silicon Valley-based IT staffing and professional services company specializing in Web, Cloud & Mobility staffing solutions.
Be it core Java, full-stack Java, Web/UI designers, or Big Data, Cloud, and Mobility developers/architects, we have them all.
Job Description
- Responsible for Big Data (1.3) / data warehouse maintenance, cleanup, and monitoring under heavy loads.
- Acts as a lead in identifying and troubleshooting processing issues that impact the timely availability of data in the data warehouse or the delivery of critical reporting within established SLAs.
- Identifies and recommends improvements to production application solutions or operational processes in support of data warehouse applications and business intelligence reporting (e.g., data quality, performance, static and dynamic reports).
- Focuses on the overall stability and availability of the Big Data applications and the associated interfaces and data transport protocols.
- Researches, manages, and coordinates resolution of complex issues through root cause analysis as appropriate.
- Ensures adherence to established problem/incident management, change management, and other internal IT processes.
- Ensures third-party vendors engaged in projects deliver on their responsibilities and conform to Gap's established standards.
- Ensures comprehensive knowledge transition from development teams on new or modified applications moving to ongoing production support.
- Seeks improvement opportunities in design and solution implementation approaches, in partnership with Architects and the Operations team, to ensure the performance and health of the Big Data and other EDW applications.
- Participates in production migrations and upgrades; develops processes for sustaining the Big Data, EDW & BI environment and ensures their implementation.
- Ensures timely and accurate escalation of issues to management.
Technical Skills
- 7 years of technical experience in Enterprise Data Warehouse and Business Intelligence environments.
- 3 years developing and supporting data integration in Hadoop (1.3) across more than 10 clusters is a must.
- 2-3 years developing or supporting applications using Talend, Pig, Hive, Python, Spark, and HBase.
- Advanced experience with real-time data ingestion (Kafka, MQ, etc.) and ingestion using Talend into a Hortonworks or Cloudera environment is a plus (see the illustrative sketch after this list).
- A strong background in technology, data, and big data / data warehouse application design.
- Experience with various databases; Teradata and HBase preferred.
- Extensive experience with production batch scheduling and monitoring (CAWA, Oozie, etc.).
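To give a concrete flavor of the real-time ingestion work mentioned above, here is a minimal, illustrative PySpark Structured Streaming sketch that reads JSON events from a Kafka topic and lands them on HDFS as parquet. All names (broker, topic, schema, paths, job name) are hypothetical, and running it would require the spark-sql-kafka connector package on the classpath; this is a sketch of the technique, not Jobsbridge's or the client's actual pipeline.

# Illustrative sketch only: streams JSON events from a hypothetical Kafka topic
# into parquet files on HDFS, the kind of ingestion pipeline this role supports.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = (SparkSession.builder
         .appName("kafka-to-hdfs-ingest")  # hypothetical job name
         .getOrCreate())

# Hypothetical event schema; a real schema would come from the source system.
schema = StructType([
    StructField("order_id", StringType()),
    StructField("status", StringType()),
    StructField("event_time", TimestampType()),
])

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092")  # hypothetical broker
          .option("subscribe", "orders")                      # hypothetical topic
          .load()
          # Kafka delivers raw bytes; decode and parse the JSON payload.
          .select(from_json(col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

# Append each micro-batch to HDFS; the checkpoint directory lets the job
# resume from the last committed Kafka offsets after a restart.
query = (events.writeStream
         .outputMode("append")
         .format("parquet")
         .option("path", "hdfs:///warehouse/orders_events")      # hypothetical path
         .option("checkpointLocation", "hdfs:///checkpoints/orders")
         .start())
query.awaitTermination()

A batch equivalent built with Hive, Pig, or Talend would follow the same extract-transform-load shape, with a scheduler such as Oozie or CAWA handling the batch orchestration and monitoring the posting calls for.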
Qualifications
Hadoop (1.3), Pig, Hive, Python, Spark, Kafka, MQ, CAWA, Oozie.
Additional Information
Only GC/Citizen, OPT, EAD, and H4 candidates.