What are the responsibilities and job description for the Hadoop Developer - Full Time - Charlotte, NC (Hybrid) position at Avtech Solutions?
Job Details
Job Title: Hadoop Developer
Location: Charlotte, NC (3 days onsite per week)
Mode of employment: Full-Time
Required Skills & Experience
Hadoop/Cloudera: 5-7 years of experience. Responsible for translating the requirements created by functional analysts into the architecture for the solution and describing it through a set of architecture and design documents. Those documents are then used by the rest of the development team to implement the solution. Defining the architecture as a Solutions Architect often involves selecting the most appropriate technology for the problem being solved, as well as assessing impact and technical and operational feasibility. Also responsible for developing the high-level strategy to solve business problems and for ensuring solutions adhere to enterprise standards. May be aligned to an application or a technology stack.
Desired Skills:
Customer-focused mindset; strong interpersonal and influence skills.
Ability to dissect complex issues and leverage/coordinate platform resources to resolve technical issues.
Familiarity with developing within Hadoop platforms.
Understanding of BI tools.
ETL experience with an emphasis on performance and scalability.
Ability to handle multiple simultaneous activities and priorities.
ITSM/Remedy change management knowledge.
Jira/Confluence, Agile.
Exposure to the following technologies:
Spark/Impala
YARN
Hive
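For illustration only (this sketch is not part of the original posting), the snippet below shows the kind of task this exposure implies: a minimal PySpark job that runs on YARN, reads a Hive table, and writes an aggregate back so Impala or BI tools can query it. The database, table, and column names (sales.orders, region, amount) are hypothetical placeholders.

    # Minimal PySpark sketch: query a Hive table via Spark SQL on YARN.
    # All table and column names below are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (
        SparkSession.builder
        .appName("orders-by-region")   # visible in the YARN ResourceManager UI
        .master("yarn")                # submit to the YARN resource manager
        .enableHiveSupport()           # read/write tables in the Hive metastore
        .getOrCreate()
    )

    # Aggregate order amounts per region from a Hive-managed table.
    totals = (
        spark.table("sales.orders")
        .groupBy("region")
        .agg(F.sum("amount").alias("total_amount"))
    )

    # Persist the result back to Hive so Impala or BI tools can query it.
    totals.write.mode("overwrite").saveAsTable("sales.orders_by_region")

    spark.stop()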
Position Summary:
The successful candidate must have general knowledge of the Hadoop ecosystem, including Cloudera, Hive, Impala, Spark, Oozie, Kafka, HBase, etc.
Learn new products/tools/technologies to guide business partners in implementing new Big Data features and functionality
Experience with Agile development practices and the product owner role
Platform capacity management, utilization, and forecasting, holistically from lower lanes (non-production environments) to production for all tenants
The successful candidate will join a team responsible for Hadoop production platform maintenance and support activities, ensuring production platform stability and business continuity.
Provide critical Hadoop platform support, including incident management, change management, and problem management. Develop platform monitoring tools, which requires advanced skills in developing custom diagnostic scripts and in using common support tools such as Splunk, Tableau, Pepperdata, and CM.
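As a hedged sketch of what such a custom diagnostic script might look like (an assumption added for illustration, not part of the posting), the snippet below polls the YARN ResourceManager cluster-metrics REST endpoint and flags memory pressure or a backlog of pending applications. The ResourceManager host name and the alert thresholds are hypothetical.

    # Hedged sketch of a custom diagnostic script: poll the YARN ResourceManager
    # cluster-metrics REST API and flag memory pressure or a backlog of pending
    # applications. The host and thresholds are hypothetical placeholders.
    import sys
    import requests

    RM_METRICS_URL = "http://resourcemanager.example.com:8088/ws/v1/cluster/metrics"
    MAX_PENDING_APPS = 25        # alert if more applications than this are queued
    MIN_FREE_MEMORY_PCT = 10.0   # alert if free cluster memory drops below this

    def check_cluster() -> int:
        metrics = requests.get(RM_METRICS_URL, timeout=10).json()["clusterMetrics"]
        total_mb = metrics["totalMB"]
        free_pct = 100.0 * metrics["availableMB"] / total_mb if total_mb else 0.0
        pending = metrics["appsPending"]

        problems = []
        if pending > MAX_PENDING_APPS:
            problems.append(f"{pending} applications pending (threshold {MAX_PENDING_APPS})")
        if free_pct < MIN_FREE_MEMORY_PCT:
            problems.append(f"only {free_pct:.1f}% of cluster memory free")

        if problems:
            # In practice this alert line would be forwarded to Splunk or a pager.
            print("YARN ALERT: " + "; ".join(problems))
            return 1
        print(f"YARN OK: {pending} pending apps, {free_pct:.1f}% memory free")
        return 0

    if __name__ == "__main__":
        sys.exit(check_cluster())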