What are the responsibilities and job description for the Hadoop Engineer position at Montek System?
Job Details
Hadoop Engineer
Duration: 12 months
Location: Phoenix, AZ or Charlotte, NC
Hybrid: 3 days per week on-site
Experience Required
- 10 years of overall experience
Roles & Responsibilities
- Hands-on experience with Cloudera Data Platform (CDP) and good knowledge of data migration from on-prem Hortonworks/MapR to CDP (public/private) cloud
- Sound knowledge of CDP and big data technologies (Hortonworks/MapR); working knowledge of Scala is good to have.
- Knowledge of the PySpark framework is an advantage.
- Good script-writing experience with Ansible, Shell, and Unix.
- Extensive experience securing clusters and storage, including encryption.
- Support team members by helping them understand projects and requirements, and guide them toward an optimized solution.
- Team player with a proven track record of working in teams of various sizes and performing cross-functional roles.
Technical/Functional Skills
Primary Skills:
- Cloudera Data Platform (CDP), CDS, Kubernetes & Docker services
- Big Data: Hadoop
- Spark, Scala, PySpark, HDFS
- Scripting Languages: Batch Script, Shell Script, Python, Ansible & Terraform
- EPL, OpenShift
Additional Skills:
- Microsoft Stack: MS-SQL with strong knowledge in RDBMS concepts
- Agile, Scrum, Jira, Git, SVN, Liquibase, ServiceNow
- Airflow, Autosys, Visio.
Database Skills:
- Postgres, MySQL, Teradata, Oracle.