What are the responsibilities and job description for the Hadoop Platform Engineer (Admin) || Dallas, TX Onsite || W2 Contract position at V2 Innovations Inc?
Job Details
Location: Dallas, TX (Onsite)
Job Type: Contract (W2 only)
Experience: 9 Years
We are seeking a skilled and proactive Hadoop Platform Engineer (Admin) to join our team in Dallas, TX. The ideal candidate will have a strong background in the Hadoop ecosystem, cluster administration, and distributed computing environments. You will be responsible for managing and optimizing big data infrastructure, ensuring high availability, scalability, and performance of Hadoop clusters in production settings. This is a great opportunity to work on large-scale data systems while collaborating with cross-functional engineering and data science teams in a fast-paced environment.
Key Responsibilities:
- Design, implement, configure, and maintain Hadoop clusters in production environments.
- Monitor, tune, and optimize the performance of HDFS, YARN, MapReduce, Hive, Spark, and HBase components.
- Administer and manage Cloudera Manager or Apache Ambari for cluster management and monitoring.
- Automate cluster operations, backups, and health checks using Bash, Python, or other scripting languages (see the sketch after this list).
- Troubleshoot hardware and software issues impacting data pipeline and ecosystem performance.
- Manage and secure data ingestion pipelines using tools such as Apache Kafka and Apache NiFi.
- Ensure cluster security by implementing Kerberos, TLS, and role-based access control (RBAC).
- Collaborate with DevOps, Data Engineering, and Application teams to support end-to-end data workflows.
- Participate in capacity planning, software upgrades, and disaster recovery planning.
- Document all system configurations, procedures, and ongoing changes.
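The automation duty above is the one item that lends itself to a concrete illustration. Below is a minimal health-check sketch in Python (one of the scripting languages named in this posting). It shells out to the standard hdfs dfsadmin -report command and exits non-zero if any DataNodes are reported dead. The report's exact wording varies by Hadoop version, so the parsing here is an assumption; treat this as a starting point, not a production check.

#!/usr/bin/env python3
"""Minimal HDFS health-check sketch.

Illustrative only: assumes the `hdfs` CLI is on PATH and that
`hdfs dfsadmin -report` prints a "Dead datanodes (N):" line,
which varies by Hadoop version. Not part of the posting itself.
"""
import subprocess
import sys


def dead_datanode_count() -> int:
    # `hdfs dfsadmin -report` summarizes cluster capacity and
    # live/dead DataNodes; we parse only the dead-node count here.
    report = subprocess.run(
        ["hdfs", "dfsadmin", "-report"],
        capture_output=True, text=True, check=True,
    ).stdout
    for line in report.splitlines():
        if line.strip().startswith("Dead datanodes"):
            # Expected form (an assumption): "Dead datanodes (2):"
            return int(line.split("(")[1].split(")")[0])
    return 0  # No dead-node section found; assume healthy.


if __name__ == "__main__":
    dead = dead_datanode_count()
    if dead > 0:
        print(f"ALERT: {dead} dead DataNode(s) reported by dfsadmin")
        sys.exit(1)  # Non-zero exit so cron or monitoring hooks can alert.
    print("HDFS healthy: no dead DataNodes reported")

In practice a check like this would run from cron or from the cluster manager's alerting hooks, alongside similar probes for YARN NodeManagers and HBase RegionServers.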
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related technical discipline (or equivalent work experience).
- 9 years of experience in IT, including 5 years in Hadoop platform administration.
- Expertise in the Hadoop ecosystem, including HDFS, YARN, Hive, Spark, HBase, and MapReduce.
- Proficiency in Linux/Unix system administration and shell scripting.
- Hands-on experience with Cloudera or Hortonworks distributions.
- Strong skills in scripting languages such as Bash, Python, or Perl.
- Familiarity with SQL and database management concepts.
- Working experience with Apache Kafka, Apache NiFi, and distributed data ingestion.
- Understanding of data security practices and compliance frameworks.
- Proven ability to diagnose complex issues across the big data stack.
- Strong written and verbal communication skills and a collaborative mindset.
Certifications:
- Cloudera Certified Administrator for Apache Hadoop (CCAH)
- Hortonworks Certified Administrator (HCA)
- Linux certification (RHCE, LPIC, etc.)
Contract Terms:
- Contract
- W2 only
- No C2C / no third parties