What are the responsibilities and job description for the Big Data Design Engineer position at PTR Global?
Big Data Hadoop Administrator - Cloudera
100% Remote
1 year duration
Rate: Open to market on W2; suggested $75-80/hr.
Seeking an experienced Hadoop admin with an understanding of Cloudera.
Job Description:
- At Client, the Big Data Design Engineer is responsible for architecture design and implementation of the Big Data platform, Extract/Transform/Load (ETL) processes, and analytic applications.
- Oversees implementation and ongoing administration of Hadoop infrastructure and systems
- Manages Big Data components/frameworks such as Hadoop, Spark, Storm, HBase, Hadoop Distributed File System (HDFS), Pig, Hive, Sqoop, Flume, Oozie, Avro, etc.
- Analyzes the latest Big Data analytic technologies and innovative applications in both business intelligence analysis and new offerings
- Aligns with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and expand existing environments
- Handles cluster maintenance and creation/removal of nodes using tools like Ganglia, Nagios, and Cloudera Manager Enterprise
- Handles performance tuning of Hadoop clusters and Hadoop MapReduce routines
- Screens Hadoop cluster job performance and handles capacity planning
- Monitors Hadoop cluster connectivity and security
- Manages and reviews Hadoop log files
- Handles HDFS and file system management, maintenance, and monitoring (see the sketch after this list)
- Partners with infrastructure, network, database, application, and business intelligence teams to guarantee high data quality and availability
- Collaborates with application teams to install operating system and Hadoop updates, patches, and version upgrades when required
- Acts as point of contact for vendor escalation
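For illustration only (not part of the posting), the HDFS monitoring duties listed above are often scripted around the standard Hadoop CLI. The minimal Python sketch below shells out to `hdfs dfsadmin -report` and flags high cluster usage; it assumes the `hdfs` binary is on PATH and that the user has permission to run the report, and the helper names and 80% threshold are hypothetical.

```python
"""Minimal HDFS health-check sketch; assumes the `hdfs` CLI is on PATH."""
import re
import subprocess

def hdfs_report() -> str:
    # `hdfs dfsadmin -report` prints cluster-wide and per-DataNode capacity stats.
    return subprocess.run(
        ["hdfs", "dfsadmin", "-report"],
        capture_output=True, text=True, check=True,
    ).stdout

def dfs_used_percent(report: str) -> float:
    # The report contains a line such as "DFS Used%: 42.17%"; the first
    # occurrence covers the whole cluster.
    match = re.search(r"DFS Used%:\s*([\d.]+)%", report)
    if match is None:
        raise ValueError("could not find 'DFS Used%' in dfsadmin report")
    return float(match.group(1))

if __name__ == "__main__":
    used = dfs_used_percent(hdfs_report())
    print(f"Cluster DFS used: {used:.1f}%")
    if used > 80.0:  # illustrative threshold, not from the posting
        print("WARNING: consider adding nodes or cleaning up HDFS")
```

A script like this is typically wired into the monitoring tools the role mentions (Nagios checks or Cloudera Manager alerts) rather than run by hand.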
Qualifications:
- Bachelor's degree in a related field
- Seven (7) years of experience in architecture and implementation of large and highly complex projects
- Experience with Airflow, Argo, Luigi, or a similar orchestration tool (a minimal Airflow sketch follows this list)
- Experience with DevOps principles and CI/CD
- Experience with Docker and Kubernetes
- Experience with NoSQL databases such as HBase, Cassandra, or MongoDB
- Experience with streaming technologies such as Kafka, Flink, or Spark Streaming
- Experience working with the Hadoop ecosystem, building data assets at enterprise scale
- Strong communication skills through written and oral presentations
- Experienced Hadoop Admin with an understanding of Cloudera.
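As a concrete illustration of the orchestration experience requested above (again, not part of the posting), here is a minimal Airflow DAG sketch chaining two routine HDFS maintenance commands. It assumes Airflow 2.4+ for the `schedule` argument; the DAG id, schedule, and balancer threshold are hypothetical.

```python
"""Minimal Airflow DAG sketch; assumes Airflow 2.4+ and the `hdfs` CLI on the worker."""
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="hdfs_nightly_maintenance",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Check filesystem health before doing anything else.
    fsck = BashOperator(
        task_id="hdfs_fsck",
        bash_command="hdfs fsck / -files -blocks | tail -n 20",
    )
    # Rebalance data across DataNodes (the 10% threshold is illustrative).
    balance = BashOperator(
        task_id="hdfs_balancer",
        bash_command="hdfs balancer -threshold 10",
    )
    fsck >> balance  # run the balancer only after the health check succeeds
```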
The specific compensation for this position will be determined by a number of factors, including the scope, complexity, and location of the role as well as the cost of labor in the market; the skills, education, training, credentials, and experience of the candidate; and other conditions of employment. Our full-time consultants have access to benefits including medical, dental, vision, and 401(k) contributions, as well as any PTO, sick leave, and other benefits mandated by the applicable states or localities where you reside or work.
Salary: $70 - $80