What are the responsibilities and job description for the Big Data Design Engineer position at Ryan Consulting Group, LLC?
Note: All candidates must be able to work as a W2 or 1099 employee for any employer in the US.
(The role is not eligible for those requiring sponsorships now or potentially in the future.)
Title: Big Data Design Engineer
Type of role: Contract (12-month assignment)
Location: Fully remote, no travel, candidate is expected to work CST hours
Compensation: $65/hr – $85/hr (based on relevant experience)
Job Description: The Big Data Design Engineer is responsible for architecture design and implementation of the Big Data platform, Extract/Transform/Load (ETL) processes, and analytic applications.
Primary Responsibilities
- Oversees implementation and ongoing administration of Hadoop infrastructure and systems. Manages Big Data components/frameworks such as Hadoop, Spark, Storm, HBase, Hadoop Distributed File System (HDFS), Pig, Hive, Sqoop, Flume, Oozie, Avro, etc.
- Analyzes latest Big Data analytic technologies and innovative applications in both business intelligence analysis and new offerings.
- Aligns with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and expand existing environments.
- Handles cluster maintenance and creation/removal of nodes using tools like Ganglia, Nagios, Cloudera Manager Enterprise.
- Handles performance tuning of Hadoop clusters and Hadoop MapReduce routines. Screens Hadoop cluster job performance and handles capacity planning.
- Monitors Hadoop cluster connectivity and security.
- Manages and reviews Hadoop log files.
- Handles HDFS and file system management, maintenance, and monitoring.
- Partners with infrastructure, network, database, application, and business intelligence teams to guarantee high data quality and availability.
- Collaborates with application teams to install operating system and Hadoop updates, patches, and version upgrades when required. Acts as point of contact for vendor escalation.
This position is exempt from timekeeping requirements under the Fair Labor Standards Act and is not
eligible for overtime pay.
Requirements
- Bachelor's degree in a related field
- Seven (7) years of experience in architecture and implementation of large and highly complex projects
- Must have Cloudera Admin skills (e.g., understanding error messages, configuring and setting up clusters, etc.)
- Must have experience maintaining and navigating Cloudera on Hadoop
- Must have good communication skills and experience working collaboratively within a highly technical support environment.
- Experience with YARN
- Knowledge of Hive and Impala
Skills and Competencies
- Experience with Airflow, Argo, Luigi, or a similar orchestration tool
- Experience with DevOps principles and CI/CD
- Experience with Docker and Kubernetes
- Experience with NoSQL databases such as HBase, Cassandra, or MongoDB
- Experience with streaming technologies such as Kafka, Flink, or Spark Streaming
- Experience working with the Hadoop ecosystem building Data Assets at an enterprise scale
- Strong communication skills through written and oral presentations