What are the responsibilities and job description for the Senior Big Data Developer position at 360 IT Professionals?
Company Description
We focus on delivering effective business staffing services through cost-effective solutions. We have a strong foundation in both legacy and emerging technologies, with an excellent track record of on-time delivery. We are leaders in providing custom IT services, with a proficient approach to developing mobile-based and web-based applications, and we are emerging as one of the largest private talent sourcing and management firms in the US.
Our client, one of the leading ICT-for-development (ICT4D) organizations, which provides low-cost ICT solutions to tackle poverty and overcome disadvantage while working closely with local communities, seeks an accomplished Senior Big Data Developer.
Job Description
Job Role: Senior Big Data Developer
Job Location: Deerfield, IL 60015
Project Duration: Long-term, full-time
Visa: H1B transfer, US citizen, or green card (GC)
Salary: Base salary + relocation expenses + benefits
Primary Responsibilities:
- Work on tasks assigned by Technical Architects related to a proof-of-concept in the area of big data.
- Implement a series of use cases against a given technology stack, which includes HDInsight, Azure Event Hub (or Kafka), Azure Data Factory (or Talend), HBase, Hive, Pig, Sqoop, and Flume (an illustrative ingest sketch follows this list).
- Recommend, design and code efficient and effective solutions for challenging problems for large work efforts of medium to high complexity.
- Collaborate with architects and developers and present alternative approaches to complex problems.
- Document techniques/approaches, process flow diagrams and data models as required.
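To give a concrete flavor of the ingest side of these use cases, below is a minimal Java sketch of a Kafka producer publishing one JSON event. The broker address, topic name ("events"), key, and payload are all hypothetical placeholders; Azure Event Hub also exposes a Kafka-compatible endpoint, so a similar producer could target either service.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class IngestProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Hypothetical broker address; an Event Hub Kafka endpoint could go here instead.
            props.put("bootstrap.servers", "localhost:9092");
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // The topic "events", the key, and the JSON payload are illustrative only.
                producer.send(new ProducerRecord<>("events", "device-1", "{\"temp\": 21.5}"));
            }
        }
    }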
Required Skills:
- 5+ years of overall experience, including at least 2 years with big data.
- 3+ years with Java, J2EE, and databases such as Oracle and/or SQL Server.
- 2+ years of experience developing systems/applications with the Hadoop ecosystem, including Hadoop (Hortonworks or Cloudera), MapReduce, Pig, Hive, Sqoop, and Flume.
- 1+ year of experience with Spark, Spark Streaming or Storm, and Kafka (a Spark Streaming sketch follows this list).
- Experience with any ETL tool.
- Experience troubleshooting performance issues, including SQL tuning.
- Experience with shell scripting in Unix, Perl, or Python.
- Experience with NoSQL database technologies such as Cassandra or HBase.
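As an illustration of the Spark Streaming and Kafka requirement above, here is a minimal Java sketch using the Spark Streaming kafka-0-10 integration: it subscribes to a topic and counts records per micro-batch. The broker address, group id, and topic name are placeholder assumptions, and local[2] is used only so the sketch runs standalone.

    import java.util.Collections;
    import java.util.HashMap;
    import java.util.Map;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.spark.SparkConf;
    import org.apache.spark.streaming.Durations;
    import org.apache.spark.streaming.api.java.JavaInputDStream;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;
    import org.apache.spark.streaming.kafka010.ConsumerStrategies;
    import org.apache.spark.streaming.kafka010.KafkaUtils;
    import org.apache.spark.streaming.kafka010.LocationStrategies;

    public class EventStream {
        public static void main(String[] args) throws InterruptedException {
            SparkConf conf = new SparkConf().setAppName("EventStream").setMaster("local[2]");
            JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(5));

            Map<String, Object> kafkaParams = new HashMap<>();
            kafkaParams.put("bootstrap.servers", "localhost:9092"); // placeholder broker
            kafkaParams.put("key.deserializer", StringDeserializer.class);
            kafkaParams.put("value.deserializer", StringDeserializer.class);
            kafkaParams.put("group.id", "poc-consumer"); // placeholder group id
            kafkaParams.put("auto.offset.reset", "latest");

            JavaInputDStream<ConsumerRecord<String, String>> stream =
                KafkaUtils.createDirectStream(
                    jssc,
                    LocationStrategies.PreferConsistent(),
                    ConsumerStrategies.<String, String>Subscribe(
                        Collections.singletonList("events"), kafkaParams)); // topic is illustrative

            // Count the records arriving in each 5-second micro-batch.
            stream.map(ConsumerRecord::value)
                  .foreachRDD(rdd -> System.out.println("batch size: " + rdd.count()));

            jssc.start();
            jssc.awaitTermination();
        }
    }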
Preference will be given to candidates with experience in the following platforms and technologies:
- Microsoft Azure
- Azure HDInsight or Hortonworks
- Azure Data Factory or Talend
- Azure Event Hub or Apache Kafka
- Cassandra and HBase (a minimal HBase write/read sketch follows this list).
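For HBase in particular, the sketch below shows one write and one read using the standard HBase Java client. The table name ("events"), column family ("d"), and the device-id#timestamp row-key layout are hypothetical choices for illustration; connection settings are assumed to come from an hbase-site.xml on the classpath.

    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Get;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;

    public class HbaseRoundTrip {
        public static void main(String[] args) throws Exception {
            byte[] rowKey = Bytes.toBytes("device-1#2016-01-01T00:00:00"); // hypothetical row-key layout
            try (Connection conn = ConnectionFactory.createConnection(HBaseConfiguration.create());
                 Table table = conn.getTable(TableName.valueOf("events"))) { // table name is illustrative
                // Write one cell: column family "d", qualifier "temp".
                Put put = new Put(rowKey);
                put.addColumn(Bytes.toBytes("d"), Bytes.toBytes("temp"), Bytes.toBytes("21.5"));
                table.put(put);

                // Read the same cell back and print its value.
                Result row = table.get(new Get(rowKey));
                System.out.println(Bytes.toString(row.getValue(Bytes.toBytes("d"), Bytes.toBytes("temp"))));
            }
        }
    }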
Additional Information
All your information will be kept confidential according to EEO guidelines.