What are the responsibilities and job description for the Big Data with Financial Services position at Career Guidant?
Company Description
Career Guidant is an internationally acclaimed, trusted, multi-faceted organisation whose core competencies span Information Technology Custom Learning Services for Enterprises, Lateral Staffing Solutions, Information Technology Development & Consulting, Infrastructure & Facility Management Services, and Technical Content Development. Our experienced professionals bring a wealth of industry knowledge to each client and operate in a manner that produces superior quality and outstanding results.
Career Guidant's proven and tested methodologies ensure that client satisfaction remains the primary objective. We are committed to our core values of Client Satisfaction, Professionalism, Teamwork, Respect, and Integrity.
Career Guidant, with its large network of delivery centres, support offices, and partners across India, Asia Pacific, the Middle East, the Far East, Europe, and the USA, is committed to working closely with clients to ensure their operations continue to run smoothly.
Our Mission
"To build customer satisfaction, and to provide the complete Information Technology solution you need to stay ahead of your competition."
Job Description
• Background in all aspects of software engineering with strong skills in parallel data processing, data flows, REST APIs, JSON, XML, and microservice architecture.
• Must have strong programming knowledge of Core Java or Scala - Objects & Classes, Data Types, Arrays and String Operations, Operators, Control Flow Statements, Inheritance and Interfaces, Exception Handling, Serialization, Collections, Reading and Writing Files.
• Must have hands-on experience in the design, implementation, and build of applications or solutions using Core Java/Scala.
• Strong understanding of Hadoop fundamentals.
• Must have experience working on Big Data Processing Frameworks and Tools – MapReduce, YARN, Hive, Pig.
• Strong understanding of RDBMS concepts; must have good knowledge of writing SQL and of programmatically interacting with RDBMS and NoSQL databases such as HBase.
• Strong understanding of file formats – Parquet and other Hadoop file formats.
• Proficient with application build and continuous integration tools – Maven, SBT, Jenkins, SVN, Git.
• Experience working in Agile environments and with the Rally tool is a plus.
• Strong understanding of, and hands-on programming/scripting experience with, UNIX shell, Python, Perl, and JavaScript.
• Should have worked on large data sets, with experience in performance tuning and troubleshooting.
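The Core Java skills listed above (collections, string operations, exception handling, reading files) come together in even the simplest MapReduce-style task. As a hedged illustration only (not part of the posting), a minimal local word count in plain Java – the input file path is a hypothetical command-line argument:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

public class WordCount {
    // Single-process analogue of the MapReduce word-count pattern:
    // "map" each line to tokens, then "reduce" by summing counts per token.
    static Map<String, Integer> countWords(List<String> lines) {
        Map<String, Integer> counts = new TreeMap<>();
        for (String line : lines) {
            for (String token : line.toLowerCase().split("\\W+")) {
                if (!token.isEmpty()) {
                    counts.merge(token, 1, Integer::sum); // reduce step
                }
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        try {
            // File reading with exception handling, per the bullets above.
            List<String> lines = Files.readAllLines(Path.of(args[0]));
            countWords(lines).forEach((w, n) -> System.out.println(w + "\t" + n));
        } catch (IOException e) {
            System.err.println("Could not read input: " + e.getMessage());
        }
    }
}
```

In a real Hadoop or Spark job the same map and reduce steps would run distributed across partitions; the local sketch only shows the shape of the computation the role describes.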
Preferred
• Knowledge of Java Beans, Annotations, Logging (log4j), and Generics is a plus.
• Knowledge of Design Patterns - Java and/or GOF is a plus.
• Knowledge of Spark, Spark Streaming, Spark SQL, and Kafka is a plus.
• Experience in the financial domain is preferred.
• Experience in, and desire to work in, a global delivery environment.
Qualifications
• Bachelor’s degree or foreign equivalent required. One year of relevant work experience will also be considered in lieu of each year of education.
• At least 5 years of design and development experience in Big Data, Java, or data-warehousing-related technologies.
• At least 3 years of hands-on design and development experience with Big Data technologies – Pig, Hive, MapReduce, HDFS, HBase, YARN, Spark, Oozie, Java, and shell scripting.
• Should be a strong communicator, able to work independently with minimal involvement from client SMEs.
• Should be able to work in a team in a diverse, multi-stakeholder environment.
Additional Information
Note: No OPT or H-1B candidates for this position.
Client : Infosys