What are the responsibilities and job description for the GCP Big Data Engineer position at PeopleLogic?
Big Data Engineer with a MapReduce, Spark, Hive, and SQL skill set.
"Expert in SQL and data warehousing concepts.
Hands-on experience with a public cloud data warehouse (GCP, Azure, or AWS).
GCP certification is a strong plus.
MapR experience is a must.
Strong hands-on experience with one or more programming languages (Python or Java).
Hands-on expertise with application design and software development in Big Data (Spark (PySpark), Hive).
Experience with CI/CD pipelines, automated test frameworks, DevOps, and source code management tools (XLR, Jenkins, Git, Maven).
Strong communication and analytical skills, including effective presentation skills.
Familiarity with Agile and Scrum ceremonies."