What are the responsibilities and job description for the Data Engineer with Google Cloud Platform position at JS Consulting Solutions?
Job Details
Job Title: Jr. Data Engineer with Google Cloud Platform
Project Location: Atlanta/Alpharetta, GA
Duration: 6 months
Contract / CTH / Full-Time: Contract
Remote or Hybrid or Onsite: Remote
Visa: USC-EAD/-EAD
Jr. Data Engineer with Google Cloud Platform
Fully remote position; however, resources must reside in GA, ideally local to Atlanta/Alpharetta.
Two rounds of interviews: the first with two team members, the second with the Hiring Manager.
3 years of professional experience as a data engineer.
3 years working with Python and SQL.
Experience with state-of-the-art machine learning algorithms such as deep neural networks, support vector machines, boosting algorithms, random forests, etc., is preferred.
Experience conducting advanced feature engineering and dimensionality reduction in a big data environment is preferred.
Strong SQL skills in a big data environment (Hive, Impala, etc.) are a plus.
Things that would stand out on a resume:
1. Master's degree in computer science or data science
2. Previous employer in banking or e-commerce
Qualifications: 3 years of professional data engineering or data wrangling experience, including:
- Working with Hadoop-based or cloud-based big data management environments
- Bash scripting or similar experience for data movement and ETL
- Big data queries in Hive/Impala/Pig/BigQuery (proficiency with the BigQuery API libraries for data prep automation is a plus)
- Advanced Python programming (Scala is a plus) with strong coding experience; working proficiency with Data Studio, Bigtable, and GitHub (Cloud Composer and Dataflow are a plus)
- Basic Google Cloud Platform certification is a plus
- Knowledge of Kubernetes (or other Google Cloud Platform native container-orchestration tools for automating application deployment, scaling, and management) is a plus
- Basic knowledge of machine learning (ensemble models, unsupervised models), with experience using TensorFlow and PyTorch, is a plus
- Basic knowledge of graph mining and graph data models is a plus
What You'll Do
Build automated ML/AI modules, jobs, and data preparation pipelines by gathering data from multiple sources and systems; integrating, consolidating, and cleansing data; and structuring data and analytical procedures for use by our clients in our solutions.
Design, create, and interpret large, highly complex datasets.
Consult with internal and external clients to understand business requirements in order to successfully build datasets and implement complex big data solutions (under a senior lead's supervision).
Work with Technology and D&A teams to review, understand, and interpret business requirements, and to design and build missing functionality supporting identity and fraud analytics needs (under a senior lead's supervision).
Work on the end-to-end interpretation, design, creation, and build of large, highly complex analytics capabilities (under a senior lead's supervision).
Demonstrate strong oral and written communication skills and the ability to collaborate with cross-functional partners.