What are the responsibilities and job description for the Cloud Data Developer / Engineer position at Steneral Consulting?
Title: Cloud Data Developer / Engineer
Location: Woodcliff Lake, NJ 07677 (Onsite)
Duration: 12 months Contract
Pay rate: $70/hr on W2 (a few dollars of flexibility)
Visa: USC and GC
At-a-Glance
Are you ready to build your career by joining a global pharmaceutical company? If so, our client is hiring a Cloud Data Developer / Engineer!
What You’ll Do
- Work on a team building data lake(s), warehouse(s), pipelines, and machine learning models which will be utilized to drive drug development through predictive modeling of disease and drug response.
- Collaborate closely with biostatisticians on statistical methodology and machine learning to support projects at various stages of development across multiple business groups within the client organization. The Cloud Data Developer / Engineer will help deliver actionable insights while building mission-critical data science projects for the business.
Technical Skills Required:
- Experience with AWS and the services required to deliver big data and machine learning solutions.
- Should be familiar with big data platforms like AWS EMR, AWS Glue, and Databricks.
- Must be familiar with cataloging of data in big data scenarios such as data lakes utilizing tools like Hive or Glue Catalog.
- Must be familiar with creating data pipelines for big data ingestion.
- Experience with Python (PySpark), Scala, R, or SQL (Scala is nice to have but not required).
- Ability to work with imaging files such as DICOMs, including extracting metadata from those images and cataloging the data.
- Experience with data cleansing and processing SAS datasets.
- Experience with deep learning frameworks such as Keras, PyTorch, and TensorFlow.
- Advanced data analysis.
- Data storytelling.
- Visual analytics with tools like Tableau, Spotfire, or QuickSight.
- Must have deep expertise in ETL, including writing output in destination file formats like ORC and Parquet.
- SAS experience preferred, but not required.
- Proven track record of developing, deploying, and supporting data analytic tools.
- Experience developing front-end interface to statistical models with tools like R/Shiny or others.
- Experience managing and coordinating with IT teams to maintain secure and compliant tools and applications.
- Experience developing and deploying cloud-based tools or distributed computing environments using Spark.
- Excellent communication and presentation skills required.
- Experience in managing different workstreams and coordinating tasks with internal teams and outside consultants.
- Years of experience: 8-10 years minimum.
- Bachelor’s Degree required with Master’s Degree preferred or equivalent industry experience.
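To give a flavor of the "cataloging of data" work the requirements above describe, here is a minimal sketch in plain Python. In practice this role would register schemas in AWS Glue Catalog or Hive; the in-memory `catalog` dict and helper names below are hypothetical stand-ins used only to illustrate the idea of inferring and recording a table's schema during ingestion.

```python
# Minimal illustration of schema inference and cataloging during ingestion.
# A real pipeline would use AWS Glue Catalog or the Hive metastore; the
# `catalog` dict and function names here are hypothetical.
import csv
import io

def infer_schema(rows):
    """Infer a simple column -> type mapping from parsed CSV rows."""
    schema = {}
    for row in rows:
        for col, val in row.items():
            try:
                float(val)
                inferred = "double"
            except ValueError:
                inferred = "string"
            # Widen to string if values disagree across rows.
            if schema.get(col, inferred) != inferred:
                inferred = "string"
            schema[col] = inferred
    return schema

catalog = {}  # table name -> schema; stands in for a Glue/Hive catalog

def register_table(name, csv_text):
    """Parse a CSV payload, infer its schema, and record it in the catalog."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    catalog[name] = infer_schema(rows)
    return catalog[name]

sample = "patient_id,age\np001,54\np002,61\n"
register_table("trial_demographics", sample)
```

After this runs, `catalog["trial_demographics"]` maps `patient_id` to `string` and `age` to `double`, which is the kind of metadata a Glue crawler would record for downstream query engines.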