What are the responsibilities and job description for the Data Scientist position at Lawrence Ash?
Job Details
Location: Bay Area, CA or Reno, NV. No remote (on-site 5 days per week). Duration: 12 months
Start Date: Immediate. Prior experience at a trucking/logistics/freight company is expected: Uber Freight, Con-way, CH Robinson, Coyote Logistics, and Flexport are some examples.
Key Responsibilities:
Analyzing data patterns and trends through statistical methods to understand the underlying relationships within the data
Deep understanding of statistical modeling, machine learning algorithms, and analytics concepts, with a track record of solving problems
5 years of experience developing ML models
Prior experience in ML modeling using TensorFlow or PyTorch
Model Development: Develop and deploy machine learning algorithms & models using Python and relevant libraries/frameworks
Data Preprocessing: address missing data, outliers, and anomalies using Databricks' robust data processing tools
Feature engineering and development
Model training, validation, experimentation, and selection, including hyperparameter tuning
Presenting complex data insights through compelling visualizations like charts, graphs, and dashboards to effectively communicate findings to stakeholders
Interpreting data insights and translating them into actionable business recommendations
Minimum Requirements:
Education: Bachelor's degree in Computer Science, Statistics, Mathematics, or a related field.
Experience:
Minimum of 3 years of experience in data-science-centric roles with a strong focus on analytics, data science, and machine learning. Experience as a Data Scientist in the freight/logistics industry is a must.
Technical Skills:
Proficiency in Python, SQL, or PySpark with strong programming skills
Strong understanding of machine learning algorithms, model evaluation, and optimization techniques
Proven track record of developing and deploying models in production
Analytical Skills:
Robust analytical and problem-solving skills with a solid foundation in statistical methods. Expertise in translating business issues into data-driven solutions using Databricks.
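To make the preprocessing and hyperparameter-tuning duties above concrete, here is a minimal, dependency-free Python sketch. The function names and toy data are illustrative assumptions, not part of the posting; in practice these steps would run on Databricks with PySpark or scikit-learn:

```python
import statistics

def impute_and_clip(values):
    """Fill missing values (None) with the median, then clip outliers
    to the 1.5 * IQR fences -- a common preprocessing step.
    (Illustrative helper; not from the posting.)"""
    present = sorted(v for v in values if v is not None)
    median = statistics.median(present)
    q1 = present[len(present) // 4]           # rough lower quartile
    q3 = present[(3 * len(present)) // 4]     # rough upper quartile
    lo, hi = q1 - 1.5 * (q3 - q1), q3 + 1.5 * (q3 - q1)
    return [min(max(median if v is None else v, lo), hi) for v in values]

def pick_threshold(valid, thresholds):
    """Toy hyperparameter search: choose the decision threshold with
    the best accuracy on a held-out validation split of (x, label) pairs."""
    def accuracy(t):
        return sum((x >= t) == y for x, y in valid) / len(valid)
    return max(thresholds, key=accuracy)
```

For example, `impute_and_clip([1, 2, 3, 4, 5, 6, 7, 8, 100, None])` fills the missing value with the median (5) and clips the outlier 100 down to the upper IQR fence; `pick_threshold` then performs a simple grid search over candidate hyperparameters against validation accuracy.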