What are the responsibilities and job description for the MLOps Engineer position at LatentView Analytics?
LatentView Analytics is a leading global analytics and decision sciences provider, delivering solutions that help companies drive digital transformation and use data to gain a competitive advantage. With analytics solutions that provide a 360-degree view of the digital consumer, fuel machine learning capabilities, and support artificial intelligence initiatives, LatentView Analytics enables leading global brands to predict new revenue streams, anticipate product trends and popularity, improve customer retention rates, optimize investment decisions, and turn unstructured data into valuable business assets.
We are looking for a Databricks & MLOps Engineer with expertise in machine learning operations (MLOps), model lifecycle management, and cloud-based data platforms. The ideal candidate will have hands-on experience with Databricks, MLflow, CI/CD, and orchestration tools, and should be comfortable working in any major cloud environment (Azure, AWS, or GCP).
Key Responsibilities:
Databricks ML Platform Development:
Design and implement scalable ML pipelines in Databricks using MLflow, Delta Lake, and Feature Store.
Optimize ML model training, versioning, and deployment using Databricks Jobs and Workflows.
Build reusable notebooks and libraries for model training, testing, and inference.
MLOps & Model Deployment:
Implement CI/CD pipelines for ML models using Databricks Repos, GitHub Actions, Jenkins, or Azure DevOps.
Automate model deployment using MLflow Model Registry, REST APIs, or Databricks Model Serving.
Monitor model drift, performance, and retraining needs.
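As a concrete illustration of the drift-monitoring responsibility above, the following is a minimal, self-contained sketch of one common check: the Population Stability Index (PSI) between training-time and live score distributions. The bucket count, smoothing constant, and the 0.2 alert threshold are illustrative assumptions, not a description of LatentView's actual tooling.

```python
import math

def psi(expected, actual, buckets=10):
    """Population Stability Index between two samples of scores in [0, 1)."""
    def proportions(values):
        counts = [0] * buckets
        for v in values:
            idx = min(int(v * buckets), buckets - 1)
            counts[idx] += 1
        total = len(values)
        # Smooth empty buckets to avoid division by zero and log(0).
        return [max(c / total, 1e-6) for c in counts]

    e, a = proportions(expected), proportions(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

# PSI above roughly 0.2 is a common rule of thumb for significant drift.
train_scores = [i / 100 for i in range(100)]                   # uniform baseline
live_scores = [min(0.5 + i / 200, 0.99) for i in range(100)]   # shifted upward
print(psi(train_scores, live_scores) > 0.2)  # → True (drift detected)
```

In practice a check like this would run on a schedule (e.g., as a Databricks Job) and trigger an alert or a retraining workflow when the threshold is crossed.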
Cloud & Infrastructure Management:
Deploy ML solutions on Azure (Databricks, AKS), AWS (SageMaker, EMR), or GCP (Vertex AI, GKE).
Set up containerized ML workloads using Docker and Kubernetes.
Manage security, IAM roles, and access policies across environments.
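Containerizing an ML workload as described above might start from a minimal image along these lines; the base image, file names, and entrypoint are hypothetical placeholders for the sketch:

```dockerfile
# Illustrative image for a batch scoring job (names are assumptions).
FROM python:3.11-slim

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY score.py .
# Run as a non-root user, as cluster security policies typically require.
RUN useradd --create-home scorer
USER scorer

ENTRYPOINT ["python", "score.py"]
```

An image like this can then be deployed to AKS, EMR on EKS, or GKE, with IAM roles and access policies managed per environment.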
Orchestration & Data Pipelines:
Migrate ML workflows from Airflow, Cloud Composer, or Step Functions to Databricks Jobs.
Integrate with data engineering pipelines built on Delta Lake & Spark.
Monitoring & Observability:
Track data and model lineage using Unity Catalog & MLflow.
Automate alerts for failures, performance degradation, and cost monitoring.
Skills Required:
- Strong communication and leadership skills, with experience initiating and driving projects.
- Experience with Databricks, MLflow, and Vertex AI or SageMaker.
- Experience in SQL or similar languages.
- Development experience in at least one object-oriented language (e.g., Python).
- BA/BS in Computer Science, Math, Physics, or another technical field.
At LatentView Analytics, we value a diverse, inclusive workforce and we provide equal employment opportunities for all applicants and employees. All qualified applicants for employment will be considered without regard to an individual’s race, color, sex, gender identity, gender expression, religion, age, national origin or ancestry, citizenship, physical or mental disability, medical condition, family care status, marital status, domestic partner status, sexual orientation, genetic information, military or veteran status, or any other basis protected by federal, state or local laws.