What are the responsibilities and job description for the DevOps Engineer-Hybrid position at Aries Solutions Intl Inc?
DevOps Engineer-Hybrid
-Azure Cloud, AKS: scalability, monitoring, deployment, checking logs, and ensuring node and pod health.
-Databases include Cassandra, MongoDB, PostgreSQL, and other NoSQL stores.
-Databricks Notebooks: experience creating and running notebooks, running queries against the database, finding discrepancies, and performing fixes.
-Experience with Kafka, Event Hubs, NATS, or another messaging broker.
-Java-based microservices: responsible for deployment; the scripting language is Python. Should have an understanding of Terraform.
-Emphasis on logs and monitoring (Datadog and Splunk).
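The Databricks bullet above mentions running queries and finding discrepancies. As a rough illustration (a hypothetical sketch, not part of the posting), that kind of check in Python, the posting's stated scripting language, might compare two query result sets by key:

```python
# Hypothetical sketch: compare two query result sets (e.g., source database
# vs. a Databricks table) keyed by record id and report discrepancies.
# Function and field names are illustrative, not from the posting.

def find_discrepancies(source_rows, target_rows, key="id"):
    """Return keys missing from the target and keys whose records differ."""
    source = {row[key]: row for row in source_rows}
    target = {row[key]: row for row in target_rows}
    missing = [k for k in source if k not in target]
    mismatched = [k for k in source if k in target and source[k] != target[k]]
    return {"missing": missing, "mismatched": mismatched}
```

In practice the "perform fixes" step would follow from the report, e.g., re-syncing or patching the mismatched records.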
We are seeking an experienced, self-motivated Senior Engineer with a strong Linux background, deep knowledge of microservices, backend storage design, NoSQL databases, and distributed systems, and excellent troubleshooting skills.
Typical activities include production monitoring, creating monitoring dashboards, setting up alerts, and triaging alerts, coupled with the ability to drive efforts and solution improvements effectively across IT and business functions.
In this role, you will be responsible for setting up monitoring dashboards and alerts, maintaining production systems, deploying code to production, monitoring and resolving alerts, and leading production troubleshooting calls.
You will work with Product Owners and other developers to implement highly scalable, reactive application platform solutions in cloud-based Linux environments.
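Since alert setup and triage figure heavily in the responsibilities above, here is a minimal illustrative sketch of threshold-based alert evaluation in Python. Real monitors for this role would live in Datadog or Splunk; the names and threshold logic below are assumptions for illustration only.

```python
# Hypothetical sketch of a threshold-based alert check, similar in spirit to
# a metric monitor: fire when the mean of the most recent samples exceeds a
# threshold. Names and parameters are illustrative, not from any real API.

def evaluate_alert(samples, threshold, window=5):
    """Return 'ALERT' if the mean of the last `window` samples exceeds
    `threshold`, otherwise 'OK'."""
    recent = samples[-window:]
    if not recent:
        return "OK"  # no data; a production monitor might treat this as its own state
    mean = sum(recent) / len(recent)
    return "ALERT" if mean > threshold else "OK"
```

Averaging over a window rather than alerting on a single sample is a common way to reduce noisy, flapping alerts.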
Skills Required
Requires 10 years of experience in the IT industry.
Requires 10 years of software and DevOps engineering experience.
Experience working in a cloud environment; Azure preferred.
Experience with Kubernetes; Azure Kubernetes Service (AKS) preferred.
Experience with Kafka, Event Hubs, NATS, or another messaging broker.
Experience with Cassandra, PostgreSQL, MongoDB, Elasticsearch, and Cosmos DB.
Experience with Azure DevOps, Jenkins, Python, Terraform, and Ansible.
Experience with Databricks
Experience with Datadog, Splunk, or other logging and APM tools.
Experience working in Linux environments.
Experience building complex, scalable, high-performance software systems that have been successfully delivered to customers.
Demonstrated knowledge of best practices for the design and implementation of large-scale systems, as well as experience taking such systems from design to production.
Experience building and operating mission-critical, highly available (24x7) systems.