What are the responsibilities and job description for the Big Data Expert for Databricks Solutions position at SWISH?
About Us
We're a company that values innovation and technical excellence.
Our core solutions focus on IT modernization, performance engineering, and cybersecurity.
We believe in empowering our engineers to drive business outcomes.
Job Description
We are seeking a Big Data Expert for Databricks Solutions to support the implementation and adoption of data products with Databricks.
The candidate should have strong foundational experience in Databricks architecture and deployment patterns.
In addition, the candidate should have experience recommending and implementing best practices for efficient solution development on top of the Databricks platform in both data science and data engineering domains.
The work is performed remotely during Eastern Standard Time core hours, with occasional on-site support required in Alexandria, VA.
Responsibilities
- Drive Databricks platform architecture development and fine-tune the Databricks architecture to ensure high performance and scalability.
- Implement security measures and best practices to protect data processed and stored using Databricks.
- Support the team through the design, development, and deployment of Databricks solutions.
Requirements
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent years of experience.
- 3–5 years of experience as a Databricks Architect/Engineer, with a strong understanding of Databricks architecture and deployment patterns.
- Deep knowledge of data science and data engineering concepts and techniques.
- Proficiency in programming languages such as Python, Scala, and SQL.
- Experience with big data processing frameworks such as Apache Spark.
- Ability to work independently and in an integrated, team-oriented environment (the role involves working with several integration teams, such as ODA, Cloud Team, and Power Apps).
- Cloud (AWS/Azure) certification(s) related to data engineering, data warehousing, or data lakes is a plus.