What are the responsibilities and job description for the Senior Data Engineer / Databricks / Snowflake position at Motion Recruitment?
We are currently working with a company that is a leader in data-driven strategies for the world's leading Pharmaceutical, Biotech, and Medical Technology companies, in an industry with a reputation for lagging on technology. They provide integrated solutions that combine deep domain knowledge and advanced analytics with cutting-edge technologies to address the industry's critical challenges.
They are looking for a Data Engineer with strong software development skills in Python to join their team. The core tools for this role are Python, Databricks for data processing, and Snowflake for data warehousing.
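As a rough illustration of how those tools fit together, here is a minimal sketch of a Databricks (PySpark) job that aggregates raw data and writes the result to Snowflake; the table names and all sf* connection options are placeholders, not details from this posting:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("databricks-to-snowflake-sketch").getOrCreate()

# Read raw records from a table registered in Databricks (placeholder name).
raw = spark.read.table("raw.pipeline_events")

# Example transformation: count completed events per source system.
summary = (
    raw.filter(F.col("status") == "COMPLETED")
       .groupBy("source_system")
       .agg(F.count("*").alias("completed_events"))
)

# Write the aggregate to Snowflake using the Spark connector bundled with
# Databricks. Every option below is a placeholder; real credentials would
# come from a secret scope rather than being hard-coded.
sf_options = {
    "sfURL": "myaccount.snowflakecomputing.com",
    "sfUser": "ETL_USER",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "ETL_WH",
}
(summary.write
    .format("snowflake")
    .options(**sf_options)
    .option("dbtable", "EVENT_SUMMARY")
    .mode("overwrite")
    .save())
```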
This is a fully remote, full-time, direct-hire role, working primarily Eastern Time Zone hours.
Required Skills & Experience
- 7 years of experience in data engineering or a related field with a focus on software development.
- Proficiency in Python with strong experience in building scalable applications and data workflows.
- Hands-on experience with Databricks for large-scale data processing and Snowflake for cloud-based data warehousing.
- Solid understanding of SQL and database optimization techniques.
- Experience with cloud platforms such as AWS, Azure, or GCP, including data services.
- Knowledge of modern software development practices, including version control, CI/CD, and automated testing (see the test sketch after this list).
- Strong problem-solving skills and a passion for building efficient, maintainable solutions.
- Bachelor’s degree in Computer Science or a related field, or equivalent professional experience.
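As a small illustration of the automated-testing expectation above, a minimal pytest sketch might look like the following; normalize_status is a hypothetical helper, not a function named in this posting:

```python
# test_transformations.py
import pytest


def normalize_status(status: str) -> str:
    """Normalize a free-form status string to a canonical form (hypothetical helper)."""
    if not status or not status.strip():
        raise ValueError("status must be non-empty")
    return status.strip().upper().replace(" ", "_")


def test_normalize_status_strips_and_uppercases():
    assert normalize_status("  in progress ") == "IN_PROGRESS"


def test_normalize_status_rejects_blank_input():
    with pytest.raises(ValueError):
        normalize_status("   ")
```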
Desired Skills & Experience
- Familiarity with streaming technologies (e.g., Apache Kafka, Spark Streaming).
- Experience with infrastructure-as-code tools like Terraform or CloudFormation.
- Exposure to big data ecosystems and tools such as Hadoop, Spark, or Hive.
- Background in data visualization and reporting tools is a plus.
The Offer
- Competitive salary and comprehensive benefits package.
- Opportunities for career growth and ongoing learning.
- A chance to work on cutting-edge projects at a leading technology company.
- Health, Dental, and Vision insurance.
- Generous Paid Time Off (PTO).
- Flexible working arrangements, including remote work options.