What are the responsibilities and job description for the Data Engineer - Cloud Infrastructure position at Prequel Solutions?
Job Overview:
Prequel Solutions is a leading provider of innovative technology solutions. We are currently seeking a highly skilled Data Engineer to join our team and contribute to the design, development, and maintenance of our cloud data environment.
The ideal candidate will have a strong background in data engineering, cloud computing, and software development. They will work closely with our data science team to develop and maintain large-scale data processing systems, optimize database performance, and ensure data security and compliance.
This is an excellent opportunity for a talented engineer to join our team and take their career to the next level. If you are passionate about working with big data and cloud technologies, we encourage you to apply for this position.
Key Responsibilities:
- Design and implement data pipelines using Databricks on cloud platforms.
- Develop and manage data models to support analytics, reporting, and operational use cases.
- Collaborate with data scientists and business stakeholders to deliver high-impact solutions.
- Automate data ingestion, transformation, and loading processes (ETL/ELT) using modern orchestration tools.
- Monitor pipeline performance and troubleshoot data-related issues proactively.