What are the responsibilities and job description for the Big Data Architect and Engineer position at Prequel Solutions?
About the Role:
We are seeking a highly skilled Big Data Architect and Engineer to join our team at Prequel Solutions. The successful candidate will design, build, and maintain our cloud data environment, including data pipelines, data models, and data governance policies.
The ideal candidate will have a strong background in data engineering, cloud computing, and software development. They will work closely with our data science team to develop and maintain large-scale data processing systems, optimize database performance, and ensure data security and compliance.
This is an excellent opportunity for a talented engineer to join our team and take their career to the next level. If you are passionate about working with big data and cloud technologies, we encourage you to apply for this position.
Responsibilities:
- Design and implement data pipelines using Databricks on cloud platforms (see the pipeline sketch after this list).
- Develop and manage data models to support analytics, reporting, and operational use cases.
- Collaborate with data scientists and business stakeholders to deliver high-impact solutions.
- Automate data ingestion, transformation, and loading processes (ETL/ELT) using modern orchestration tools (see the orchestration sketch after this list).
- Monitor pipeline performance and troubleshoot data-related issues proactively.
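To give a concrete, purely illustrative sense of the first and fourth responsibilities, here is a minimal PySpark sketch of a Databricks-style pipeline: ingest raw files from cloud storage, clean them, and append to a Delta table. This is not Prequel Solutions' actual pipeline; the storage path and table name are hypothetical placeholders.

```python
# A minimal sketch, assuming a Databricks workspace (or delta-spark
# installed locally). Paths and table names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided as `spark` on Databricks

# Ingest: read raw JSON events landed in cloud object storage.
raw = spark.read.format("json").load("/mnt/raw/events/")  # hypothetical path

# Transform: normalize types and drop malformed rows.
cleaned = (
    raw.withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropna(subset=["event_id", "event_ts"])
)

# Load: append into a governed Delta table for analytics and reporting.
(cleaned.write
        .format("delta")
        .mode("append")
        .saveAsTable("analytics.events"))  # hypothetical table name
```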
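For the orchestration responsibility, here is an equally hedged sketch using Apache Airflow, one common choice of orchestration tool (the posting does not name a specific one). It assumes Airflow 2.4+ for the `schedule` parameter, and the DAG id and task callables are hypothetical stand-ins for the steps above.

```python
# A minimal orchestration sketch, assuming Apache Airflow 2.4+.
# The dag_id, schedule, and callables are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest():    ...  # e.g., read raw files into a staging area
def transform(): ...  # e.g., run the cleaning logic
def load():      ...  # e.g., append results to the Delta table


with DAG(
    dag_id="events_elt",             # hypothetical
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="ingest", python_callable=ingest)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)

    # Chain tasks so a failed step blocks everything downstream.
    t1 >> t2 >> t3
```

Encoding the ingest-transform-load dependency with `>>` means a failure in any step halts the run before bad data reaches the analytics table, which is also what makes the proactive monitoring in the last responsibility tractable.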