What are the responsibilities and job description for the Enterprise Solution Architect position at Celebal Technologies America, INC.?
Job Details
About the Role:
As a Solutions Architect (SA) on the Celebal Technologies team, you will engage with clients to address their big data challenges using the Databricks platform. You will be responsible for designing and delivering data engineering, data science, and cloud technology projects that integrate client systems, ensuring customers maximize the value of their data. Your role will also include providing training and handling technical tasks related to project completion. This is a billable position, and you will be expected to deliver excellent customer service while working closely with the regional Manager/Lead to meet project specifications.
Key Responsibilities:
- Lead a variety of impactful technical projects for customers, including designing reference architectures, developing technical "how-to" guides, and productionizing customer use cases.
- Work with engagement managers to scope and define the professional services work based on customer input and business needs.
- Guide strategic customers through transformational big data and AI projects, as well as migrations from third-party platforms, delivering full end-to-end design, build, and deployment.
- Provide expert consulting on architecture and design; help bootstrap or implement projects, driving the customer's understanding, evaluation, and adoption of Databricks.
- Deliver escalated support for operational issues faced by customers.
- Collaborate with Databricks technical, project management, architecture, and customer teams to ensure the technical components of projects meet client requirements.
- Work closely with Engineering and Databricks Customer Support to provide product feedback and resolve any product or support issues that arise during engagements.
Qualifications:
- 10 years of experience in data engineering, data platforms, and analytics.
- Proficiency in programming languages such as Python or Scala.
- Strong working knowledge of and deep expertise in cloud ecosystems (AWS, Azure, Google Cloud Platform).
- Extensive experience with distributed computing using Apache Spark and a solid understanding of Spark runtime internals.
- Familiarity with CI/CD practices for production deployments.
- Working knowledge of MLOps methodologies.
- Proven experience in designing and deploying performant, end-to-end data architectures.
- Experience managing technical project delivery, including scope and timelines.
- Excellent documentation, whiteboarding, and communication skills.
- Proven ability to handle client engagements and resolve conflicts.
- Technical skills to support the deployment and integration of Databricks-based solutions.
- Bachelor's degree in Computer Science, Information Systems, Engineering, or equivalent work experience.