What are the responsibilities and job description for the Snowflake Cloud Architect position at Daman Consulting?
Job Details
We are seeking a Snowflake Cloud Architect with a minimum of 5 years of experience in Snowflake architecture, database design, data sharing, cost optimization, and API integration. The position calls for an expert in Snowflake's capabilities and best practices, with a deep understanding of performance optimization, data modeling, and cloud data architecture principles. The successful candidate will work independently under limited supervision, applying creativity and initiative to solve complex problems, and will bring strong problem-solving skills, a results-driven mindset, and a solid grasp of data governance and security practices in a cloud environment. The ability to collaborate with cross-functional teams and provide technical leadership is essential. This position reports to the Data Management Officer.
- Assess the current infrastructure and database designs in Snowflake and recommend an optimized approach for the long-term sustainability of the environment.
- Develop, optimize, and oversee the company's logical, conceptual, and physical data models, and provide recommendations.
- Lead user-requirements elicitation for end-to-end data integration processes using ETL for structured, semi-structured, and unstructured data.
- Build robust data pipelines to ingest data into Snowflake, especially large datasets such as geometric files, GIS datasets, and HEC-RAS models (see the ingestion sketch after this list).
- Develop near-real-time data loads from various sources into Snowflake databases.
- Use Python and Python libraries to help the business develop machine learning and other scientific models with Snowflake, Streamlit, and Snowpark (see the Snowpark sketch after this list).
- Develop cost-optimization techniques to keep costs under control, and produce future estimates based on projected workloads and storage needs.
- Contribute to a comprehensive cloud data strategy for the agency, including a hybrid cloud infrastructure using Snowflake, AWS S3, AWS RDS, AWS Kinesis, and related services.
- Develop data-sharing functionality using Snowflake APIs or other API techniques or tools to move data securely to and from external entities and the public (see the data-sharing sketch after this list).
- Build artifacts to manage the data science model life cycle, including developing, testing, training, and deploying models efficiently, and provide extended ad hoc support.
- Develop training modules and provide training support for the staff as needed.
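To illustrate the pipeline work described above, here is a minimal sketch of bulk ingestion from an external stage using the snowflake-connector-python package. The credentials, bucket URL, and object names (GIS_STAGE, HEC_RAS_RESULTS) are hypothetical placeholders, not part of the posting.

```python
# Minimal bulk-ingestion sketch: stage an S3 bucket, then COPY new files
# into a landing table. All names and credentials below are assumptions.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # assumption: placeholder credentials
    user="etl_user",
    password="...",
    warehouse="LOAD_WH",
    database="RAW_DB",
    schema="LANDING",
)

try:
    cur = conn.cursor()
    # Stage pointing at the bucket that receives GIS / HEC-RAS exports
    # (a private bucket would also need a storage integration or credentials).
    cur.execute("""
        CREATE STAGE IF NOT EXISTS GIS_STAGE
        URL = 's3://example-bucket/gis/'   -- assumption: bucket path
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)
    # COPY INTO loads only files it has not already loaded from the stage.
    cur.execute("COPY INTO HEC_RAS_RESULTS FROM @GIS_STAGE")
finally:
    conn.close()
```

For the near-real-time loads mentioned above, the same stage pattern could back a Snowpipe with AUTO_INGEST enabled, so new files load as they arrive rather than on a batch schedule.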
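As an example of the Python/Snowpark work described above, the following minimal sketch aggregates a hypothetical SENSOR_READINGS table server-side; the connection parameters and object names are placeholders.

```python
# Minimal Snowpark sketch: the aggregation executes inside Snowflake,
# not on the client. Assumes the snowflake-snowpark-python package.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import avg, col

session = Session.builder.configs({
    "account": "my_account",   # assumption: placeholder credentials
    "user": "ds_user",
    "password": "...",
    "warehouse": "ANALYTICS_WH",
    "database": "RAW_DB",
    "schema": "LANDING",
}).create()

# Average water level per station, computed in-warehouse.
daily_avg = (
    session.table("SENSOR_READINGS")        # assumption: hypothetical table
    .group_by(col("STATION_ID"))
    .agg(avg(col("WATER_LEVEL")).alias("AVG_LEVEL"))
)
daily_avg.show()

session.close()
```

A DataFrame like this can also be rendered directly in a Streamlit app, which is the pairing the posting has in mind for model-facing tools.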
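As a sketch of the secure data-sharing responsibility, the following uses standard Snowflake share DDL through the Python connector; the share, database, table, and consumer account names (AGENCY_SHARE, PUBLIC_DATA, PARTNER_ACCT) are hypothetical.

```python
# Minimal secure-data-sharing sketch: a share exposes read-only objects to
# another Snowflake account without copying data. Names are assumptions.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="admin_user",  # assumption: placeholders
    password="...", role="ACCOUNTADMIN",
)
cur = conn.cursor()

cur.execute("CREATE SHARE IF NOT EXISTS AGENCY_SHARE")
cur.execute("GRANT USAGE ON DATABASE PUBLIC_DATA TO SHARE AGENCY_SHARE")
cur.execute("GRANT USAGE ON SCHEMA PUBLIC_DATA.OPEN TO SHARE AGENCY_SHARE")
cur.execute("GRANT SELECT ON TABLE PUBLIC_DATA.OPEN.FLOOD_ZONES TO SHARE AGENCY_SHARE")
# Add the consumer account; it mounts the share as a read-only database.
cur.execute("ALTER SHARE AGENCY_SHARE ADD ACCOUNTS = PARTNER_ACCT")

conn.close()
```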