What are the responsibilities and job description for the Snowflake Cloud Architect position at Neos Consulting?
Job Details
Neos is a leading Austin-based IT Staffing and Consulting firm.
No calls, no emails, please respond directly to the apply link with your resume and contact details.
Neos is seeking a Snowflake Cloud Architect for a long-term contract with our client in Austin, TX.
***** ONLY CANDIDATES LOCAL TO AUSTIN, TEXAS NEED APPLY *****
- DESCRIPTION OF SERVICES
The Neos Consulting team is seeking a Snowflake Cloud Architect with a minimum of 5 years of experience in Snowflake architecture, database design, data sharing, cost optimization, and API integration. This position requires an expert who is proficient in Snowflake's capabilities and best practices, with a deep understanding of performance optimization, data modeling, and cloud data architecture principles. The role calls for a highly skilled individual who can work independently under limited supervision while applying creativity and initiative to solve complex problems, along with strong problem-solving skills, a results-driven mindset, and a solid grasp of data governance and security practices in a cloud environment. The ability to collaborate with cross-functional teams and provide technical leadership is essential. This position will report to the Data Management Officer.
- Assess the current Snowflake infrastructure and database designs, and propose an optimized approach for the long-term sustainability of the environment.
- Develop, optimize, and oversee the company's conceptual, logical, and physical data models, and provide recommendations.
- Lead user-requirements elicitation for the end-to-end data integration process using ETL for structured, semi-structured, and unstructured data.
- Build robust data pipelines to ingest data into Snowflake, especially large datasets such as geometric files, GIS datasets, and HEC-RAS models.
- Develop near-real-time data loads from various sources into Snowflake databases.
- Use Python and its libraries to help the business develop machine learning and other scientific models with Snowflake, Streamlit, and Snowpark.
- Develop cost-optimization techniques to keep costs under control, and produce future cost estimates based on projected workloads and storage needs.
- Contribute to a comprehensive cloud data strategy for the agency's hybrid cloud infrastructure using Snowflake, AWS S3, AWS RDS, AWS Kinesis, etc.
- Develop data-sharing functionality using Snowflake APIs or other API techniques and tools to securely exchange data with external entities and the public.
- Build artifacts to efficiently manage the data science model lifecycle, including developing, testing, training, and deploying models, and provide extended ad hoc support.
- Develop training modules and provide training support for the staff as needed.
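To make the pipeline responsibilities above concrete, here is a minimal sketch of the kind of auto-ingest setup the role describes: a Snowpipe that continuously loads files landing in an AWS S3 stage into a Snowflake table. All database, stage, and pipe names are illustrative assumptions, not from the posting; the snippet only builds the DDL string so it runs without a Snowflake connection.

```python
# Hypothetical sketch: DDL for a Snowpipe auto-ingest pipeline from an
# S3 external stage into Snowflake. Object names are illustrative only.

def snowpipe_ddl(pipe: str, table: str, stage: str,
                 file_format: str = "PARQUET") -> str:
    """Build a CREATE PIPE statement that auto-ingests staged files."""
    return (
        f"CREATE OR REPLACE PIPE {pipe} AUTO_INGEST = TRUE AS\n"
        f"  COPY INTO {table}\n"
        f"  FROM @{stage}\n"
        f"  FILE_FORMAT = (TYPE = {file_format});"
    )

# Example: a pipe loading staged GIS extracts into a raw table.
ddl = snowpipe_ddl("RAW_DB.PUBLIC.GIS_PIPE",
                   "RAW_DB.PUBLIC.GIS_DATA",
                   "RAW_DB.PUBLIC.GIS_STAGE")
print(ddl)
```

With `AUTO_INGEST = TRUE`, Snowflake listens for S3 event notifications (via SQS) and loads new files as they arrive, which is how near-real-time loads are typically achieved without polling.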
- CANDIDATE SKILLS AND QUALIFICATIONS
Minimum Requirements:
Candidates who do not meet or exceed the minimum stated requirements (skills/experience) will be displayed to customers but may not be chosen for this opportunity.
| Years | Required/Preferred | Experience |
|-------|--------------------|------------|
| 5 | Required | Years of experience in Snowflake architecture, database design, data sharing, cost optimization, and API integration |
| 5 | Required | Years of experience with Snowflake concepts such as setting up resource monitors, RBAC controls, scalable virtual warehouses, SQL performance tuning, zero-copy cloning, and Time Travel, and automating them |
| 5 | Required | Years of experience handling semi-structured data (JSON, XML) and columnar Parquet using the VARIANT attribute in Snowflake |
| 5 | Required | Years of experience re-clustering data in Snowflake, with a good understanding of micro-partitions |
| 5 | Required | Years of experience with migration processes to Snowflake from on-premises database environments |
| 5 | Required | Years of experience designing and building auto-ingestion data pipelines with near-real-time data |
| 5 | Required | Years of experience working with cloud technologies such as AWS S3, SQS, EC2, Lambda, Redshift, and RDS |
| 5 | Required | Years of experience designing and developing automated monitoring processes on Snowflake using a combination of Python, PySpark, and Bash with SnowSQL |
| 5 | Required | Years of experience with SnowSQL, developing stored procedures and writing queries to analyze and transform data |
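The VARIANT requirement above refers to querying semi-structured data directly in Snowflake. A common pattern is unnesting a JSON array stored in a VARIANT column with LATERAL FLATTEN. The sketch below builds such a query as a string so it runs standalone; the table and column names are illustrative assumptions.

```python
# Hypothetical sketch: a SELECT that unnests a JSON array held in a
# Snowflake VARIANT column using LATERAL FLATTEN. Names are illustrative.

def flatten_variant_sql(table: str, variant_col: str, array_path: str) -> str:
    """Build a query that returns one row per element of a nested array."""
    return (
        f"SELECT f.value::STRING AS item\n"
        f"FROM {table},\n"
        f"     LATERAL FLATTEN(input => {variant_col}:{array_path}) f;"
    )

# Example: expand the 'records' array inside a raw JSON payload column.
print(flatten_variant_sql("RAW_DB.PUBLIC.EVENTS", "payload", "records"))
```

The `col:path` notation traverses the JSON structure, and the `::STRING` cast converts each VARIANT element to a typed column, which is the usual first step when transforming ingested JSON or XML into relational form.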