What are the responsibilities and job description for the Data Platform Architect (Snowflake) || Houston, TX || Contract position at Radiansys, Inc.?
Job Details
Position Title: Data Platform Architect (Snowflake)
Type of Hire: Contract (12 months)
Location: Houston, TX
The architect in this context is responsible for ensuring that Snowflake is utilized effectively to meet the needs of the business, leveraging its features for data warehousing, analytics, and data sharing.
Key Responsibilities:
1. Architecture Design:
- Design the overall data architecture and blueprint for Snowflake implementation.
- Define the structure of data, how it's ingested, stored, processed, and analyzed in Snowflake.
- Choose appropriate data models, such as star schema or snowflake schema, based on the organization's needs.
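As a concrete illustration of the star-schema choice above, the following is a minimal sketch expressed as Python strings holding Snowflake DDL. The table and column names (`fact_sales`, `dim_customer`, etc.) are hypothetical examples, not part of the role description:

```python
# A minimal star schema: one fact table referencing one dimension table.
# All names here are invented for illustration.
STAR_SCHEMA_DDL = {
    "dim_customer": (
        "CREATE TABLE IF NOT EXISTS dim_customer ("
        " customer_key INTEGER,"
        " customer_name STRING,"
        " region STRING )"
    ),
    "fact_sales": (
        "CREATE TABLE IF NOT EXISTS fact_sales ("
        " sale_id INTEGER,"
        " customer_key INTEGER,"   # joins to dim_customer.customer_key
        " date_key INTEGER,"       # would join to a dim_date table
        " amount NUMBER(12, 2) )"
    ),
}

for name, ddl in STAR_SCHEMA_DDL.items():
    print(name, "->", ddl[:50], "...")
```

In a star schema the fact table holds measures and foreign keys only, while descriptive attributes live in the dimension tables; a snowflake schema would further normalize those dimensions.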
2. Data Integration:
- Integrate various data sources into Snowflake, including structured, semi-structured, and unstructured data.
- Work with ETL (Extract, Transform, Load) or ELT (Extract, Load, Transform) processes, ensuring efficient data pipelines into Snowflake.
- Use Snowflake's native capabilities, like Snowpipe, to automate data loading processes.
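For illustration, the core of a Snowpipe setup is a `CREATE PIPE ... AS COPY INTO ...` statement. The sketch below composes one as a Python string; the pipe, table, and stage names are hypothetical:

```python
def snowpipe_ddl(pipe: str, table: str, stage: str, file_type: str = "JSON") -> str:
    """Compose the CREATE PIPE statement that drives Snowpipe auto-ingest.

    AUTO_INGEST = TRUE lets cloud storage event notifications trigger the
    load instead of a manual COPY. All object names are caller-supplied.
    """
    return (
        f"CREATE PIPE IF NOT EXISTS {pipe} AUTO_INGEST = TRUE AS "
        f"COPY INTO {table} FROM @{stage} "
        f"FILE_FORMAT = (TYPE = '{file_type}')"
    )

# Example with invented names:
print(snowpipe_ddl("raw_events_pipe", "raw_events", "s3_events_stage"))
```

The same pattern extends to CSV or Parquet sources by changing the file format clause.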
3. Performance Optimization:
- Monitor the performance of queries, storage, and compute resources.
- Use best practices to optimize query performance, including clustering keys, micro-partition pruning, and the search optimization service.
- Optimize the use of Snowflake's multi-cluster architecture for scaling.
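The tuning work above typically comes down to statements like the two sketched below (a clustering key on a large table, and multi-cluster scaling bounds on a warehouse). Table and warehouse names are hypothetical:

```python
# Define a clustering key so Snowflake co-locates rows that are commonly
# filtered together, improving micro-partition pruning.
cluster_stmt = "ALTER TABLE fact_sales CLUSTER BY (sale_date, region)"

# Let a warehouse scale out to extra clusters under concurrent load and
# scale back in when demand drops.
warehouse_stmt = (
    "ALTER WAREHOUSE analytics_wh SET "
    "MIN_CLUSTER_COUNT = 1 MAX_CLUSTER_COUNT = 4 "
    "SCALING_POLICY = 'STANDARD'"
)

print(cluster_stmt)
print(warehouse_stmt)
```

The trade-off: clustering keys add background maintenance cost, so they pay off mainly on large tables with stable filter patterns.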
4. Security & Governance:
- Define security policies and ensure proper access control within Snowflake using roles, permissions, and encryption.
- Implement data governance practices to ensure data integrity, compliance, and lineage tracking.
- Design data sharing strategies and set up secure data sharing with partners or other organizations.
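As a sketch of the role-based access control work above, the helper below builds the minimal grant chain for a read-only analyst role (every name is an invented example):

```python
def role_grants(role: str, database: str, schema: str, warehouse: str) -> list[str]:
    """Sketch the grant chain for a read-only role in Snowflake.

    USAGE on the warehouse, database, and schema is required before
    SELECT on the tables is usable; all identifiers are caller-supplied.
    """
    fq_schema = f"{database}.{schema}"
    return [
        f"CREATE ROLE IF NOT EXISTS {role}",
        f"GRANT USAGE ON WAREHOUSE {warehouse} TO ROLE {role}",
        f"GRANT USAGE ON DATABASE {database} TO ROLE {role}",
        f"GRANT USAGE ON SCHEMA {fq_schema} TO ROLE {role}",
        f"GRANT SELECT ON ALL TABLES IN SCHEMA {fq_schema} TO ROLE {role}",
    ]

# Example with invented names:
for stmt in role_grants("analyst_ro", "analytics", "marts", "analytics_wh"):
    print(stmt)
```

In practice roles are then granted to other roles in a hierarchy rather than directly to many users.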
5. Cost Management:
- Monitor and manage Snowflake's compute and storage costs.
- Help balance the trade-off between performance and cost by selecting the right compute and storage configurations.
- Implement best practices to keep the costs of scaling, storing, and querying data within an acceptable range.
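A common cost-control tool for the work above is a resource monitor that suspends warehouses near a credit quota. The sketch below composes such a statement; the monitor name and numbers are hypothetical:

```python
def resource_monitor_ddl(name: str, credit_quota: int, suspend_at_pct: int = 90) -> str:
    """Sketch a monthly resource monitor that suspends warehouses near quota.

    The quota resets each month; the trigger suspends assigned warehouses
    once the given percentage of the quota is consumed.
    """
    return (
        f"CREATE RESOURCE MONITOR IF NOT EXISTS {name} WITH "
        f"CREDIT_QUOTA = {credit_quota} FREQUENCY = MONTHLY "
        f"START_TIMESTAMP = IMMEDIATELY "
        f"TRIGGERS ON {suspend_at_pct} PERCENT DO SUSPEND"
    )

# Example: 100 credits/month, suspend at 90% consumed (invented values).
print(resource_monitor_ddl("monthly_cap", 100))
```

Pairing this with warehouse auto-suspend settings covers both runaway spend and idle-compute waste.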
6. Collaboration with Data Engineers and Analysts:
- Collaborate with data engineers to build and maintain efficient data pipelines.
- Work closely with data analysts and data scientists to ensure that the platform is optimized for their reporting and analytical needs.
- Guide teams in leveraging Snowflake's features like time travel, zero-copy cloning, and data sharing for efficient analysis.
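To illustrate two of the features named above, the strings below sketch a Time Travel query and a zero-copy clone (table names are invented):

```python
# Query the table as it existed one hour ago (Time Travel by offset,
# in seconds relative to now).
time_travel_query = "SELECT * FROM fact_sales AT (OFFSET => -3600)"

# Zero-copy clone: a metadata-only copy that shares the underlying
# storage until either side diverges. Useful for dev/test snapshots.
zero_copy_clone = "CREATE TABLE fact_sales_dev CLONE fact_sales"

print(time_travel_query)
print(zero_copy_clone)
```

Because clones share micro-partitions with the source, they are near-instant to create and incur storage cost only for changed data.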
7. Upgrading and Migration:
- Lead Snowflake upgrades and version control to ensure compatibility and stability.
- Support migration of legacy data platforms or on-prem solutions to Snowflake in cloud environments.
8. Automation and Scripting:
- Develop automation scripts to streamline recurring tasks and operations like backups, monitoring, and alerting.
- Use SQL, Python, or Snowflake's own SnowSQL for automating tasks and queries.
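Two common shapes for the automation above are a Snowflake TASK (server-side cron scheduling) and a non-interactive SnowSQL invocation from a script. The sketch below builds both as strings; task, warehouse, and connection names are hypothetical:

```python
import shlex


def scheduled_task_ddl(task: str, warehouse: str, cron: str, sql: str) -> str:
    """Sketch a Snowflake TASK that runs a statement on a cron schedule."""
    return (
        f"CREATE TASK IF NOT EXISTS {task} "
        f"WAREHOUSE = {warehouse} "
        f"SCHEDULE = 'USING CRON {cron} UTC' "
        f"AS {sql}"
    )


def snowsql_command(connection: str, query: str) -> str:
    """Build a non-interactive SnowSQL call: -c names a configured
    connection, -q passes a single query to run and exit."""
    return f"snowsql -c {connection} -q {shlex.quote(query)}"


# Example with invented names: a nightly rollup at 02:00 UTC, then the
# shell command that would resume the task.
print(scheduled_task_ddl("nightly_rollup", "etl_wh", "0 2 * * *",
                         "CALL refresh_rollups()"))
print(snowsql_command("prod", "ALTER TASK nightly_rollup RESUME"))
```

Newly created tasks start suspended, which is why the resume statement is part of the deployment step.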
Key Skills:
- Cloud expertise: Deep understanding of cloud environments, particularly AWS, Azure, or Google Cloud, since Snowflake runs on these platforms.
- Snowflake Knowledge: Strong hands-on experience with Snowflake architecture, optimization techniques, and security configurations.
- ETL/ELT tools: Familiarity with tools like Apache Airflow, dbt, Fivetran, or Talend.
- SQL proficiency: Advanced SQL skills for designing queries, optimizing performance, and ensuring data quality.
- Data Modeling: Ability to design efficient and scalable data models for analytics purposes.
- Security: Expertise in data security, user management, and data privacy regulations such as GDPR and HIPAA.
Tools and Technologies:
- Snowflake Services (Snowpipe, Data Sharing, etc.)
- ETL/ELT Tools (Fivetran, Talend, Apache Airflow, dbt)
- Cloud Platforms (AWS, Microsoft Azure, Google Cloud Platform)
- BI Tools (Tableau, Power BI, Looker, etc.)
- Monitoring Tools (Datadog, Snowflake's Resource Monitor)
Regards,
Saurav Kathariya | RADIANSYS INC. | US IT Recruiter | Email: | Cell No.:
Address: 39510 Paseo Padre Pkwy, Suite 110 Fremont, CA 94538