What are the responsibilities and job description for the Snowflake Data Architect position at Dutech?
· Design the overall data structure, ensuring that Snowflake features (e.g., data sharing, scalability, secure data exchange) are fully utilized to meet business requirements.
· Create a blueprint for how data will be stored, processed, and accessed within the Snowflake platform.
· Perform optimization of data pipelines and workflows for performance, scalability, and cost-efficiency.
· Design ETL (Extract, Transform, Load) or ELT (Extract, Load, Transform) processes, and optimize queries and data storage strategies.
· Integrate with other cloud services (e.g., AWS, Azure, GCP), third-party tools, and on-premises data systems.
· Design and implement strategies to control access to sensitive data, applying encryption, role-based access control, and data masking as necessary; illustrative SQL sketches for this and several related duties follow this list.
· Work closely with data engineers, data scientists, business analysts, and other stakeholders to understand their requirements and ensure the Snowflake environment meets those needs.
· Monitor the performance of the Snowflake environment, identify bottlenecks, and ensure optimal query performance.
· Automate administrative tasks using Snowflake SQL and scripting languages like Python or Shell scripting.
· Perform data loading using methods such as bulk loading with COPY INTO, Snowpipe for real-time ingestion, and external tables.
· Use Snowflake's cloning capabilities for databases and schemas.
· Configure and manage Snowflake virtual warehouses, including scaling, resizing, and auto-suspend/resume settings.
· Implement roles and privileges for secure access management using Snowflake RBAC (Role-Based Access Control).
· Integrate Snowflake SSO (Single Sign-On) and SCIM (System for Cross-domain Identity Management) for secure access and identity management.
· Configure alerts and monitor data pipeline failures, resource spikes, and cost thresholds.
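For the data-loading duty, a minimal sketch of the three paths is below. The stage (@raw_stage), target table (sales_raw), and file formats are hypothetical placeholders; actual paths and notification setup would depend on the environment.

```sql
-- Bulk load files already sitting in a stage:
COPY INTO sales_raw
  FROM @raw_stage/sales/
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- Snowpipe: the same COPY runs continuously as new files arrive
-- (AUTO_INGEST relies on cloud storage event notifications being configured):
CREATE PIPE sales_pipe AUTO_INGEST = TRUE AS
  COPY INTO sales_raw
  FROM @raw_stage/sales/
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- External table: query staged files in place without loading them:
CREATE EXTERNAL TABLE sales_ext
  LOCATION = @raw_stage/sales/
  AUTO_REFRESH = TRUE
  FILE_FORMAT = (TYPE = PARQUET);
```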
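Cloning in Snowflake is zero-copy, which is what makes spinning up full dev or test copies practical. A minimal sketch, with hypothetical database and schema names:

```sql
-- Zero-copy clone of an entire database, then of a single schema:
CREATE DATABASE analytics_dev CLONE analytics_prod;
CREATE SCHEMA analytics_prod.sandbox CLONE analytics_prod.reporting;

-- Combined with Time Travel, a clone can reflect a past point in time
-- (here: 24 hours ago, expressed as a negative offset in seconds):
CREATE DATABASE analytics_yesterday CLONE analytics_prod
  AT (OFFSET => -60*60*24);
```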
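Warehouse scaling, resizing, and auto-suspend/resume are all plain DDL settings. The warehouse name and sizes below are illustrative, and the multi-cluster settings additionally require Enterprise edition:

```sql
CREATE WAREHOUSE etl_wh
  WAREHOUSE_SIZE = 'MEDIUM'
  AUTO_SUSPEND = 60          -- suspend after 60 idle seconds to save credits
  AUTO_RESUME = TRUE         -- wake automatically on the next query
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 3      -- scale out under concurrent load
  SCALING_POLICY = 'STANDARD';

-- Resize in place when a workload needs more compute:
ALTER WAREHOUSE etl_wh SET WAREHOUSE_SIZE = 'LARGE';
```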
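RBAC and dynamic data masking together cover the access-control duties above. Every role, user, and object name in this sketch is a placeholder:

```sql
-- Grant a read-only role access down the object hierarchy:
CREATE ROLE analyst_role;
GRANT USAGE ON DATABASE analytics_prod TO ROLE analyst_role;
GRANT USAGE ON SCHEMA analytics_prod.reporting TO ROLE analyst_role;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics_prod.reporting TO ROLE analyst_role;
GRANT ROLE analyst_role TO USER jdoe;

-- Dynamic data masking: non-privileged roles see a redacted value:
CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val ELSE '***MASKED***' END;

ALTER TABLE analytics_prod.reporting.customers
  MODIFY COLUMN email SET MASKING POLICY email_mask;
```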
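Administrative automation can often stay inside Snowflake itself via scheduled tasks rather than external Python or shell scripts. A hypothetical nightly cleanup:

```sql
CREATE TASK purge_stale_staging
  WAREHOUSE = etl_wh
  SCHEDULE = 'USING CRON 0 3 * * * UTC'   -- every day at 03:00 UTC
AS
  DELETE FROM staging.events
  WHERE load_ts < DATEADD('day', -30, CURRENT_TIMESTAMP());

-- Tasks are created suspended and must be resumed explicitly:
ALTER TASK purge_stale_staging RESUME;
```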
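For the alerting and cost-threshold duty, resource monitors and alerts are the native building blocks. The credit quota, schedule, and notification integration name (ops_email_int) below are assumptions, not part of this posting:

```sql
-- Cost guardrail: notify at 80% of a monthly credit quota, suspend at 100%:
CREATE RESOURCE MONITOR monthly_cap WITH
  CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 80 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;

ALTER WAREHOUSE etl_wh SET RESOURCE_MONITOR = monthly_cap;

-- Hourly check for failed tasks, emailing the on-call address on a hit:
CREATE ALERT failed_tasks_alert
  WAREHOUSE = etl_wh
  SCHEDULE = '60 MINUTE'
  IF (EXISTS (
        SELECT 1
        FROM TABLE(INFORMATION_SCHEMA.TASK_HISTORY())
        WHERE STATE = 'FAILED'
          AND SCHEDULED_TIME > DATEADD('hour', -1, CURRENT_TIMESTAMP())))
  THEN CALL SYSTEM$SEND_EMAIL(
         'ops_email_int', 'oncall@example.com',
         'Snowflake task failures',
         'At least one task failed in the last hour; check TASK_HISTORY.');

ALTER ALERT failed_tasks_alert RESUME;  -- alerts also start suspended
```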
II. CANDIDATE SKILLS AND QUALIFICATIONS
Minimum Requirements:
Candidates who do not meet or exceed the minimum stated requirements (skills/experience) will be displayed to customers but may not be chosen for this opportunity.
| Years | Required/Preferred | Experience |
| --- | --- | --- |
| 8 | Required | Experience with data modeling, data integration, data warehousing, data governance, and data security |
| 8 | Required | Experience with Oracle and/or PostgreSQL in HA (high-availability) deployments, and expertise in data storage |
| 8 | Required | Proficiency in Snowflake architecture and its components |
| 8 | Required | Hands-on experience with Snowflake objects such as databases, procedures, tasks, and streams |
| 8 | Required | Expertise in using Snowflake's cloning capabilities for databases and schemas |
| 8 | Required | Proven experience in managing Snowflake warehouses and optimizing performance for efficient query execution |
| 8 | Required | Proficiency in Snowflake RBAC (Role-Based Access Control), including implementation of roles and privileges |
| 8 | Required | Experience integrating Snowflake SSO (Single Sign-On) and SCIM (System for Cross-domain Identity Management) for secure access and identity management |
| 8 | Required | Experience with data integration tools such as Informatica and ADF for seamless ETL/ELT processes |
| 8 | Required | Ability to automate administrative tasks using Snowflake SQL and scripting languages such as Python or shell |
| 8 | Required | Expertise in monitoring and troubleshooting Snowflake environments, including usage tracking and query profiling |
| 8 | Required | Strong understanding of Snowflake's security features such as data masking, encryption, and network policies |
| 8 | Required | Technical writing and diagramming skills, including proficiency with modeling and mapping tools (e.g., Visio, Erwin), the Microsoft Office Suite (Word, Excel, PowerPoint), and MS Project |
| 8 | Required | Experience on an agile sprint team |
| 8 | Required | Experience with JIRA software |
| 8 | Required | Experience working with multiple teams concurrently, with the ability to prioritize and complete work on time with high quality |
| 8 | Required | Knowledge of Informatica 10.5 |
| 8 | Required | Experience developing reports in Cognos Analytics 11.1 |
| 5 | Preferred | Familiarity with CI/CD pipelines and version control for managing Snowflake code deployments |
| 5 | Preferred | Prior experience in the healthcare industry |
| 5 | Preferred | Prior experience with an HHS agency |
| 5 | Preferred | Prior experience working with PII or PHI data |
| 5 | Preferred | Prior experience working with HL7 data |
| 5 | Preferred | Prior experience with Azure |
| 4 | Preferred | Bachelor's degree in Computer Science, Information Systems, or Business, or equivalent experience |
Job Types: Full-time, Contract
Pay: $70.00 - $75.00 per hour
Expected hours: 37.5 per week
Schedule:
- Day shift
- Monday to Friday
Application Question(s):
- Two professional references are required for the interview.
Work Location: On the road