What are the responsibilities and job description for the Data Architect (Snowflake) position at Triveni IT?
Job Summary
We are seeking a highly skilled Data Architect with extensive expertise in Snowflake for a part-time, short-term contract role. This position will focus on designing, building, and managing scalable data architectures for a specific project based in the Philadelphia area. The ideal candidate will have strong hands-on experience with Snowflake and be able to contribute to building innovative and high-performance data solutions. Please note that the first week of this role will require on-site presence, followed by a remote work setup.
Key Responsibilities
- Design and implement robust, scalable, and secure data architecture using Snowflake's cloud-based platform for a short-term project.
- Develop and optimize data models (conceptual, logical, and physical) to support project-specific analytics and reporting needs.
- Create and optimize ETL/ELT pipelines for ingesting, transforming, and loading data into Snowflake.
- Implement performance-tuning strategies to ensure efficient data processing, query performance, and cost optimization within Snowflake.
- Collaborate with project teams to integrate Snowflake with other data platforms, tools, and systems (e.g., AWS, Azure, GCP, Tableau, Power BI).
- Define and implement data governance, security policies, and compliance frameworks for data stored in Snowflake.
- Work closely with internal teams and stakeholders to ensure that data solutions meet project requirements.
- Oversee Snowflake accounts, monitor workloads, and troubleshoot any data-related issues that arise.
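As a small illustration of the ELT-style transformation work described above, the sketch below normalizes raw source records into a clean, typed shape before loading into Snowflake. All field names and the target ORDERS table are hypothetical; this is an illustrative sketch, not part of the role's actual codebase.

```python
from datetime import datetime, timezone

def transform_order(raw: dict) -> dict:
    """Normalize a raw order record into the shape expected by a
    hypothetical ORDERS table in Snowflake: typed id and amount,
    upper-cased currency code, and a UTC load timestamp for auditing."""
    return {
        "order_id": int(raw["id"]),
        "amount": round(float(raw["amount"]), 2),
        "currency": raw.get("currency", "usd").upper(),
        "loaded_at": datetime.now(timezone.utc).isoformat(),
    }

# Example: one raw row as it might arrive from a source system.
raw_rows = [{"id": "42", "amount": "19.990", "currency": "usd"}]
clean_rows = [transform_order(r) for r in raw_rows]
```

In practice a step like this would feed a staging load (e.g. via Snowflake's bulk-loading commands or an ELT tool such as dbt), with the heavier transformations pushed down into Snowflake SQL.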
Required Skills and Qualifications
- Proven experience as a Data Architect or in a similar role, with hands-on expertise in Snowflake.
- Strong knowledge of cloud-based platforms (e.g., AWS, Azure, GCP).
- Expertise in SQL and database management systems.
- Proficiency in data modeling, including dimensional and normalized models.
- Experience with ETL/ELT tools (e.g., Talend, Matillion, dbt, Informatica).
- Knowledge of data governance, security protocols, and compliance standards.
- Familiarity with BI tools (e.g., Tableau, Power BI, Looker) and their integration with Snowflake.
- Strong problem-solving and communication skills.
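To make the dimensional-modeling requirement concrete, here is a minimal star-schema sketch: one fact table holding foreign keys and an additive measure, joined to descriptive dimensions. The table and attribute names are hypothetical and purely illustrative.

```python
from dataclasses import dataclass

# Hypothetical star schema: a sales fact keyed to two dimensions.

@dataclass(frozen=True)
class DimCustomer:
    """Dimension table: descriptive attributes of a customer."""
    customer_key: int
    name: str
    region: str

@dataclass(frozen=True)
class DimDate:
    """Date dimension keyed by a yyyymmdd surrogate key."""
    date_key: int  # e.g. 20240115
    year: int
    month: int

@dataclass(frozen=True)
class FactSales:
    """Fact table: foreign keys into the dimensions plus an
    additive measure (amount) that can be summed in reports."""
    customer_key: int
    date_key: int
    amount: float

sale = FactSales(customer_key=1, date_key=20240115, amount=250.0)
```

In Snowflake these would be physical tables; BI tools such as Tableau or Power BI then join the fact to its dimensions for reporting.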
Preferred Qualifications
- Snowflake SnowPro Certification or an equivalent credential.
- Experience working with big data technologies (e.g., Spark, Hadoop).
- Hands-on experience with Python, Scala, or other programming languages.
- Familiarity with machine learning workflows and data pipelines.