SAP HANA & Snowflake Data Engineer :: Long Term Contract :: Portland, OR - Onsite.

GlobalPoint Inc
Portland, OR Contractor
POSTED ON 1/10/2025
AVAILABLE BEFORE 3/4/2025
Role: SAP HANA & Snowflake Data Engineer

Duration: Long Term Contract

Work Location & Type: Portland, OR - Onsite

NOTE: No sponsorship is available for this role; local candidates only.

Job Description

We are looking for a versatile SAP HANA & Snowflake Data Engineer with 5-8 years of overall IT software development experience. This role demands a balance of expertise in SAP HANA Development and Snowflake Data Engineering, with a strong passion for delivering high-quality, performance-driven solutions. The ideal candidate will have in-depth knowledge of both SAP HANA architecture and Snowflake, using their technical skill set to deliver cutting-edge solutions for our data and analytics needs.

Primary Responsibilities

SAP HANA Development (50% of Role):

  • HANA Ecosystem Development:
    • Build and optimize data solutions using the native SAP HANA ecosystem and its components.
    • Architect and develop high-performance SAP HANA solutions, focusing on data models, views, table functions, and procedures that efficiently process large data volumes.
    • Leverage the latest features of HANA v2.0, applying advanced data warehousing and analytics techniques to meet complex business requirements.
  • Performance Optimization:
    • Conduct in-depth performance analysis and optimization of SAP HANA models, ensuring maximum efficiency and scalability.
    • Utilize Explain/Viz Plan Analysis to diagnose and resolve performance bottlenecks.
  • Technologies & Tools:
    • Work with tools like SAP HANA Studio, Web IDE, Smart Data Integration (SDI), Smart Data Quality (SDQ), and BODS to build robust HANA solutions.
    • Implement advanced HANA features, such as XS Classic and XS Advanced development, the HANA Rules Framework, and Agile Data Prep.
    • Integrate with UI5 and SAP Analytics Cloud to create rich user experiences and dashboards.
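As a flavor of the HANA development work described above, a simple SQLScript table function might look like the following sketch. The schema, table, and column names here are purely illustrative, not part of any actual project:

```sql
-- Hypothetical SQLScript table function: aggregate order amounts by region
-- for a given year. Schema and object names are placeholders.
CREATE FUNCTION "DEMO"."TF_SALES_BY_REGION" (IN p_year INTEGER)
RETURNS TABLE (region NVARCHAR(40), total_amount DECIMAL(15, 2))
LANGUAGE SQLSCRIPT SQL SECURITY INVOKER AS
BEGIN
  RETURN
    SELECT region,
           SUM(amount) AS total_amount
    FROM   "DEMO"."ORDERS"
    WHERE  YEAR(order_date) = :p_year
    GROUP BY region;
END;
```

Functions of this kind are typically consumed by calculation views or reporting queries (e.g. `SELECT * FROM "DEMO"."TF_SALES_BY_REGION"(2024)`), which is where the Explain/Viz Plan analysis mentioned above comes into play.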
Snowflake Data Engineering (50% of Role):

  • Data Pipeline Development:
    • Design, develop, and maintain scalable data pipelines in Snowflake, enabling efficient data ingestion, transformation, and storage. This includes building pipelines for structured, semi-structured, and unstructured data.
    • Leverage DBT (Data Build Tool) to define and execute complex data transformations within Snowflake. This includes creating modular SQL scripts for transforming raw data into clean, usable datasets that align with business needs. Experience in defining reusable DBT models to enforce data standards and improve productivity is crucial.
    • Develop, manage, and optimize ETL/ELT workflows using Matillion. This includes configuring Matillion jobs to handle data extractions, loading, and transformations while ensuring pipeline reliability, scalability, and performance. You will be expected to create complex workflows, manage dependencies, and schedule jobs using Matillion’s orchestration features.
  • Data Warehousing & Performance Optimization:
    • Design and implement optimized data warehousing solutions within Snowflake by leveraging multi-cluster warehouses, data partitioning, and clustering keys. You will be responsible for ensuring that data retrieval and processing are optimized for both performance and cost efficiency.
    • Use DBT to structure data transformations efficiently, building incremental models to handle large datasets. You will apply performance tuning techniques, such as query optimization, caching, and materialized views, to ensure the Snowflake environment remains performant as data scales.
    • Monitor and troubleshoot Matillion workflows to ensure that ETL/ELT processes are executing correctly and efficiently, making necessary adjustments to improve performance.
  • Collaboration & Best Practices:
    • Collaborate with data architects, analysts, and other stakeholders to design data models and pipelines that align with business goals and adhere to best practices. You will help implement data governance policies, ensuring that data pipelines are secure, reliable, and compliant.
    • Apply version control practices in both DBT and Matillion, ensuring that development efforts are properly documented, reversible, and aligned with team workflows.
    • Advocate for and contribute to the continuous improvement of the data engineering process, including driving efforts to implement CI/CD pipelines for Snowflake development using DBT and Matillion.
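To illustrate the DBT work described above, an incremental model of the kind this role would build might look like the following sketch. The source, key, and column names are hypothetical, chosen only to show the pattern:

```sql
-- Hypothetical dbt incremental model (e.g. models/orders_clean.sql).
-- Source, unique key, and column names are placeholders.
{{ config(
    materialized = 'incremental',
    unique_key   = 'order_id',
    cluster_by   = ['order_date']
) }}

select
    order_id,
    customer_id,
    order_ts,
    cast(order_ts as date) as order_date,
    amount
from {{ source('raw', 'orders') }}

{% if is_incremental() %}
  -- On incremental runs, only process rows newer than what is
  -- already present in the target table.
  where order_ts > (select max(order_ts) from {{ this }})
{% endif %}
```

An incremental model like this avoids full-table rebuilds as data volumes grow, and the `cluster_by` hint lines up with the Snowflake clustering-key tuning mentioned above.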
Required Skills & Qualifications

  • Technical Expertise:
    • 5-8 years of hands-on experience in SAP HANA development and Snowflake Data Engineering.
    • Proven ability to design, develop, and optimize data solutions in both SAP HANA and Snowflake environments.
    • Strong proficiency in DBT (Data Build Tool) and Matillion, with experience in building modular and scalable data transformation pipelines.
    • In-depth knowledge of Snowflake’s architectural features, including multi-cluster warehouses, query performance tuning, and data modeling.
  • Technologies & Tools:
    • SAP HANA: Hands-on experience with Web IDE, BODS, SQL/SQLScript, and OData/REST web services. Familiarity with UI5, SAP Analytics Cloud, and advanced HANA modules such as the PAL/APL libraries, Data Masking, and XS Advanced is a plus.
    • Snowflake: Expertise in DBT, Matillion, SQL, and other ETL tools used for data transformation and pipeline development. Strong knowledge of Snowflake performance optimization techniques is required.
  • Other Qualifications:
    • Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent experience.
    • Strong communication and collaboration skills, with the ability to work effectively across teams.
    • Passion for learning and implementing new technologies.
