What are the responsibilities and job description for the Principal Data Engineer position at OneTrust?
The Challenge
Principal Data Engineers will be part of an innovative data team that enables Marketing, Product, Sales, and Finance to explore data and take actions that differentiate us from our competition.
Your Mission
You will work closely with other team members like data architects and business analysts to understand what the business is trying to achieve, move data from source to target, and design optimal data models.
Design and build facts, dimensions, snapshots, and slowly changing dimensions (SCDs) in Snowflake using DBT, Airflow, and other ELT tools (see the orchestration sketch after this list)
Design Enterprise Data Models for ease of data access, accuracy, and scale
Develop ELT pipelines and implement development best practices for very large and wide data sets
Work effectively in Scrum with multiple team members to deliver analytical solutions
Solve immediate production issues with an eye toward long-term fixes
Drive technical conversations with stakeholders to explore all facets of the problem and its solution
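To give a concrete flavor of this work, below is a minimal sketch of how such a build might be orchestrated: an Airflow DAG that runs dbt snapshots (which capture SCD Type 2 history) before building and testing the warehouse models. This is an illustrative assumption, not OneTrust's actual pipeline; the DAG id, schedule, and project path are hypothetical, and it presumes Airflow 2.x with the dbt CLI installed on the worker.

```python
# Hypothetical sketch: orchestrating a daily dbt build from Airflow 2.x.
# The DAG id, schedule, and --project-dir path are placeholder assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_build",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Capture SCD Type 2 history first, so facts and dimensions are
    # rebuilt against a complete change log of the source tables.
    dbt_snapshot = BashOperator(
        task_id="dbt_snapshot",
        bash_command="dbt snapshot --project-dir /opt/dbt/warehouse",
    )
    # Build facts, dimensions, and other models defined in the dbt project.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/warehouse",
    )
    # Run schema and data tests so bad loads fail the DAG, not the dashboards.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/warehouse",
    )

    dbt_snapshot >> dbt_run >> dbt_test
```

Ordering snapshots before the run ensures slowly changing dimension history is recorded before downstream models consume it, and the trailing test task keeps data quality gates inside the pipeline rather than in ad hoc queries.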
You Are
This hands-on technical role demands excellent knowledge of industry best practices and the ability to demonstrate them.
Your Experience Includes
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
8 years of experience with very large-scale data warehouse projects
Experience working in a high-growth, fast-paced environment
Prior experience working with Snowflake/Redshift, AWS, and S3
Experience in building ETL pipelines using DBT
Prior experience in designing data models with Salesforce, Workday, Marketo, and Product Usage logs
Experience designing data warehouse models for Product Usage, Deferred Revenue, and Revenue Metrics
Very comfortable designing facts, dimensions, snapshots, and SCDs
Strong data warehouse architecture knowledge and the ability to write complex SQL for processing raw data, data validation, and QA (see the validation sketch after this list)
Strong experience with Python and manipulating various data formats for extraction and transformation
Experience working with the data orchestration tool Apache Airflow
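As an illustration of the validation and QA side of the role, here is a minimal Python sketch of a row-count reconciliation between a raw source table and the fact table derived from it. It assumes the snowflake-connector-python package with credentials supplied via environment variables; the table names (RAW.SALESFORCE.OPPORTUNITY, ANALYTICS.FCT_OPPORTUNITY) are hypothetical placeholders, not actual OneTrust objects.

```python
# Hypothetical QA sketch: reconcile row counts between source and target.
import os

import snowflake.connector  # pip install snowflake-connector-python


def row_count(cursor, table: str) -> int:
    # `table` is a trusted constant here, not user input, so simple
    # interpolation is acceptable for an illustrative check.
    cursor.execute(f"SELECT COUNT(*) FROM {table}")
    return cursor.fetchone()[0]


conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
)
try:
    cur = conn.cursor()
    source = row_count(cur, "RAW.SALESFORCE.OPPORTUNITY")  # raw extract
    target = row_count(cur, "ANALYTICS.FCT_OPPORTUNITY")   # modeled fact
    # A mismatch means the ELT dropped or duplicated rows; fail loudly.
    assert source == target, f"Row count mismatch: {source} vs {target}"
finally:
    conn.close()
```

A check like this would typically run as a dbt test or a post-load Airflow task, so load problems surface in the pipeline before they reach reporting.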