What are the responsibilities and job description for the DBT/Snowflake/Azure Data Engineer w/ Retail Experience -Hybrid in Pleasanton, CA (ONSITE 3x a week) position at Cerebra Consulting Inc?
Job Details
Cerebra Consulting Inc is a System Integrator and IT Services Solution provider with a focus on Big Data, Business Analytics, Cloud Solutions, Amazon Web Services, Salesforce, Oracle EBS, PeopleSoft, Hyperion, Oracle Configurator, Oracle CPQ, Oracle PLM, and Custom Application Development. Utilizing solid business experience, industry-specific expertise, and proven methodologies, we consistently deliver measurable results for our customers. Cerebra has partnered with leading enterprise software companies and cloud providers such as Oracle, Salesforce, and Amazon, and is able to leverage these partner relationships to deliver high-quality, end-to-end customer solutions targeted to the needs of each customer.
Hello,
DBT/Snowflake/Azure Data Engineer w/ Retail Experience
Hybrid in Pleasanton, CA (ONSITE 3x a week)
3-month contract-to-hire
- Will be the onshore lead for two teams: Supply Chain and Merchandising
- 12-15 years of experience
- Architect- or Senior Engineer-level, but this person will be doing development work
- Hands-on experience required; not looking for a manager
- Cloud data warehousing
- Snowflake
- Azure
- AWS and Google Cloud Platform are secondary; Snowflake is primary
- Python, SQL
- DBT Experience
- Previous retail experience (previous Albertsons experience would be great!)
- Communication is key; this person will be speaking to stakeholders
- Will need to present to leadership
- Strong communication skills
- Everyone's resume says every cloud, so demonstrated hands-on depth matters
- The work is functionally split within the retail analytics side
- This person will work within a hub-and-spoke model spanning supply chain and merchandising
- There are two domains, supply chain and store sites; lead engineers review work from others on the team
GENERAL PURPOSE:
The Data Engineer III plays a critical role in engineering data solutions that support Ross reporting and analytics needs. As a key member of the Data Engineering team, this engineer will work with diverse data technologies such as StreamSets, dbt, DataOps tooling, and others to build insightful, scalable, and robust data pipelines that feed our various analytics platforms.
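To give a flavor of the dbt work involved, below is a minimal sketch of a dbt Python model materialized on Snowflake (dbt-snowflake 1.3+ runs Python models via Snowpark). The model name, upstream staging model, and column names (stg_sales, STORE_ID, SALE_DATE, SALES_AMOUNT) are hypothetical illustrations, not details from this posting.

    # models/marts/store_sales_daily.py -- hypothetical dbt Python model (dbt-snowflake)
    from snowflake.snowpark.functions import col, sum as sum_

    def model(dbt, session):
        # Materialize the result as a table in Snowflake; dbt manages the DDL.
        dbt.config(materialized="table")

        # dbt.ref() returns the upstream model as a Snowpark DataFrame.
        sales = dbt.ref("stg_sales")  # hypothetical staging model

        # Illustrative transformation: daily sales totals per store.
        return (
            sales.group_by(col("STORE_ID"), col("SALE_DATE"))
                 .agg(sum_(col("SALES_AMOUNT")).alias("TOTAL_SALES"))
        )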
ESSENTIAL FUNCTIONS:
- Design and model data engineering pipelines that support Ross reporting and analytics needs
- Engineer efficient, adaptable, and scalable data pipelines for moving data from different sources into our Cloud Lakehouse (a minimal load sketch follows this list)
- Understand and analyze business requirements and translate them into well-architected solutions that showcase the modern BI & Analytics platform
- Participate in data modernization projects, providing direction on overall design and technical direction and acting as the primary driver toward establishing guidelines and approaches
- Develop and deploy performance optimization methodologies
- Drive timely and proactive issue identification, escalation, and resolution
- Collaborate effectively with Data Technology and Business Information teams to design and build optimized data flows from source to data visualization
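As one illustration of such a pipeline step, here is a minimal sketch of loading staged files into a Snowflake table with the snowflake-connector-python package. The connection parameters, warehouse, database, schema, stage, and table names are hypothetical placeholders, not details from this posting.

    # Minimal sketch: ingest staged Parquet files into Snowflake via COPY INTO.
    import os
    import snowflake.connector

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",  # hypothetical warehouse
        database="LAKEHOUSE",      # hypothetical database
        schema="RAW",              # hypothetical schema
    )
    try:
        cur = conn.cursor()
        # COPY INTO loads files already placed on a named stage into a table.
        cur.execute("""
            COPY INTO RAW.SUPPLY_CHAIN_ORDERS   -- hypothetical target table
            FROM @LANDING_STAGE/orders/         -- hypothetical stage path
            FILE_FORMAT = (TYPE = 'PARQUET')
            MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
        """)
    finally:
        conn.close()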
QUALIFICATIONS AND SPECIAL SKILLS REQUIRED:
- 12+ years of in-depth data engineering experience, including the execution of data pipelines, DataOps, scripting, and SQL queries
- 5+ years of proven data architecture experience: demonstrable accountability for data standards and for designing data models for data warehousing and modern analytics use cases (e.g., from operational data store to semantic models)
- At least 3 years of experience with modern data architectures that support advanced analytics, including Snowflake and Azure; experience with Snowflake and other cloud data warehouse / data lake platforms preferred
- Expert in engineering data pipelines using ETL/ELT and big data technologies (Hive, Spark) on large-scale data sets, demonstrated through years of experience
- 5+ years of hands-on data warehouse design, development, and data modeling following best practices for modern data architectures
- Highly proficient in at least one of these programming languages: Java, Python
- Experience with modern data modeling and data preparation tools
- Experience publishing data lineage and technical glossary entries from data pipelines into data catalog tools
- Highly proficient in data analysis, including analyzing SQL, Python scripts, and ETL/ELT transformation scripts
- Highly skilled in data orchestration, with experience in tools like Control-M and Apache Airflow (see the Airflow sketch after this list); hands-on DevOps/DataOps experience required
- Knowledge of or working experience with reporting tools such as MicroStrategy and Power BI would be a plus
- Self-driven individual with the ability to work independently or as part of a project team
- Experience working in an Agile environment preferred; familiarity with the retail domain preferred
- Experience with StreamSets and dbt preferred
- Strong communication skills are required, with the ability to give and receive information, explain complex information in simple terms, and maintain a strong customer-service approach with all users
- Bachelor's degree in Computer Science, Information Systems, Engineering, Business Analytics, or Business Management required
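For orchestration context, here is a minimal sketch of an Apache Airflow DAG (Airflow 2.4+) that runs and then tests a dbt project on a daily schedule. The DAG id, schedule, and project path are hypothetical placeholders, not details from this posting.

    # Minimal sketch: a daily Airflow DAG orchestrating dbt run and dbt test.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="retail_analytics_dbt_daily",  # hypothetical DAG id
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="cd /opt/dbt/retail_project && dbt run",  # hypothetical path
        )
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="cd /opt/dbt/retail_project && dbt test",
        )
        dbt_run >> dbt_test  # run models first, then run tests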
Thanks
Sai Revanth
revanth.patnala