What are the responsibilities and job description for the Snowflake Data Engineer position at SRS?
Job Details
SRS Distribution, a wholly owned subsidiary of The Home Depot, currently operates under a family of distinct local brands encompassing more than 760 locations across 47 states.
Where you'll work:
- This is an onsite position based at our corporate office.
What You'll do:
We are seeking a Senior Data Engineer with a unique combination of business acumen, technical skills, and vision to translate innovative ideas into world-class technical solutions using modern technologies.
In this role, you will have an opportunity to be part of one of the fastest-growing organizations and take part in shaping industry-leading initiatives.
The Senior Data Engineer is responsible for building enterprise data engineering solutions using our cloud-based data platform (Snowflake). This is a hands-on role and will entail day-to-day technical deliverables and participation in design, delivery, and support for data engineering workloads in a fast-paced enterprise environment.
Business Engagement:
- This position requires the business acumen to interface directly with key stakeholders to understand their problems and the technical skills to deliver data that supports their needs.
- Work closely with business teams daily to meet ad hoc data requests.
- Develop a solid understanding of various business domains, and build or enhance data solutions that meet business needs.
Data Pipelines Development:
- Create automated pipelines to ingest and process structured and unstructured data from source systems into analytical platforms, using batch and streaming mechanisms that leverage cloud-native toolsets (a minimal sketch follows this list).
- Experience leading data modeling conversations and developing logical data models for an ecosystem of large datasets in structured, semi-structured, and unstructured formats.
- This role will require extensive experience in all SDLC phases and the ability to lead solution discussions with cross-functional teams for complex data projects.
- Expertise in the delivery, management, and business adoption of large datasets across a variety of data platforms, sources, and formats
- Develop, enhance, and provide recommendations for end-to-end monitoring capabilities of cloud data platforms.
- Automate the deployment of services and integrate new patterns that will allow us to scale as we ingest new data sets and deploy new capabilities.
- Lead business requirement gathering discussions and create technical specifications.
- Build in-depth knowledge across subject areas and capture documentation for end-to-end data flows/lineage.
- Serve as a key resource and technical SME for guiding the maintenance and future progression of the data warehouse and its architectural integrity.
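As a concrete illustration of the kind of pipeline work described above, here is a minimal sketch of a batch ingestion flow on Snowflake using the snowflake-connector-python package. All object names (database, schemas, stage, tables, warehouse) are hypothetical placeholders and not part of this posting; credentials are assumed to come from environment variables.

```python
# Hypothetical batch-ingestion sketch for Snowflake; every object name is a placeholder.
import os

import snowflake.connector

PIPELINE_STATEMENTS = [
    "CREATE SCHEMA IF NOT EXISTS raw",
    "CREATE SCHEMA IF NOT EXISTS curated",
    # Landing table: a single VARIANT column holds the raw semi-structured payload.
    "CREATE TABLE IF NOT EXISTS raw.orders_json (payload VARIANT)",
    "CREATE TABLE IF NOT EXISTS curated.orders ("
    " order_id NUMBER, customer_id NUMBER, amount NUMBER(12, 2), loaded_at TIMESTAMP_LTZ)",
    # Bulk-load new JSON files from an external stage (assumed to already exist).
    """COPY INTO raw.orders_json
       FROM @raw.orders_stage
       FILE_FORMAT = (TYPE = 'JSON')
       ON_ERROR = 'SKIP_FILE'""",
    # A stream plus a scheduled task hands new rows off to the curated table
    # incrementally, without re-reading the whole landing table.
    "CREATE STREAM IF NOT EXISTS raw.orders_json_stream ON TABLE raw.orders_json",
    """CREATE TASK IF NOT EXISTS raw.load_orders_curated
         WAREHOUSE = etl_wh
         SCHEDULE = '5 MINUTE'
         WHEN SYSTEM$STREAM_HAS_DATA('raw.orders_json_stream')
       AS
         INSERT INTO curated.orders (order_id, customer_id, amount, loaded_at)
         SELECT payload:order_id::NUMBER,
                payload:customer_id::NUMBER,
                payload:amount::NUMBER(12, 2),
                CURRENT_TIMESTAMP()
         FROM raw.orders_json_stream""",
    "ALTER TASK raw.load_orders_curated RESUME",
]


def run_pipeline() -> None:
    """Run the load statements in order against a hypothetical 'analytics' database."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="etl_wh",
        database="analytics",
    )
    try:
        cur = conn.cursor()
        for stmt in PIPELINE_STATEMENTS:
            cur.execute(stmt)
        cur.close()
    finally:
        conn.close()


if __name__ == "__main__":
    run_pipeline()
```

In an orchestration tool such as Matillion or an in-house scheduler, each statement would typically be its own step; a single script is used here only to keep the sketch self-contained.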
Ways of Working and Collaboration:
- Self-starter mindset. Ability to take an idea and then build a use case for creating data integrations.
- Strong communication and management skills to interface with business and technology teams
- Participate in educating and cross-training other team members.
- Participate in daily stand-up calls and provide clear visibility to work products
- Support our analyst and data science community by curating and joining data sets, flattening hierarchies, and general data wrangling to assist in delivering analytic use cases (see the flattening sketch after this list).
- Create documentation to support the transition to production engineering delivery once the use cases are proven.
- On-call support for production applications.
- Stay informed on the latest product features and modern and emerging data technologies and methodologies.
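As one small example of the "flattening hierarchies" support work mentioned in the list above, the following hypothetical sketch exposes nested JSON line items as a flat, analyst-friendly view using Snowflake's LATERAL FLATTEN. It reuses the placeholder raw.orders_json landing table from the earlier sketch; the view and field names are assumptions.

```python
# Hypothetical sketch: flatten nested JSON line items into an analyst-friendly view.
import os

import snowflake.connector

FLATTEN_VIEW_SQL = """
CREATE OR REPLACE VIEW curated.order_lines_flat AS
SELECT
    o.payload:order_id::NUMBER         AS order_id,
    o.payload:customer:name::STRING    AS customer_name,
    li.value:sku::STRING               AS sku,
    li.value:qty::NUMBER               AS quantity,
    li.value:unit_price::NUMBER(12, 2) AS unit_price
FROM raw.orders_json o,
     LATERAL FLATTEN(input => o.payload:line_items) li
"""

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="etl_wh",
    database="analytics",
)
try:
    cur = conn.cursor()
    cur.execute(FLATTEN_VIEW_SQL)
    # Analysts can then work in plain SQL, e.g.:
    #   SELECT sku, SUM(quantity) FROM curated.order_lines_flat GROUP BY sku;
    cur.close()
finally:
    conn.close()
```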
What we look for:
- BS in Computer Science or a related field
- 6 years of experience in data engineering and analytics.
- 3 years of experience working with data processing frameworks and tools. Exposure to software engineering concepts such as parallel data processing, data flows, REST APIs, JSON, XML, and microservice architectures.
- 1 year of experience working with big data processing frameworks and tools.
- Proven experience in Snowflake (SnowPro certification preferred).
- Experience using Matillion and the ELT methodology (a brief ELT sketch follows this list).
- Prior experience with SQL Server (SSIS/SSMS) would be a plus.
- Experience with at least one cloud platform (AWS, Azure, Google Cloud), specifically leveraging cloud-native data services.
- Hands-on experience working with code repositories, CI/CD tools, and Agile methodology.
- Experience supporting Production applications or workloads in a cloud-based environment.
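For context on the ELT methodology mentioned in the list above: data is first loaded into the warehouse in raw form, and transformations then run as SQL inside Snowflake rather than on an external ETL server; tools such as Matillion primarily orchestrate these in-warehouse steps. A minimal, hypothetical sketch of one such transform step follows (the table and column names are assumptions, and order_id is assumed to be unique in the raw table):

```python
# Hypothetical ELT "transform" step: the raw data is already loaded, so the
# transformation is plain SQL pushed down to Snowflake's compute.
import os

import snowflake.connector

TRANSFORM_STATEMENTS = [
    """CREATE TABLE IF NOT EXISTS curated.daily_sales (
           order_id NUMBER, order_date DATE, amount NUMBER(12, 2))""",
    # Upsert from the raw landing table into the curated reporting table.
    """MERGE INTO curated.daily_sales t
       USING (
           SELECT payload:order_id::NUMBER      AS order_id,
                  payload:order_date::DATE      AS order_date,
                  payload:amount::NUMBER(12, 2) AS amount
           FROM raw.orders_json
       ) s
       ON t.order_id = s.order_id
       WHEN MATCHED THEN UPDATE SET order_date = s.order_date, amount = s.amount
       WHEN NOT MATCHED THEN INSERT (order_id, order_date, amount)
                             VALUES (s.order_id, s.order_date, s.amount)""",
]

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="etl_wh",
    database="analytics",
)
try:
    cur = conn.cursor()
    for stmt in TRANSFORM_STATEMENTS:
        cur.execute(stmt)
    cur.close()
finally:
    conn.close()
```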
Nice to Haves:
- Good understanding of and experience with messaging platforms (Kafka, Kinesis, Azure Event Hub, pub/sub-based streaming platforms, etc.)
- Prior working experience with a data science workbench.
- Knowledge of machine learning pipelines (e.g., train/test splitting, scoring processes).
- Background in Accounting or Pricing would be a plus.
Salary: $140,000