What are the responsibilities and job description for the SENIOR DATA ENGINEER III position at Chewy?
Chewy is seeking a Senior Data Engineer III in Boston or Minneapolis. This person will be part of the Customer Interaction Datastore team, responsible for building a customer data platform in support of critical Enterprise Priorities. The ideal candidate will have an interest in building data pipelines, an eye for data quality, and potentially a curiosity about or experience building APIs. Additionally, the candidate will have a strong customer-first mindset, embody a curious, think-big approach to their work to help further innovation within the team, and be an engaged and respectful team member.
- What You'll Do:
- Develop and maintain complex data ingestion pipelines and transformations for data originating from multiple data sources (structured/unstructured)
- Assist in crafting proofs of concept and advise, consult, mentor, and coach other data engineering and analytics professionals on data standards.
- Catalog and document data sources
- Monitor data pipelines for accuracy, missing data, enhancements, changes, and billing volumes to ensure all data is assembled and processed accurately and on time
- Build containerized applications with microservices architecture
- Reconcile data issues and alerts between various systems, finding opportunities to innovate and drive improvements.
- Work with cross-functional partners in defining and documenting requirements for building high-quality and impactful data products.
- Create operational reports using visualization / business intelligence tools.
- What You'll Need:
- 8 years of proven experience in Data Engineering or Business Analytics roles working with ETL, Data Modeling, and Data Architecture, developing modern data pipelines and applications
- Expertise crafting and implementing enterprise data pipelines using data engineering approaches and tools including but not limited to: Spark, PySpark, Scala, Docker, Databricks, Glue, cloud-native EDW (Snowflake, Redshift), Kafka, Athena
- Strong dimensional data modeling (Star, Snowflake) and ER modeling skills
- Proficiency building and maintaining infrastructure as code, preferably with Terraform and the AWS ecosystem
- Proficiency in Java, Python, SQL
- Experience with writing and reviewing version-controlled code (GitHub)
- Experience effectively presenting insights and summarizing complex data for diverse audiences through visualizations.
- A self-starter with the ability to take initiative and drive projects forward independently.
- Experience working with and delivering to collaborators from multiple parts of the company.
- Bonus:
- Some experience with API development
- Some experience with technologies like GraphQL, graph databases
- Chewy is committed to equal opportunity. We value and embrace diversity and inclusion of all Team Members. If you have a disability under the Americans with Disabilities Act or similar law, and you need an accommodation during the application process or to perform these job requirements, or if you need a religious accommodation, please contact CAAR@chewy.com.
- If you have a question regarding your application, please contact HR@chewy.com.
- To access Chewy's Customer Privacy Policy, please click here (https://www.chewy.com/app/content/privacy). To access Chewy's California CPRA Job Applicant Privacy Policy, please click here (https://chewyinc.phenompro.com/us/en/privacy-policy).