What are the responsibilities and job description for the Data Engineer I position at Numerator?
Numerator is seeking a Data Engineer I to play a pivotal role in driving decision-making, uncovering new opportunities, and enhancing our rapidly evolving platforms. In this role, you will lead initiatives to automate, enhance, maintain, and scale services in a dynamic, fast-growing environment.
As a Data Engineer I, you’ll be instrumental in delivering data products, analytics, and models efficiently and independently. This cross-functional role involves designing and developing robust data pipelines, building infrastructure to evaluate and deploy data engineering models, and collaborating closely with software engineers. Your work will have a broad impact across Numerator as you expand and enhance our technology platforms, which span multiple software products. This role offers high visibility, growth, and the opportunity to drive impactful projects from concept to production.
What You'll Do:
Collaborate with Product, Analytics, Data Science, and Engineering teams to build or enhance data products while ensuring adherence to data quality standards
Lead complex, end-to-end projects focused on improving data quality and ensuring statistical models (e.g., sampling, segmentation, classification, predictive modeling) are supported by clean and validated data
Design and develop pipelines that enforce data validation, quality checks, and best practices for integrating Data Science models into customer-facing products
What You'll Bring:
1+ years of experience designing data warehouses, building data pipelines, or working in data-intensive engineering roles with a strong focus on data quality
Proficiency in a major programming language (preferably Python) and SQL with experience in implementing data validation and transformation processes
Expertise in data modeling and ETL design (especially Airflow and dbt), ensuring that data transformations meet business goals while maintaining integrity and quality
Experience designing and deploying cloud-based production solutions (AWS, Azure, or GCP), ensuring data quality is maintained across environments
Strong attention to detail, intellectual curiosity, and a commitment to delivering high-quality data in a fast-paced, collaborative environment
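As a rough illustration of the validation-focused pipeline work these requirements describe, a quality-check step might look like the sketch below. The field names, rules, and thresholds are hypothetical examples, not Numerator's actual pipeline logic:

```python
# Hypothetical data-validation step: split incoming records into
# valid and rejected sets based on simple quality rules.
# Field names ("user_id", "amount") and rules are illustrative assumptions.

def validate_rows(rows, required_fields=("user_id", "amount")):
    """Return (valid, rejected) lists; rejected rows carry a reason string."""
    valid, rejected = [], []
    for row in rows:
        # Rule 1: all required fields must be present and non-null
        if any(row.get(f) is None for f in required_fields):
            rejected.append((row, "missing required field"))
            continue
        # Rule 2: amounts must be non-negative numbers
        if not isinstance(row["amount"], (int, float)) or row["amount"] < 0:
            rejected.append((row, "invalid amount"))
            continue
        valid.append(row)
    return valid, rejected

valid, rejected = validate_rows([
    {"user_id": 1, "amount": 9.99},
    {"user_id": None, "amount": 5.00},
    {"user_id": 2, "amount": -3},
])
```

In a production pipeline, a step like this would typically run as an Airflow task, with rejected rows routed to a quarantine table and surfaced in data-quality monitoring rather than silently dropped.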
Nice to haves
Familiarity with Amazon Web Services (EC2, RDS, EKS)
Experience with Terraform and/or Ansible (or similar) for infrastructure deployment
Familiarity with Airflow, including building and monitoring DAGs and developing custom operators
Familiarity with dbt and Airbyte for data transformation and migration
Familiarity with Snowflake, Databricks, or another data warehouse
Exposure to containerized services (Docker/Kubernetes)
Experience working with marketing insights, shopping data, or in the retail industry
What we offer you
An inclusive and collaborative company culture - we work in an open environment, collaborating to get things done and adapting to changing needs as they arise.
An opportunity to make an impact at a data-driven technology company.
Ownership over platforms and environments in an industry-leading product.
Market-competitive total compensation package.
Volunteer time off and charitable donation matching.
Strong support for career growth, including mentorship programs, leadership training, access to conferences, and employee resource groups.
Regular hackathons to build your own projects, plus Engineering Lunch and Learns.
Great benefits package including health/vision/dental coverage, unlimited PTO, a flexible schedule, RRSP matching, travel reimbursement, and more.
An opportunity to work in a fast-moving growth company.