What are the responsibilities and job description for the Senior Data Engineer position at Smiley Technologies, Inc.?
Job Type
Full-time
Description
Summary:
The Senior Data Engineer will be integral in designing, implementing, and supporting our Azure cloud-based analytics stack, focusing on Azure Data Factory, Databricks, DBT, and Jinja. The role also requires a deep understanding of OLAP Kimball methodology and experience with on-premises MS SQL Server data warehouses. This position involves collaborating with technical and business stakeholders to meet operational and strategic reporting and analytical needs.
Responsibilities
- Data Warehouse Design & Architecture:
- Provide expertise in the design, architecture, and integration of data warehouses, focusing on Azure cloud analytics stack.
- Define processes to ensure alignment with business goals and objectives through robust data warehouse architecture and design.
- Data Modeling & Development:
- Develop conceptual, logical, and physical data models.
- Design and develop new data structures in OLTP and OLAP databases.
- Build and maintain ETL pipelines using Azure Data Factory and other relevant tools.
- Data Transformation & Integration:
- Utilize Databricks, DBT, and Jinja for data sourcing, transformation, and integration.
- Perform impact analysis on data structure changes and ensure compliance with database standards, including SPII, PCI DSS, and data security.
- Performance & Optimization:
- Review and optimize SQL queries for performance improvements.
- Manage automated database job schedules and support global metadata data dictionary repositories.
- Identify and resolve logical issues between database models and reports.
- Collaboration & Mentorship:
- Provide cross-training and mentorship to team members.
- Collaborate with other teams on infrastructure changes and provide after-hours support as part of an on-call rotation.
- Technology Adoption & Innovation:
- Stay updated on new technology trends and advise on potential adoption to maintain a competitive edge.
- Manage database project changes using Git branching and integrations.
- Perform other duties as assigned.
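To illustrate the Kimball-style dimensional modeling this role centers on, here is a minimal star-schema sketch in Python with SQLite. All table and column names are hypothetical, chosen only for the example; a real warehouse would live in MS SQL Server or Azure Synapse, not SQLite.

```python
import sqlite3

# In-memory database standing in for a warehouse (illustrative only).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: one row per customer, keyed by a surrogate key
# (a core Kimball convention).
cur.execute("""
    CREATE TABLE dim_customer (
        customer_key  INTEGER PRIMARY KEY,
        customer_name TEXT,
        region        TEXT
    )
""")

# Fact table: grain = one row per sale, with a foreign key to the dimension.
cur.execute("""
    CREATE TABLE fact_sales (
        sales_key    INTEGER PRIMARY KEY,
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        amount       REAL
    )
""")

cur.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                [(1, "Acme", "South"), (2, "Globex", "North")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 1, 100.0), (2, 1, 50.0), (3, 2, 75.0)])

# A typical star-schema rollup: aggregate facts by a dimension attribute.
rows = cur.execute("""
    SELECT d.region, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer d ON d.customer_key = f.customer_key
    GROUP BY d.region
    ORDER BY d.region
""").fetchall()
print(rows)  # [('North', 75.0), ('South', 150.0)]
```

The same fact/dimension separation scales to the conformed dimensions and slowly changing dimensions that Kimball methodology prescribes.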
Requirements
Required Knowledge, Skills, and Abilities:
- Expert knowledge of OLAP dimensional database design, specifically using Kimball methodology.
- Advanced SQL programming skills and proficiency in MS SQL Server, SSIS, and Azure Data Factory.
- Experience with Azure Synapse, Data Lake, Databricks, DBT, Stream Analytics, and EventHub.
- Expertise in creating reports and dashboards using industry-standard tools, with a preference for Power BI.
- Deep understanding of data marts, data warehouse architecture, multidimensional databases, data lakes, and data migration.
- Expert in performance tuning, query optimization, and using monitoring and troubleshooting tools like Performance Monitor and SQL Profiler.
- Knowledge of the software development lifecycle, including systems analysis, design, development, testing, and deployment.
- Familiarity with ITIL, Agile, SDLC, and DevOps methodologies.
- Strong communication skills, including facilitation, presentation, and documentation.
- Ability to capture and transform business requirements into functional and data specifications.
- High level of initiative, self-starter, flexible, proactive, and a team player.
- Willingness to participate in on-call rotations and to work with other teams on the data catalog.
- Additional Skills:
- Mainframe and DB2 experience is a plus.
- Linux/Unix experience is a plus.
- DevOps mindset, infrastructure as code, and DataOps for the data warehouse.
- A Bachelor's or Master's degree in Computer Science or a related technical discipline is required.
- Five years of relevant experience.
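As a small sketch of the performance-tuning and query-optimization work listed above, the snippet below uses SQLite's EXPLAIN QUERY PLAN to show how adding an index changes a query's plan from a full table scan to an index search. The table, data, and index names are hypothetical; on MS SQL Server the equivalent workflow would use execution plans and tools like SQL Profiler.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT, total REAL)")
cur.executemany("INSERT INTO orders (status, total) VALUES (?, ?)",
                [("open" if i % 2 else "closed", float(i)) for i in range(1000)])

query = "SELECT COUNT(*) FROM orders WHERE status = 'open'"

# Before indexing: the planner has no choice but to scan the whole table.
plan_before = cur.execute("EXPLAIN QUERY PLAN " + query).fetchall()
# The plan's detail column mentions a SCAN of the orders table.

# Add an index on the filter column, then re-check the plan.
cur.execute("CREATE INDEX idx_orders_status ON orders(status)")
plan_after = cur.execute("EXPLAIN QUERY PLAN " + query).fetchall()
# The plan's detail column now references idx_orders_status instead.

print(plan_before)
print(plan_after)
```

Confirming the plan actually changed, rather than just observing faster wall-clock times, is the reliable way to verify a tuning change took effect.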