Description
Job Summary
The Senior Data Engineer is a technical leader and hands-on developer responsible for designing, building, and optimizing data pipelines and infrastructure to support analytics and reporting. This role will serve as the lead developer on strategic data initiatives, ensuring scalable, high-performance solutions are delivered effectively and efficiently.
The ideal candidate is self-directed, thrives in a fast-paced project environment, and is comfortable making technical decisions and architectural recommendations. They will work closely with cross-functional teams, including business stakeholders, data analysts, and engineering teams, to develop data solutions that align with enterprise strategies and business goals.
Experience in the financial industry is a plus, particularly in designing secure and compliant data solutions.
Essential Functions
Project Leadership & Development
- Serve as the lead developer on strategic data projects, driving technical decision-making and execution.
- Own the end-to-end development lifecycle of data pipelines, from architecture and design to deployment and optimization.
- Independently identify and resolve technical challenges, ensuring solutions are scalable, secure, and efficient.
- Work with cross-functional teams to define technical requirements and ensure alignment with business objectives.
- Lead code reviews, enforce best practices, and provide mentorship to junior engineers.
Data Engineering & Architecture
- Design, build, and maintain scalable ETL/ELT pipelines for structured and unstructured data.
- Optimize data storage, retrieval, and processing for performance, security, and cost-efficiency.
- Automate manual data processes and improve operational efficiency.
- Ensure data integrity and governance by implementing robust validation, monitoring, and compliance processes.
- Design and implement data solutions that comply with financial industry regulations (if applicable).
Self-Direction & Strategic Impact
- Take ownership of projects, proactively identifying opportunities for innovation and process improvement.
- Drive the adoption of modern data engineering practices and tools.
- Continuously assess emerging technologies to recommend enhancements to existing systems.
- Work autonomously while collaborating effectively with business and technical stakeholders.
Data Governance & Security
- Enforce data security, compliance, and governance best practices.
- Ensure solutions meet regulatory and financial industry compliance standards (if applicable).
- Maintain clear documentation for data architectures, transformations, and workflows.
Additional Responsibilities
- Coach and mentor junior engineers, fostering a collaborative and innovative team environment.
- Communicate effectively with stakeholders regarding project status, risks, and delivery timelines.
- Lead technical discussions, architectural reviews, and proof-of-concept initiatives.
Requirements
Qualifications
Required:
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
- 10 years of experience in data engineering, data modeling, and cloud technologies.
- Proven experience as a lead developer in a project setting, with a track record of delivering complex data solutions.
- Expertise in Azure-based data technologies, including:
  - Databricks
  - Azure Data Factory
  - SQL (T-SQL, stored procedures, query optimization)
  - Python for data processing and automation
  - Power BI for data visualization and reporting
- Self-motivated and able to work independently, making architectural and technical decisions as needed.
- Strong understanding of Agile and Waterfall methodologies.
- Excellent problem-solving, analytical, and communication skills.
- Strong customer service mindset, treating data consumers as internal customers.
Preferred:
- Experience leading distributed teams and interfacing with senior leadership.
- Financial industry experience, particularly in regulatory-compliant data processing.
- Proficiency in R and Microsoft Office Suite.
- Familiarity with CI/CD pipelines for data engineering workflows.
- Experience implementing big data processing frameworks, data lakes, and real-time streaming solutions.