What are the responsibilities and job description for the Team Lead/PM position at UsefulBI Corporation?
Job Title: Team Lead / Technical PM
Location: Raleigh, North Carolina
Experience Required: 7 years
Job Type: Hybrid (3 days per week in office)
About the Role
We are looking for a Team Lead / Technical PM (a lead data engineering manager) who can wear multiple hats: technical lead, project manager, and stakeholder liaison for data engineering initiatives. This is a hybrid role based in Raleigh, NC, suited to someone who thrives in fast-paced environments, takes ownership, and can lead by example both technically and strategically.
The ideal candidate has hands-on experience in AWS, SQL, Python, PySpark, and Databricks, and can step into the codebase when needed, while also leading a team, managing project delivery, and collaborating closely with internal and external stakeholders.
Key Responsibilities
- Lead and manage a team of data engineers working on complex data projects.
- Serve as the technical point of contact, overseeing project timelines, scope, and deliverables.
- Build, maintain, and optimize scalable data pipelines and architectures using AWS, Databricks, Python, SQL, and PySpark (see the illustrative sketch after this list).
- Act as a bridge between technical teams, product managers, and business stakeholders, ensuring clarity of goals and alignment on priorities.
- Step in and contribute to coding and troubleshooting when needed to unblock teams or drive critical features.
- Ensure robust documentation, knowledge sharing, and process improvement practices are in place.
- Drive project planning, sprint management, and progress tracking using Agile methodologies.
- Own communication with internal leadership and external clients, ensuring expectations are managed and updates are clear.
- Promote best practices in data governance, quality, and operational excellence.
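To make the hands-on expectation above concrete, here is a minimal PySpark sketch of the kind of pipeline work this role oversees on AWS/Databricks. It is an illustration only; the bucket paths, column names, and aggregation are assumptions made for the example, not details from this posting.

```python
# Hypothetical sketch of a daily aggregation pipeline; all paths, tables,
# and columns are illustrative assumptions, not UsefulBI specifics.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_aggregate").getOrCreate()

# Read raw order events from an S3 landing zone (hypothetical bucket/prefix).
orders = spark.read.parquet("s3://example-landing/orders/")

# Basic cleanup plus a daily revenue aggregate per customer.
daily_revenue = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .withColumn("order_date", F.to_date("order_timestamp"))
    .groupBy("customer_id", "order_date")
    .agg(F.sum("amount").alias("daily_revenue"))
)

# Write the curated output back to S3 as partitioned Parquet
# (or register it as a Databricks Delta table).
daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated/daily_revenue/"
)
```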
Required Qualifications
- 7 years of experience in data engineering, with a proven track record of leading technical teams.
- Proficient in AWS services (S3, EMR, Glue, Redshift, Lambda, etc.).
- Strong skills in Python, SQL, PySpark, and Databricks.
- Experience managing full lifecycle data engineering projects — from planning and design to deployment and support.
- Excellent communication and interpersonal skills — capable of engaging with both technical and non-technical stakeholders.
- Hands-on experience with ETL/ELT pipelines, CI/CD, and data orchestration tools such as Airflow (a brief orchestration sketch follows this list).
- Ability to context-switch between strategy, management, and execution.
- Comfortable working in hybrid environments and coordinating across distributed teams.
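As an illustration of the Airflow orchestration experience listed above, the sketch below shows a minimal extract-transform-load DAG. It assumes Airflow 2.4+; the DAG id, schedule, and task callables are placeholders for the example, not part of the role's actual stack.

```python
# Hypothetical Airflow DAG sketching ETL/ELT orchestration (Airflow 2.4+ API).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw data from a source system (e.g. an API or S3 drop).
    pass


def transform():
    # Placeholder: trigger the PySpark/Databricks transformation job.
    pass


def load():
    # Placeholder: publish curated tables to Redshift or a Delta layer.
    pass


with DAG(
    dag_id="example_etl_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Simple linear dependency: extract -> transform -> load.
    extract_task >> transform_task >> load_task
```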
Nice to Have
- Experience working with clients directly in a consulting or delivery capacity.
- Knowledge of Infrastructure as Code (Terraform, CloudFormation).
- Familiarity with data security, compliance, and governance frameworks.
- Exposure to REST APIs and third-party system integrations.