What are the responsibilities and job description for the Assistant Vice President position at SG Analytics?
About the Role
We are looking for a Technical Leader for our Data Engineering practice to join our onsite team based in New York and drive data engineering solutions in the media industry. The ideal candidate will have over 10 years of account management and delivery leadership experience and will bring deep expertise in data engineering using Python, PySpark, Databricks, Snowflake, and AWS, while being comfortable working with Nielsen and social media data. Exposure to Retool for building internal tools is a plus. The candidate should have hands-on experience designing scalable data pipelines, handling large-scale media datasets, and optimizing data workflows to support analytics, reporting, and machine learning.
Key Responsibilities
- Manage a large account with delivery and revenue responsibilities.
- Lead a team of approximately 15 talented data engineering professionals in a hybrid (onshore-offshore) model.
- Provide thought leadership in data engineering for the practice.
- Be hands-on: Design, develop, and optimize scalable data pipelines using Python, PySpark, and Databricks (a minimal illustrative sketch follows this list).
- Work with Nielsen, social media, and streaming data to generate insights for media performance and audience analytics.
- Implement ETL/ELT processes in Snowflake and the AWS data ecosystem.
- Manage AWS-based data architectures, including S3, Glue, Lambda, Athena, and Redshift.
- Work closely with data scientists, analysts, and business teams to ensure data availability and usability.
- Develop data models, warehouses, and marts optimized for analytics and business intelligence.
- Leverage Databricks for distributed data processing and performance tuning.
- Implement best practices for data governance, security, and compliance in media data processing.
- Utilize Retool (introductory knowledge required) to create internal tools for operational efficiency.
- Troubleshoot data quality issues and performance bottlenecks, and provide optimizations.
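For illustration only, the pipeline responsibilities above might look something like the minimal PySpark sketch below. The bucket paths, column names, and aggregation logic are hypothetical placeholders rather than part of any actual SG Analytics system, and the Delta write assumes a Databricks/Delta Lake runtime.

```python
# Minimal batch pipeline sketch (hypothetical paths, columns, and grain).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("media-ratings-pipeline").getOrCreate()

# Read raw ratings exports landed in S3 (placeholder bucket and schema).
raw = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("s3://example-media-bucket/raw/ratings/")
)

# Basic cleaning: typed date column, drop rows missing key identifiers.
clean = (
    raw
    .withColumn("air_date", F.to_date("air_date", "yyyy-MM-dd"))
    .dropna(subset=["program_id", "air_date"])
)

# Aggregate to program/day grain for downstream analytics and reporting.
daily = (
    clean
    .groupBy("program_id", "air_date")
    .agg(
        F.sum("impressions").alias("total_impressions"),
        F.avg("rating").alias("avg_rating"),
    )
)

# Write a partitioned Delta table (assumes a Databricks/Delta Lake runtime).
(
    daily.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("air_date")
    .save("s3://example-media-bucket/curated/daily_program_ratings/")
)
```

In practice, a pipeline of this kind would typically be parameterized, scheduled (for example via Databricks Jobs or AWS Step Functions), and backed by data quality checks, in line with the governance and troubleshooting responsibilities listed above.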
Required Skills & Experience
- 12 years of experience in data engineering with a focus on big data and cloud technologies.
- Strong proficiency in Python, PySpark, and SQL.
- Hands-on experience with Databricks, including optimizing Apache Spark workloads.
- Experience working with media datasets, particularly Nielsen ratings, social media engagement, and digital content analytics.
- Expertise in Snowflake, including data modeling, Snowpipe, and performance optimization.
- Solid experience in AWS cloud services (S3, Glue, Lambda, Redshift, Athena, Step Functions, etc.).
- Understanding of streaming data processing (Kafka, Kinesis, or similar) is a plus (see the streaming sketch after this list).
- Familiarity with Retool for internal tool development.
- Strong problem-solving skills with a proactive approach to identifying and resolving data challenges.
- Excellent communication skills, with the ability to work cross-functionally.
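As a rough illustration of the streaming skill mentioned above, the sketch below reads engagement events from Kafka with Spark Structured Streaming and writes windowed counts to a Delta table. The broker address, topic name, event schema, and storage paths are hypothetical, and the Kafka source assumes the spark-sql-kafka connector is available on the cluster.

```python
# Minimal Structured Streaming sketch for social engagement events
# (hypothetical topic, schema, and paths; requires the spark-sql-kafka package).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, LongType

spark = SparkSession.builder.appName("engagement-stream").getOrCreate()

# Schema of incoming JSON engagement events (illustrative only).
event_schema = StructType([
    StructField("content_id", StringType()),
    StructField("platform", StringType()),
    StructField("event_type", StringType()),
    StructField("event_ts", LongType()),  # epoch milliseconds
])

# Read raw events from a Kafka topic (placeholder brokers and topic name).
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "social-engagement-events")
    .load()
    .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

# Count events per content item and platform over 5-minute windows.
counts = (
    events
    .withColumn("event_time", (F.col("event_ts") / 1000).cast("timestamp"))
    .withWatermark("event_time", "10 minutes")
    .groupBy(F.window("event_time", "5 minutes"), "content_id", "platform")
    .count()
)

# Stream the aggregates into a Delta table (assumes Databricks/Delta Lake).
query = (
    counts.writeStream
    .format("delta")
    .outputMode("append")
    .option("checkpointLocation", "s3://example-media-bucket/checkpoints/engagement/")
    .start("s3://example-media-bucket/curated/engagement_counts/")
)
```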
Preferred Qualifications
- Experience with media audience measurement, ad performance, or content engagement analytics.
- Knowledge of machine learning integration in data pipelines.
- Exposure to data visualization tools such as Tableau, Power BI, or Mode Analytics.
- Experience working in Agile/Scrum environments.