What are the responsibilities and job description for the Senior Data Product Analyst position at LPL Financial?
Job Overview :
As AVP, Data Product Analyst, you will have the opportunity to shape LPL’s strategic data foundation for the future. In this role, you will connect Business product owners with Technology delivery teams, using data to power applications and experiences for our Advisors and Investors. You will be accountable for multiple domains of data that power products, including Advisor and Investor platforms, performance reporting, supervision, account lifecycle management, billing, compensation, trading, money movement, proposal generation, CRM, BI reporting, and institutional data distribution.
Responsibilities :
Author both operational and analytical data quality rules for the respective data and business domains. Document the rules and partner with technology teams and engineers to have the rules built into data pipelines, APIs, a commercial data quality framework, etc.
Develop data profiling, quality, and alerting capabilities leveraging database and ETL tooling (DBT, Airflow, SQL, etc.) or commercial data management tools (Collibra, Alation, Atlan, AWS DataZone, data.world, etc.); an illustrative SQL rule sketch appears after this list.
Develop and document standardized processes and workflows for remediating data issues by working with Relationship Management, Servicing, and Operations teams. Assist with data issue research, impact analysis, and fixes by partnering with support and technology teams.
Translate business product, application, and capability requirements for Advisors and Investors into data needs and solutions. These include business logic and transformation specs for data pipelines, inputs and business context for data modeling, and logic embedded in consuming APIs.
Utilize homegrown solutions, commercial data governance tools, or Excel to build and document data lineage views of data transformations across the data ecosystem from source to target. Maintain data lineage artifacts and leverage them for data research and remediation efforts. Work with enterprise data architects to document data flows in conceptual and reference architecture diagrams for accountable data and business domains.
Use commercial governance tools or manual methods to capture, document, and maintain data dictionaries at the attribute level for accountable domains. Partner with the enterprise data modeling team to ensure consistency of data definitions, classifications, taxonomies, hierarchies, etc. across data assets and consuming applications.
Work with data product peers to identify inconsistencies in data and standardize data definitions, code sets, naming conventions, classifications, consumption contracts, and taxonomies.
Define and configure workflows, metadata structures, data quality rules, and classification schemes based on defined requirements.
Author and enter Feature and Story-level work in Jira for technology teams to execute against.
Work with data domain teams and the enterprise modeling team to understand both the domain-level data models and the enterprise data model, and determine whether they are fit for purpose for new project intake or whether changes are required for the data domain. Provide scope and level of complexity if development work is needed. Document drift and gaps between the enterprise logical data model and domain-level physical data models.
Represent the data perspective and participate in QE and end-to-end testing of data products.
Populate and enrich the business glossary or data dictionary, and support the resolution of data quality issues.
Leverage software, applications, and tooling to construct, monitor, govern, and maintain new data products for the company. Data products will be developed on both the producing and consuming sides of the organization.
Manage the data sharing and data delivery agreements as defined by the Data Design Authority, business stakeholders, and data modeling and engineering teams.
Accountable for partnering with Engineering on the implementation and execution of the data quality and privacy rules.
Ensure alignment of data schemas, models, contracts, and templates from ingestion to the enterprise data model, to system-of-record domains, and to consumption layers. Align and govern physical models to logical data models.
Partner with the Risk organization for data archiving, retention, privacy, and data destruction policies and requirements for critical data elements.
Manage business metadata across accountable data domains and data products.
Onboard, govern, and certify new CDEs (critical data elements) introduced into the data ecosystem.
Perform technical traceability mapping and data lineage tracking.
Validate and certify new data sources.
Partner with data domain (SOR) teams and business stakeholders to author and implement data quality rules and standards.
Provide operational oversight of data quality rule execution in partnership with Engineering and Operations.
Partner with a variety of internal support and Operations teams to build data quality scorecards and data exception reports (a sample scorecard query appears after this list), and ultimately assist teams with data enrichment and remediation when required.
Track data issues and risks and develop associated mitigation strategies for corresponding business domains.
Update central metadata repository with business metadata content including business terms, definitions, classifications, and data quality rules.
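As an illustration of the data quality rules described above, here is a minimal SQL sketch written in the dbt singular-test style, where a rule fails if the query returns any rows. The accounts and advisors tables and their columns are hypothetical examples, not actual LPL schemas.

    -- Hypothetical rule: every open account must carry a non-null open date
    -- and reference an active advisor. Any rows returned are rule violations.
    select a.account_id
    from accounts a
    left join advisors adv
      on a.advisor_id = adv.advisor_id
    where a.status = 'OPEN'
      and (a.open_date is null
           or adv.advisor_id is null
           or adv.status <> 'ACTIVE');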
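The data quality scorecards and exception reports mentioned above can likewise be expressed as aggregations over rule results. The sketch below assumes a hypothetical dq_rule_results table populated by the pipeline with one row per record checked; the table and column names are illustrative only.

    -- Hypothetical scorecard: pass rate per rule and data domain for today's run.
    select
        data_domain,
        rule_name,
        count(*) as records_checked,
        sum(case when passed then 1 else 0 end) as records_passed,
        round(100.0 * sum(case when passed then 1 else 0 end) / count(*), 2) as pass_rate_pct
    from dq_rule_results
    where run_date = current_date
    group by data_domain, rule_name
    order by pass_rate_pct asc;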
What are we looking for?
We want strong collaborators who can deliver a world-class client experience. We are looking for people who thrive in a fast-paced environment, are client-focused and team-oriented, and can execute in a way that encourages creativity and continuous improvement.
Requirements :
5 years’ experience in data warehouse design, data management, data governance, business intelligence and analytics, or similar.
5 years of expert-level experience reading and writing SQL.
Proven experience in individual contributor-level data management roles like data analyst, data modeler, data pipeline engineer, BI analyst, or developer.
Solid knowledge base of data governance, including data quality rule coding and implementation, data catalogs, quality scorecard development, data lineage, master data management, and reference data management.
3 years’ experience with business intelligence tools and report development in tools like Tableau, Microsoft Power BI, Domo, Qlik, Sisense, etc.
Experience using (as a data steward) or implementing (as a technologist) data governance tools like Collibra, AWS DataZone, Alation, Atlan, data.world, etc.
Core Competencies :
Hands-on data analysis, source-to-target mapping, data lineage tracing, remediation, etc.
Experience coding or reading and understanding data transformations in pipelines.
Experience designing and implementing a variety of data architecture patterns.
Experience using and developing in a variety of relational databases and cloud data platforms like AWS, Databricks, Snowflake, Redshift, etc.
Preferences :
Wealth management and wealth advisory services industry experience.
Prior people leadership experience.
Prior experience in a data product management role.
Experience with Agile and lean delivery methods and associated tools like Confluence, Jira, etc.
Solid understanding of relational, star, and snowflake data modeling.