What are the responsibilities and job description for the Sr. Data Engineer/Architect with strong ETL, Informatica MDM, DBT & Snowflake experience position at Urbench, LLC?
Job Details
Sr. Data Engineer/Architect with strong ETL, Informatica MDM, DBT & Snowflake experience
Primary Responsibility:
Client is searching for a hands-on, business-focused technology lead. The ideal candidate will have functional knowledge of Sales and Marketing processes and of core data engineering technologies such as databases, BI, ETL, and AI tools. The individual will work closely and partner with global sales enablement, analytics, client management, operations, and finance teams to deliver on projects that provide the capabilities defined in the target operating model.
- Setting up and maintaining Semarchy MDM, or hands-on experience with another MDM product (there are some 25 tools in this space, such as Informatica MDM, IBM MDM, Reltio MDM, and Profisee MDM). (Mandatory)
- Strong database management skills: Snowflake, SQL, and DBT. (Mandatory)
- Strong cloud infrastructure skills, e.g., ADLS and ADF.
Sr. Data Architect/Lead Data Engineer (15 Years)
Location: New York City, NY (3 days onsite, 2 days remote)
Primary Responsibility:
Expertise in relational and dimensional data modeling.
Designing and developing efficient data ingestion processes, data models, and data pipelines on Snowflake, Redshift, or BigQuery.
Collaborating extensively with clients to gain a deep understanding of their data requirements and translating them into robust technical solutions.
Implementing end-to-end ETL/ELT workflows to extract, transform, and load data into Snowflake, Redshift, or BigQuery.
Optimizing data pipelines to achieve exceptional performance, scalability, and reliability.
Conducting thorough data quality assessments and implementing effective data governance best practices.
Monitoring and troubleshooting data integration processes to ensure the utmost accuracy and integrity of data.
QUALIFICATIONS AND EXPERIENCE: (Academic, Professional, Experience)
Excellent oral and written communication skills and problem-solving skills are required.
The candidate should be comfortable working in a fast-paced environment and able to help build APIs and calculators on cutting-edge cloud and big data technologies such as Snowflake, Fivetran, DBT, and Python.
Top-notch lead engineering talent with 8 years of experience building and managing data-related solutions, preferably in the financial industry.
Experience using ETL/ELT tools and technologies such as Azure Data Factory and SSIS is a plus.
Embrace data platform thinking; design and develop data pipelines with security, scale, uptime, and reliability in mind.