What are the responsibilities and job description for the Big Data Modeler (Data Vault 2.0 Expertise Required) position at Kaav Inc.?
Job Details
We're Hiring! | High-Priority Role: Big Data Modeler (Data Vault 2.0 Expertise Required)
Job Title: Big Data Modeler (Data Vault 2.0 Expertise Required)
Location: Las Vegas, NV (Onsite)
Experience Required: 12-15 Years
Start Date: Immediate / Urgent Requirement
Domain Experience Preferred: Financial Services
About the Role
We are actively seeking a Senior Big Data Modeler with strong, hands-on experience in Data Vault 2.0 to support the data strategy of a key client. This is a mission-critical role focused on building scalable, cloud-native data architecture using modern platforms and modeling practices.
Top Primary Skills (Must-Have)
- Data Vault 2.0: Proven experience in real-world implementations (a minimal sketch follows this list)
- Data Modeling: Strong in relational, dimensional, star-schema, snowflake-schema, and normalized/denormalized models
- Cloud Data Platforms: Snowflake, Hive, Redshift
- Data Lakehouse: Experience with lakehouse architecture and tools such as S3, Hive, Trino, and HUE
- Big Data Ecosystem: Strong knowledge of Hadoop/Hive/S3 integration and schema design
- Data Governance & Quality: Understanding of MDM, Data Dictionary, and Data Mapping
- Data Modeling Tools: ERwin, ER/Studio, or Toad Data Modeler
- SQL Expertise: Ability to work with complex datasets across SQL Server, Oracle, and DB2
- Excellent Communication: Comfortable working with cross-functional stakeholders
- Agile & Collaboration: Experience working on agile teams with global team alignment (including Pacific Time overlap)
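For candidates less familiar with the pattern, the sketch below shows the core Data Vault 2.0 building blocks: a hub holding business keys and a satellite holding time-variant descriptive attributes (links join two or more hubs the same way). It is a minimal illustration only; the table and column names (hub_customer, sat_customer_details) are hypothetical, and SQLite stands in for the Snowflake/Hive/Redshift platforms named above.

```python
# Minimal, illustrative Data Vault 2.0 sketch (hypothetical names; SQLite stand-in).
import hashlib
import sqlite3
from datetime import datetime, timezone

def hash_key(*parts: str) -> str:
    # DV 2.0 commonly derives hash keys from business keys (MD5/SHA are typical choices).
    return hashlib.md5("||".join(parts).encode("utf-8")).hexdigest()

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Hub: one row per unique business key.
CREATE TABLE hub_customer (
    customer_hk   TEXT PRIMARY KEY,      -- hash of the business key
    customer_id   TEXT NOT NULL UNIQUE,  -- source business key
    load_dts      TEXT NOT NULL,
    record_source TEXT NOT NULL
);
-- Satellite: descriptive, time-variant attributes hanging off the hub.
CREATE TABLE sat_customer_details (
    customer_hk   TEXT NOT NULL REFERENCES hub_customer(customer_hk),
    load_dts      TEXT NOT NULL,
    name          TEXT,
    segment       TEXT,
    hash_diff     TEXT NOT NULL,         -- change-detection hash of the attributes
    record_source TEXT NOT NULL,
    PRIMARY KEY (customer_hk, load_dts)
);
""")

now = datetime.now(timezone.utc).isoformat()
hk = hash_key("CUST-001")
conn.execute("INSERT INTO hub_customer VALUES (?, ?, ?, ?)",
             (hk, "CUST-001", now, "crm_feed"))
conn.execute("INSERT INTO sat_customer_details VALUES (?, ?, ?, ?, ?, ?)",
             (hk, now, "Acme Corp", "enterprise",
              hash_key("Acme Corp", "enterprise"), "crm_feed"))
conn.commit()
print(conn.execute("SELECT customer_id, name FROM hub_customer "
                   "JOIN sat_customer_details USING (customer_hk)").fetchall())
```

The hash keys and load timestamps are what make Data Vault loads insert-only and parallelizable, which is one reason the pattern suits the cloud-native environments this role targets.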
Key Responsibilities
- Design enterprise-scale data models (100+ entities) for both on-prem and cloud environments
- Lead implementation of Data Vault 2.0 in cloud-native data environments
- Collaborate with business analysts, product owners, DBAs, and data engineers to translate business requirements into data models
- Define and implement data governance, data quality, and data lineage strategies
- Work with modern big data stacks: Hive, S3, Trino, HUE, Snowflake
- Develop logical and physical data models using tools like ERwin, ER/Studio, or Toad
- Apply industry best practices, including Bill Inmon and Ralph Kimball methodologies (a Kimball-style star-schema sketch follows this list)
- Participate in agile ceremonies and maintain at least 2 hours of daily overlap with the Pacific Time Zone
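As a companion to the Data Vault sketch above, here is an equally minimal, hypothetical Kimball-style star schema: descriptive context lives in dimensions, measures live in a fact table at a declared grain. The names (dim_date, dim_customer, fact_payment) are illustrative only, and SQLite is again used purely for portability.

```python
# Hypothetical Kimball-style star schema: one fact table, conformed dimensions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (
    date_key   INTEGER PRIMARY KEY,    -- smart key, e.g. 20250131
    full_date  TEXT NOT NULL,
    fiscal_qtr TEXT NOT NULL
);
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,  -- surrogate key
    customer_id  TEXT NOT NULL,        -- natural/business key from the source
    segment      TEXT
);
-- Grain: one row per payment.
CREATE TABLE fact_payment (
    date_key     INTEGER NOT NULL REFERENCES dim_date(date_key),
    customer_key INTEGER NOT NULL REFERENCES dim_customer(customer_key),
    amount_usd   REAL NOT NULL
);
""")
conn.execute("INSERT INTO dim_date VALUES (20250131, '2025-01-31', 'FY25-Q4')")
conn.execute("INSERT INTO dim_customer VALUES (1, 'CUST-001', 'enterprise')")
conn.execute("INSERT INTO fact_payment VALUES (20250131, 1, 125.50)")

# Typical star-schema query: aggregate measures, sliced by dimension attributes.
print(conn.execute("""
    SELECT d.fiscal_qtr, c.segment, SUM(f.amount_usd)
    FROM fact_payment f
    JOIN dim_date d USING (date_key)
    JOIN dim_customer c USING (customer_key)
    GROUP BY d.fiscal_qtr, c.segment
""").fetchall())
```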
Qualifications
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
- Minimum of 12 years in data architecture and modeling
- Financial services domain experience is a strong plus
- Strong documentation and presentation skills
- Ability to lead and mentor junior data engineers and analysts
Thanks & Best Regards,
Sivaji Katta