What are the responsibilities and job description for the Data Architect (Cloud) position at InCom Technologies Inc.?
Job Details
Data Architect (Cloud) Job Posting
As a Data Architect, you will lead the creation of the strategic enterprise data architecture for Hyatt. You will partner with internal stakeholders to define the principles, standards, and guidelines regarding data flows, data aggregation, data migration, data curation, data modeling, data consumption, and data placement; provide expertise on data architecture in critical programs, data strategy, and data quality remediation activities; and validate data architecture for adherence to defined policies, standards, and guidelines, including regulatory directives.
QUALIFICATIONS:
- 6 years of experience in data engineering or related technical work, including business intelligence and analytics
- 4 years of experience architecting commercial-scale data pipelines
- Experience and comfort solving problems in an ambiguous, constantly changing environment; the tenacity to thrive in a dynamic, fast-paced setting, inspire change, and collaborate with a variety of individuals and organizational partners
- Experience designing and building scalable and robust data pipelines to enable data-driven decisions for the business
- Exposure to Amazon AWS or another cloud provider
- Experience with Business Intelligence tools such as Tableau, ThoughtSpot, PowerBI and/or Looker
- Familiarity with data warehousing platforms and data pipeline tools such as Redshift, Snowflake, SQL Server, etc.
- Passionate about programming and learning new technologies; focused on helping yourself and the team improve skills
- Effective problem-solving and analytical skills; ability to manage multiple projects and report simultaneously to different stakeholders
- Rigorous attention to detail and accuracy
- Aware of and motivated by driving business value
- Experience with large-scale enterprise applications using big data open-source solutions such as Spark, Kafka, Elasticsearch/Solr, Hadoop, and HBase
- Experience with or knowledge of basic programming and database technologies (SQL, Python, Cassandra, PostgreSQL, AWS Aurora, AWS RDS, MongoDB, Redis, Couchbase, Oracle, MySQL, Teradata)
- Bachelor's degree in Engineering, Computer Science, Statistics, Economics, Mathematics, Finance, or a related quantitative field
- An advanced CS degree is a plus
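To give a concrete sense of the "scalable and robust data pipelines" the qualifications refer to, here is a minimal extract-transform-load sketch in Python. It is purely illustrative: the table name (`bookings`), the helper names (`clean_record`, `run_pipeline`), and the sample rows are assumptions for this example, not part of any actual Hyatt system, and a production pipeline would add validation, error handling, and orchestration on top of this pattern.

```python
# Illustrative ETL sketch (assumed names and data, not a real production pipeline).
import sqlite3

def clean_record(record):
    """Transform step: trim/normalize the city name and coerce the amount to float."""
    city, amount = record
    return (city.strip().lower(), float(amount))

def run_pipeline(raw_rows, conn):
    """Minimal extract-transform-load: clean each raw row, then bulk-insert."""
    cleaned = [clean_record(r) for r in raw_rows]
    conn.execute("CREATE TABLE IF NOT EXISTS bookings (city TEXT, amount REAL)")
    conn.executemany("INSERT INTO bookings VALUES (?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

# Usage: load three messy sample rows into an in-memory database and aggregate.
conn = sqlite3.connect(":memory:")
raw = [("  Chicago ", "120.50"), ("miami", "99"), (" New York", "210.00")]
loaded = run_pipeline(raw, conn)
total = conn.execute("SELECT SUM(amount) FROM bookings").fetchone()[0]
```

The same clean-then-load shape scales up directly to the tools the posting lists: swap the transform step for a Spark job and the SQLite target for Redshift or Snowflake.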