What are the responsibilities and job description for the AWS Data Architect (W2) position at Torque Technologies LLC?
Job Details
Role: AWS Data Architect
Location: Erie, PA
Long Term
Job Description:
Top Skills Required: Data Warehouse concepts, Data Pipelines, Data Modeling, Data Lake, AWS S3, Glue, Redshift, Databricks, Snowflake
Good to have: advanced/latest data concepts such as Data Lakehouse and Data Mesh, plus data-related enterprise issues such as data governance and data security.
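For context, a minimal sketch of the kind of S3-to-data-lake pipeline these skills refer to is shown below. It is purely illustrative: the bucket names, paths, and columns are assumptions, not details from the posting.

```python
# Minimal sketch of a Glue-style PySpark pipeline: read raw CSV from an S3
# landing zone, apply a light transform, and write partitioned Parquet to a
# curated data lake zone. All names and columns are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims-pipeline").getOrCreate()

# Read raw claims data from a hypothetical landing bucket.
raw = (spark.read
       .option("header", "true")
       .csv("s3://example-landing-zone/claims/"))

# Light transform: type the claim date and derive a partition column.
curated = (raw
           .withColumn("claim_date", F.to_date("claim_date", "yyyy-MM-dd"))
           .withColumn("claim_year", F.year("claim_date")))

# Write columnar, partitioned output that downstream engines can query.
(curated.write
 .mode("overwrite")
 .partitionBy("claim_year")
 .parquet("s3://example-curated-zone/claims/"))
```

The same pattern extends to Redshift, Databricks, or Snowflake as the downstream warehouse or consumption layer.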
Key Responsibilities:
Our customers are looking to differentiate their businesses by leveraging data as a strategic asset. Your goal would be to engage customers in envisioning and building creative modern data solutions such as cloud-native data platform implementations and BI and analytics solutions.
Collaborate with customer stakeholders, IT leadership, and architecture leadership to develop modern data solutions and architecture, especially cloud-native data platform solution architecture on the AWS cloud platform
Develop technology roadmaps and best practices with precise steps for getting there, including architecture design, data modeling, and implementation of modern data platforms and analytics use cases across the analytics maturity curve.
Lead strategy sessions for Data & Analytics maturity with a focus on fulfilling business requirements around performance, scalability, flexibility, and cost
Provide insights into modern Data & Analytics technologies and new use cases such as business intelligence and predictive analytics
Lead the design, prototyping, and delivery of software solutions within the cloud data ecosystem to help develop new capabilities for the customer's business (P&C insurance)
Lead modern data solution conversations as a technology expert, helping customers see modern data solutions as an enabler for the organization to become data driven.
Assess customers' knowledge of the AWS platform and overall cloud readiness, support customers through a structured learning plan, and ensure its delivery.
Desired Skills and Experience
Should have 15 years of experience, with the last 4 years spent implementing cloud-native data solutions for a variety of data consumption needs such as modern data warehouses, BI, insights, and analytics
Should have experience architecting and implementing end-to-end modern data solutions using AWS and advanced data processing frameworks such as Databricks
Strong knowledge of cloud-native data platform architectures, data engineering, and data management
Good knowledge of popular database and data warehouse technologies from Snowflake and AWS
Demonstrated knowledge of data warehouse concepts and a strong understanding of cloud-native databases and columnar database architectures
Ability to work with data engineering, data management, and BI and analytics teams in a complex IT development environment.
Good appreciation of, and at least one implementation experience with, data engineering processing substrates such as ETL tools, Confluent Kafka, and ELT techniques (a minimal illustrative sketch follows this list)
Exposure to a variety of NoSQL databases (at a minimum, key-value stores and/or document stores) and appliances; able to cite implementation experience, constraints, and performance challenges encountered in practice.
Preferred (nice to have): implementing analytic models using AWS SageMaker for production workloads.
Data Mesh and data product design and implementation knowledge will be an added advantage.
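As referenced in the processing-substrates item above, a minimal ELT ingest sketch is shown below: events are consumed from a hypothetical Confluent Kafka topic and landed unchanged in S3, with transformation deferred to the warehouse. The broker address, topic, bucket, and key prefix are all assumptions.

```python
# Minimal ELT ingest sketch: consume raw events from a (hypothetical)
# Confluent Kafka topic and land them unchanged in S3 for later in-warehouse
# transformation. Broker, topic, and bucket names are illustrative only.
import json
import uuid

import boto3
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",   # assumed broker address
    "group.id": "elt-landing-writer",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["policy-events"])         # assumed topic name

s3 = boto3.client("s3")

try:
    batch = []
    while len(batch) < 100:                   # tiny demo batch size
        msg = consumer.poll(1.0)
        if msg is None:
            break
        if msg.error():
            continue
        batch.append(json.loads(msg.value()))

    if batch:
        # Land the raw batch as JSON Lines; transforms happen later (ELT).
        body = "\n".join(json.dumps(rec) for rec in batch)
        s3.put_object(
            Bucket="example-landing-zone",    # assumed bucket
            Key=f"raw/policy-events/{uuid.uuid4()}.jsonl",
            Body=body.encode("utf-8"),
        )
finally:
    consumer.close()
```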