What are the responsibilities and job description for the Job Big Data Solutions Architect at Charlotte, NC position at Lorven technologies?
Job Description
Role: Big Data Solutions Architect
Location: Charlotte, NC (3 days onsite)
Contract Job on W2
Job description:
This job is responsible for innovating and leveraging technologies in the Big Data, Data Analytics, and Data Warehousing realms. Key responsibilities include delivering the architecture and usage patterns across a wide variety of solutions, and partnering with other analytics teams, security teams, and tenants to support a diverse set of delivery and access patterns. We strive to build fault-tolerant design patterns that support zero to minimal downtime for distributed platforms, complex compute environments, and Cloud technologies, in pursuit of creating best-in-class big data ecosystems. The architect must be able to work with end users, stakeholders, executives, system administrators, and vendors to help deploy those solutions from POC to Production.
Manages relationships with business and technology leaders and vendors for technical products and creates an inclusive and healthy working environment to resolve organizational impediments and blockers.
Creates the technology strategy for a respective technical domain, aligning execution with product strategy by working with Product Managers, Product team members, and other stakeholders.
Design and architect robust data warehouse solutions that support complex analytical workloads.
Lead the evaluation and selection of big data technologies, tools, and frameworks.
Develop data models, ETL processes, and data pipelines to ingest, transform, and load data efficiently.
Collaborate with cross-functional teams to gather requirements and translate them into architectural designs.
Ensure data quality, integrity, and security across all data platforms.
Optimize performance of existing data processes and warehouse architecture.
Mentor and guide junior architects and data engineers, promoting best practices in data architecture.
Stay current with industry trends and emerging technologies in big data and data warehousing.
Provide thought leadership in big data strategy and execution.
BS in Computer Science, Information Technology, or a related field.
5 years of technical solutions experience, including 3 years in relevant enterprise data platforms and processing on Hadoop, Teradata, and data transformation products.
Enterprise Business Intelligence and Analytics.
Varied data processing engines and strategies: Databricks, Snowflake, BigQuery, watsonx, Starburst/Trino
Experience in multiple programming disciplines including Java, Python, and Scripting Languages.
Deep understanding of virtualized and containerized platforms: OpenShift, Kubernetes, Docker, VMware
Understanding of storage and data replication platforms: Ceph, Cirata, Ozone, MinIO, Kudu, HDFS
Understanding of network and security principles; creating designs that adhere to security baselines
Understanding of Cloud architectures including IaaS, PaaS, SaaS, compute, storage, network, and security solutions that run on large Cloud service providers (Azure, Google, AWS).
Ability to work independently and lead a project team
Skills: Linux, Python, PostgreSQL, Ansible in a secure environment, Agile