What are the responsibilities and job description for the Data Engineer position at VDart Inc?
Senior Data Engineer
East Windsor, NJ
Short-Term Contract-to-Hire
Key Responsibilities:
- Work with our business partners to develop data mapping from one system to another. The candidate will be expected to manage data mapping from a legacy system to a new system and across 20 enterprise systems with overlapping datasets, and will need strong attention to detail and the ability to manage multiple projects and data streams.
- AmSpec is in the process of onboarding several enterprise software systems across its functions. As we migrate to these new systems, hybrid data feeds must be managed for consistency and continuity.
- Manage the data engineering lifecycle, including research, proofs of concept, architecture, design, development, testing, deployment, and maintenance, using tools such as Snowflake, DBT, and Python.
- Plan, design, and implement frameworks and systems for the data warehouse; create data models, data marts, and ELT/ETL processes.
- Build a modern analytics stack supporting a variety of cloud-based business systems for potential clients using Snowflake, ADF (Azure Data Factory), and Power BI.
- Implement data governance frameworks within Snowflake/DBT, including data quality checks, lineage, and metadata management
- Design solutions to support real-time or near-real-time processing for data transactions and monitoring, enabling timely insights. Optimize data warehouse models and servers to ensure top-level performance and scalability.
- Communicate and demonstrate a clear understanding of client business needs, goals, and objectives
- Optimize Snowflake performance, including query tuning and storage optimization, employing Snowflake/DBT best practices.
- Develop and maintain ETL processes to integrate data from various sources.
- Monitor and troubleshoot data issues and implement solutions.
- Create data marts/views to support dashboards, reports, and other views for corporate KPIs and operational monitoring of the business (revenue, cost, efficiency, pricing, and so on).
- Research and develop POCs to demonstrate business capabilities using Snowflake.
- Develop predictive analytics and ML-based modeling over the long term to drive forward-looking business decision-making.
- Partner with IT teams to align data solutions with enterprise governance standards.
- Perform thorough testing and validation of data integration processes to ensure data integrity and accuracy (a minimal sketch of this kind of check appears after this list).
- Work closely with the IT team and stakeholders to understand data requirements and deliver integrated solutions that meet our business needs.
- Create and maintain documentation of data integration processes, configurations, and best practices.
- Support data integrations through our data warehouse for ERP, Finance, HR, Ops, IT, and other enterprise systems.
- Assume overall responsibility for enterprise data integration architecture and governance.
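To give a concrete sense of the validation and reconciliation work described above, here is a minimal Python sketch of the kind of data quality check run when migrating a legacy table into Snowflake. It uses the Snowflake Python connector; the table names, column names, warehouse, and database are hypothetical placeholders, not part of any actual AmSpec environment.

```python
# Minimal sketch: reconciling a legacy table against its migrated Snowflake copy.
# All table/column names and connection settings below are hypothetical examples.
import os

import snowflake.connector  # pip install snowflake-connector-python


def row_count(cur, table: str) -> int:
    """Return the total row count for a table."""
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    return cur.fetchone()[0]


def null_count(cur, table: str, column: str) -> int:
    """Return how many rows have a NULL in a required column."""
    cur.execute(f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL")
    return cur.fetchone()[0]


def main() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",   # hypothetical warehouse
        database="ENTERPRISE_DW",   # hypothetical database
        schema="STAGING",
    )
    try:
        cur = conn.cursor()
        legacy, migrated = "LEGACY_ORDERS", "ERP_ORDERS"  # hypothetical tables
        checks = {
            "row_count_match": row_count(cur, legacy) == row_count(cur, migrated),
            "no_null_order_ids": null_count(cur, migrated, "ORDER_ID") == 0,
        }
        failed = [name for name, passed in checks.items() if not passed]
        if failed:
            raise SystemExit(f"Data validation failed: {failed}")
        print("All reconciliation checks passed.")
    finally:
        conn.close()


if __name__ == "__main__":
    main()
```

In practice, checks like these would more likely be expressed as DBT tests or scheduled jobs that run automatically after each load, rather than ad hoc scripts; the sketch is only meant to illustrate the type of validation involved.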
Experience / Skills:
- Strong expertise in Snowflake architecture, DBT (or similar SQL-based transformation tool), SQL, and data modeling
- Minimum 5 years of experience programming in languages that support data integration (Python).
- Experience working with Azure (or similar) cloud infrastructure to manage data storage and ensure security compliance.
- Extensive understanding of data warehouse infrastructure design, deployment, and management.
- Attention to detail and ability to detect and correct issues in complicated code.
- Experience with web services and SOA architectures.
- Data Architecture, Data Engineering, Data Analysis and Profiling
- Power BI
- Azure cloud platforms.
- SQL, Python, PySpark and Linux shell scripting.
- Data modeling, schema design, database concepts.
Salary: $117,000 - $140,500