What are the responsibilities and job description for the Data Engineer position at iMPact Business Group?
Our client is a custom software development company out of Grand Rapids, Michigan that exists to make software that humans actually like to use.
Type : Contract-to-hire position (contract portion 3-6 months)
Location : Hybrid work-week in Grand Rapids, MI (3 days per week)
Position Overview :
Our client is looking for a Data Engineer who is excited to help build the future of data infrastructure. You'll participate in designing and optimizing high-performance data pipelines, creating scalable and responsive data infrastructures that help organizations improve their business intelligence capabilities and prepare for the AI-powered world of tomorrow.
You will have the opportunity to work with the latest advancements in Microsoft Fabric, Azure Data Lake, Databricks, Snowflake, and other industry-leading tools.
This role is ideal for a mid-to-senior level engineer who thrives in a fast-paced, innovative environment and wants to push the boundaries of big data, AI integration, and cloud engineering. You'll collaborate with AI experts, analysts, and business leaders to create intelligent, scalable, and automated data solutions.
Career Growth Opportunity : As you develop your technical and leadership skills, you'll have opportunities to mentor junior engineers, take ownership of key projects, and, if desired, progress into a leadership role.
What You'll Bring :
Required Skills & Experience
SQL Proficiency : 3 years of hands-on experience with advanced SQL for data analysis, modeling, and engineering.
Data Pipeline Development : Experience with ETL/ELT tools and modern data engineering frameworks.
Medallion Architecture Knowledge : Understanding of Bronze, Silver, and Gold data layers for structured data transformation.
Cloud Data Engineering : Hands-on experience with Microsoft Fabric, Databricks, Azure Data Lake, or Snowflake.
Database Management : Experience with SQL Server, PostgreSQL, SAP HANA, Progress, Oracle, Azure SQL, or other relational databases.
Programming Skills : Proficiency in Python, C#, or Java for data transformations and automation.
DevOps & Infrastructure : Experience with CI/CD pipelines, Git, or Azure DevOps.
Cloud Platforms : Working knowledge of Azure, AWS, or Google Cloud services.
Bonus Points for :
Microsoft Data & AI certifications.
Experience with BI tools like Power BI, Tableau, or Looker.
Interest in ML model deployment and AI-driven analytics.
Understanding of Azure and AWS networking concepts.
Knowledge of data governance frameworks and compliance standards (GDPR, HIPAA, SOC 2).
Salary : $60 - $65