What are the responsibilities and job description for the Data Engineer position at Gravity IT Resources?
Title: Associate Data Engineer
Location: Nashville, TN (Franklin – 5 days onsite)
Job Type: FTE
You must be legally authorized to work in the U.S. now and in the future without sponsorship. Unfortunately, we are unable to work with candidates who require visa sponsorship or work authorization transfers.
Summary:
The Data Engineer plays a critical role in building data products and supporting Our Customer’s BI Team. This role is responsible for designing and developing solutions utilizing various ETL technologies outlined in the qualifications below. Job duties include implementing architected ETLs, assembling data from various sources, and developing operational data solutions. This is a collaborative position that will partner with both technical and non-technical colleagues to support the growth strategy.
This job requires in-person presence in our Nashville, TN (Franklin) office location (5 days onsite).
Primary Responsibilities:
- Develop, maintain and co-design ETL solutions
- Extract and transform data from disparate sources into internal formats for loading into our platform
- Follow application standards for development, documentation, and deployment
- Review existing data infrastructure to identify gaps and implement solutions
- Troubleshoot production support issues and restore service per established SLAs
- Develop plans to integrate the current systems with desired future data models or solutions
- Collaborate with colleagues in the delivery of data solutions and code review
- Consult with subject matter experts to document and address data quality issues
- Break down abstract requirements into smaller components, patterns, views, and features
- Communicate technical concepts to non-technical audiences, and business concepts to technical audiences
- Collaborate with other team members in the execution of large-scale projects
- Perform other duties and responsibilities as required, assigned, or requested
- Design complex ETL/ELT workflows for data lakes and data warehouses using various data sources
- Lead the integration of new tools
Required Qualifications:
- Bachelor’s degree in computer science, related technical field, or equivalent work experience
- Ability to optimize workflows for large-scale data processing and analytics
- Mastery of ELT design principles for scalability and reliability
- 3 years of experience building clean, maintainable, and well-tested code
- 1 year of experience with visual data solution development in Tableau
- 3 years of experience with T-SQL, Alteryx, and other ETL/ELT tools
- Excellent communication skills to collaborate with stakeholders at all levels of the company
- Bonus points for background in data engineering, science, or analytics
Preferred Qualifications:
- Snowflake, MS SQL Server, Azure, Salesforce experience
- Experience using data visualization tools, such as Tableau
- Experience architecting, designing, implementing, managing, and supporting OLTP, OLAP, or data warehouse database management systems
- Understanding of quality assurance and data quality principles as applied to an ELT architecture
- Strong understanding of star schema, dimensional modeling, Kimball methodology
- Nice to have one or more certifications, such as Snowflake SnowPro Advanced: Architect, Microsoft Certified: Azure Data Engineer Associate, Alteryx Designer Core Certification, or Microsoft Certified: Azure Data Fundamentals
Salary: $90 - $100