André is Hiring a Data Engineer - Data & Analytics Near Bethlehem, PA
André Global Inc. is a global provider of information technology services. Since 2002, we have helped our clients in the APAC region achieve their business goals by leveraging the power of technology. In 2016, we entered the Americas to re-create that success here as well. Headquartered in New York City, we now serve clients around the world. We strive to deliver customer satisfaction through service quality, and our global, diverse team helps customers achieve their project goals faster and more affordably without any compromise on quality. We take pride in providing outstanding service to all our customers, from startups and small businesses to the Fortune 500. With more than a decade of experience serving customers across geographies, we occupy a unique position in the global market, and our ability to assemble the best workforce in record time has been a great asset to our clients.

This is a contract position with our insurance client.

Location: Bethlehem, PA. The resource will be required to work onsite in our Bethlehem, PA office 3 days per week. We will not accept remote candidates.

Must be extremely strong in SQL programming; there will be a live SQL coding exercise during the interview.
Must have genuine experience working in Databricks.

Responsibilities
Develop and maintain data marts using Databricks to support business analytics and reporting use cases.
Develop and implement ETL/ELT processes to extract, transform, and load data using an existing framework.
Collaborate with business stakeholders to gather requirements and ensure that the data marts/data assets built meet the business needs.
Create and maintain data dictionary/ETL mapping documents.
Reverse engineer existing data prep processes and convert them into reusable, governed, and certified data assets.
Implement data quality checks and ensure data integrity within the data marts/data assets.
Qualifications
Proven experience as a Data Engineer, ETL Developer, or in a similar role.
Proficient in SQL and Python.
Knowledge of Databricks.
Familiarity with data warehousing concepts and best practices.
Strong understanding of data modeling, in particular dimensional modeling.
Excellent problem-solving skills and the ability to multitask, working both independently and as part of a team.
Good communication skills and the ability to collaborate effectively within and across the team.