What are the responsibilities and job description for the Data Engineering Lead - Cloud position at Egen?
About Egen:
Egen is a fast-growing and entrepreneurial company with a data-first mindset. We bring together the best engineering talent working with the most advanced technology platforms to help our clients drive action and impact through data and insights.
Our more than 700 technology specialists in the United States, Canada, and India have extensive knowledge and experience working across all the leading cloud and data platforms, with a strategic focus on Google Cloud and Salesforce.
We are committed to being a place where the best people choose to work so they can apply their engineering and technology expertise to envision what is next for how data and platforms can change the world for the better.
We are dedicated to learning, thrive on solving tough problems, and continually innovate to achieve fast, effective results. If this describes you, we want you on our team.
Join an innovative culture
We have the honor of repeatedly earning Great Place To Work Certification, along with being named to Inc. Magazine’s 5000 Fastest Growing Companies. At Egen, we believe in nurturing a culture of innovation, respect and well-being.
Embrace inclusion and diversity
We believe a company is strengthened by the diversity of voices and lived experiences of its people. We have an inclusive culture empowered by hiring and career advancement practices to support a diverse workforce, coaching programs to help diverse individuals progress, and an apprenticeship program to attract diverse talent.
Continue your learning journey
We invest in your ability to scale up. We offer pathways for you to advance your skill sets and gain expertise. We work hard to be the place where you can always find new challenges, opportunities and experiences, so you can keep evolving your career while helping organizations and people unleash the power of data and platforms.
Responsibilities:
- Lead and develop passionate data engineering teams through complex data migrations from disparate legacy ETL platforms to modern and highly scalable distributed data platforms.
- Design and develop distributed ETL/ELT pipelines with cloud-native data stores (AWS Redshift and Snowflake preferred).
- Facilitate fast and efficient data migrations through a deep understanding of design, mapping, implementation, management, and support of distributed data pipelines.
- Prepare data mapping, data flow, production support, and pipeline documentation for all projects.
- Create and document end-to-end data warehouse and data mart implementation plans.
- Consult with business, product, and data science teams to understand end-user requirements and analytics needs, then implement the most appropriate data platform technology and scalable data engineering practices.
Requirements:
- Minimum of a Bachelor’s Degree or its equivalent in Computer Science, Computer Information Systems, Information Technology and Management, Electrical Engineering, or a related field.
- You have extensive experience with legacy Informatica- and Hadoop-based EDW platforms and have led or significantly contributed to a major cloud migration.
- You are a subject matter expert in standard concepts, best practices, and procedures within an enterprise data warehousing environment.
- You have a strong background in distributed data warehousing with AWS Redshift (Snowflake, BigQuery, and/or Azure Data Warehouse are also acceptable).
- You have expert-level programming and scripting knowledge for building and managing ETL pipelines using SQL, Python, and Bash.