What are the responsibilities and job description for the Data Engineer position at Ascendion?
About Ascendion
Ascendion is a full-service digital engineering solutions company. We make and manage software platforms and products that power growth and deliver captivating experiences to consumers and employees. Our engineering, cloud, data, experience design, and talent solution capabilities accelerate transformation and impact for enterprise clients. Headquartered in New Jersey, our workforce of 6,000 Ascenders delivers solutions from around the globe. Ascendion is built differently to engineer the next.
Ascendion | Engineering to elevate life
We have a culture built on opportunity, inclusion, and a spirit of partnership. Come, change the world with us:
- Build the coolest tech for the world’s leading brands
- Solve complex problems - and learn new skills
- Experience the power of transforming digital engineering for Fortune 500 clients
- Master your craft with leading training programs and hands-on experience
Experience a community of change makers!
Join a culture of high-performing innovators with endless ideas and a passion for tech. Our culture is the fabric of our company, and it is what makes us unique and diverse. The way we share ideas, learning, experiences, successes, and joy allows everyone to be their best at Ascendion.
About the Role:
Job Description: Senior GCP Data Engineer
Overview:
We are seeking a highly skilled and experienced GCP Data Engineer to join our team. The ideal candidate will have a strong background in developing streaming and batch pipelines in GCP. You will be responsible for designing, implementing, and optimizing data solutions on GCP that process huge volumes of data and support real-time decision-making.
Key Responsibilities:
- Design and implement scalable data pipelines in GCP using BigQuery, Dataproc, Pub/Sub/Kafka, and Dataflow
- Build pipelines to transform large volumes of data using PySpark
- Work with a wide range of data source types
- Bring a minimum of 4 years of experience with Python (or another scripting language) and SQL
- Drive the technical strategy for the data pipelines and mentor junior engineers
- Perform code reviews to ensure best practices are followed
Qualifications:
Required
- Bachelor's degree in Computer Science, Information Technology, or equivalent experience
- 7 years of IT experience, including 4 years developing highly scalable data pipelines in GCP
- Strong scripting skills in one or more languages (e.g., Python, Unix shell)
- Strong problem-solving skills and attention to detail
- Strong Banking & Financial Services domain experience
Preferred
- Google Cloud certified (Professional Data Engineer)
- Experience with multi-cloud environments
- Experience with CI/CD tools (Cloud Build, Jenkins, GitHub Actions)
Location: Phoenix, AZ (Hybrid)
Salary Range: The salary for this position is between $120,000 and $140,000 annually. Factors which may affect pay within this range include geography/market, skills, education, experience, and other qualifications of the successful candidate.
Benefits: The Company offers the following benefits for this position, subject to applicable eligibility requirements: [medical insurance] [dental insurance] [vision insurance] [401(k) retirement plan] [long-term disability insurance] [short-term disability insurance] [5 personal days accrued each calendar year; the paid time off benefits meet the paid sick and safe time laws that pertain to the applicable City/State] [10-15 days of paid vacation time] [6 paid holidays and 1 floating holiday per calendar year] [Ascendion Learning Management System]
Want to change the world? Let us know.
Tell us about your experiences, education, and ambitions. Bring your knowledge, unique viewpoint, and creativity to the table.
Let’s talk!