What are the responsibilities and job description for the Sr. Python Spark Developer (Onsite) position at Cognizant?
Cognizant (NASDAQ: CTSH) is a leading provider of information technology, consulting, and business process outsourcing services, dedicated to helping the world's leading companies build stronger businesses. Headquartered in Teaneck, New Jersey (U.S.), Cognizant is a member of the NASDAQ-100, the S&P 500, the Forbes Global 2000, and the Fortune 500, and we are among the top-performing and fastest-growing companies in the world.
Practice - AIA - Artificial Intelligence and Analytics
About AI & Analytics: Artificial intelligence (AI) and the data it collects and analyzes will soon sit at the core of all intelligent, human-centric businesses. By decoding customer needs, preferences, and behaviors, our clients can understand exactly what services, products, and experiences their consumers need. Within AI & Analytics, we work to design the future - a future in which trial-and-error business decisions have been replaced by informed choices and data-supported strategies. By applying AI and data science, we help leading companies to prototype, refine, validate, and scale their AI and analytics products and delivery models. Cognizant’s AIA practice takes insights that are buried in data and provides businesses a clear way to transform how they source, interpret, and consume their information. Our clients need flexible data structures and a streamlined data architecture that quickly turns data resources into informative, meaningful intelligence.
Job Summary
We are seeking a Sr. Developer with 5 years of experience in Spark (Scala), Apache Airflow, Python, and Databricks SQL. The ideal candidate will have a strong background in asset management operations. This hybrid role requires the candidate to work the day shift and does not require travel. The candidate will play a crucial role in developing and maintaining our data infrastructure, ensuring optimal performance and reliability.
Responsibilities
- Develop and maintain data pipelines using Spark in Scala to ensure efficient data processing and transformation.
- Implement and manage workflows using Apache Airflow to automate and schedule data tasks.
- Write and optimize complex SQL queries in Databricks SQL to support data analysis and reporting.
- Utilize Python to develop scripts and applications for data manipulation and integration.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
- Monitor and troubleshoot data pipelines to ensure data quality and system reliability.
- Provide technical guidance and support to team members on best practices and industry standards.
- Conduct code reviews to ensure code quality and adherence to development standards.
- Stay updated with the latest technologies and trends in data engineering and asset management operations.
- Participate in design and architecture discussions to contribute to the overall data strategy.
- Document processes, workflows, and data models to maintain a comprehensive knowledge base.
- Ensure compliance with data governance and security policies.
- Contribute to continuous improvement initiatives to enhance the efficiency and effectiveness of data operations.
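The workflow-automation duties above boil down to declaring tasks and their upstream dependencies and letting a scheduler execute them in order, which is what an Airflow DAG does. As a rough illustrative sketch only (pure standard library, not the Airflow API; the task names are hypothetical), the ordering logic can be shown like this:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical pipeline tasks mapped to their upstream dependencies,
# mirroring how an Airflow DAG declares task ordering.
dependencies = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}

def run_order(deps):
    """Return the tasks in an order that respects every dependency."""
    return list(TopologicalSorter(deps).static_order())

print(run_order(dependencies))
# ['extract', 'transform', 'quality_check', 'load']
```

In real Airflow the same ordering would be expressed with operators and the `>>` dependency syntax; the scheduler, retries, and calendar-based triggering are what the framework adds on top of this core idea.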
Applications will be accepted until 05/04/2025.
The annual salary for this position is between $90,000 and $123,000, depending on experience and other qualifications of the successful candidate.
This position is also eligible for Cognizant’s discretionary annual incentive program, based on performance and subject to the terms of Cognizant’s applicable plans.
Benefits: Cognizant offers the following benefits for this position, subject to applicable eligibility requirements:
- Medical/Dental/Vision/Life Insurance
- Paid holidays plus Paid Time Off
- 401(k) plan and contributions
- Long-term/Short-term Disability
- Paid Parental Leave
- Employee Stock Purchase Plan
We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Qualifications
- Possess strong experience in Spark in Scala for developing and maintaining data pipelines.
- Have hands-on experience with Apache Airflow for workflow automation and scheduling.
- Demonstrate proficiency in writing and optimizing SQL queries in Databricks SQL.
- Exhibit expertise in Python for developing data manipulation and integration scripts.
- Show a solid understanding of asset management operations and related data requirements.
- Display excellent problem-solving skills and the ability to troubleshoot data issues effectively.
- Have strong communication skills to collaborate with cross-functional teams and stakeholders.
- Be detail-oriented with a focus on delivering high-quality solutions.
- Stay proactive in learning and adopting new technologies and best practices.
- Maintain a strong commitment to data governance and security standards.
- Be capable of working independently and managing multiple tasks in a hybrid work environment.
- Demonstrate the ability to document processes and maintain a comprehensive knowledge base.
- Show a proactive approach to continuous improvement and innovation in data operations.
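The Databricks SQL proficiency listed above typically means writing analytical queries such as window functions for reporting. A hypothetical example of that kind of query (the table and data are invented; it is run here against in-memory SQLite purely for illustration, since this RANK() OVER syntax is standard ANSI SQL shared by Databricks SQL):

```python
import sqlite3

# Illustrative only: an analytical window-function query of the kind
# used in Databricks SQL reporting, executed against SQLite.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (portfolio TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?)",
    [("alpha", 100.0), ("alpha", 250.0), ("beta", 75.0)],
)

# Rank trades by amount within each portfolio.
rows = conn.execute("""
    SELECT portfolio, amount,
           RANK() OVER (PARTITION BY portfolio ORDER BY amount DESC) AS rnk
    FROM trades
    ORDER BY portfolio, rnk
""").fetchall()
print(rows)
# [('alpha', 250.0, 1), ('alpha', 100.0, 2), ('beta', 75.0, 1)]
```

On Databricks the same query would run unchanged against a Delta table; "optimizing" it there usually means attending to partitioning, clustering, and scan pruning rather than rewriting the SQL itself.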
Certifications Required: Certified Spark Developer, Apache Airflow Certification, Python Data Science Certification, Databricks SQL Analyst Certification.