What are the responsibilities and job description for the AWS Software Engineer III, Big Data/Spark position at JPMorgan Chase?
We have an exciting and rewarding opportunity for you to take your software engineering career to the next level.
As an AWS Software Engineer III, Big Data/Spark at JPMorgan Chase within the Corporate Sector-Consumer and Community Banking Risk team, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives.
Job responsibilities
- Executes software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems
- Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development
- Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems
- Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
- Contributes to software engineering communities of practice and events that explore new and emerging technologies
- Adds to team culture of diversity, equity, inclusion, and respect
Required qualifications, capabilities, and skills
- Formal training or certification in software engineering concepts and 3 years of applied experience.
- Extensive hands-on experience in system design, application development, testing, and ensuring operational stability, with a strong focus on creating robust and efficient systems.
- Proficient in coding with multiple programming languages, including Java or Python, demonstrating the ability to write clean, efficient, and maintainable code.
- Previous professional experience utilizing AWS cloud technologies, including deploying and managing cloud-based applications and services.
- Skilled in developing, debugging, and maintaining code within a large corporate environment, using modern programming languages and database querying languages to ensure high-quality software solutions.
- Comprehensive understanding of the Software Development Life Cycle (SDLC), with the ability to navigate and contribute effectively at each stage.
- Solid grasp of agile methodologies, including Continuous Integration/Continuous Deployment (CI/CD), application resiliency, and security practices, to enhance software development and delivery processes.
- Demonstrated expertise in software applications and technical processes within specialized technical disciplines, such as cloud computing, artificial intelligence, machine learning, and mobile technologies.
- Hands-on experience with Apache Spark, focusing on developing and optimizing large-scale data processing applications, as well as managing and deploying infrastructure using Terraform to automate and streamline cloud resource management.
Preferred qualifications, capabilities, and skills
- Familiarity with Snowflake and open table formats such as Iceberg and Delta Lake.
- AWS Cloud Certification is a plus.
- Experience with AWS services such as S3, EMR, Lambda, and Redshift.
- Strong analytical and problem-solving skills with a focus on data-driven decision-making.