What are the responsibilities and job description for the Software Engineer III (Python/UI) position at JPMorgan Chase?
We have an exciting and rewarding opportunity for you to take your software engineering career to the next level.
As a Software Engineer III at JPMorgan Chase within the Corporate Data & Analytics, Platform Engineering team, you serve as a seasoned member of an agile team to design and deliver trusted, market-leading technology products in a secure, stable, and scalable way. You are responsible for delivering critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.
Job responsibilities
- Develop and deploy data platform solutions, primarily using Python.
- Develop and optimize data processing workflows.
- Apply strong object-oriented programming principles to create robust and maintainable code (a minimal illustrative sketch follows this list).
- Ensure adherence to coding standards and best practices, including comprehensive test coverage and documentation.
- Collaborate with product managers, architects, and other engineers to deliver high-quality software solutions.
- Troubleshoot and resolve technical issues across the development and production environments.
- Continuously improve software development processes and contribute to the team’s knowledge base.
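To give a concrete, purely illustrative flavor of these responsibilities, here is a minimal sketch of an object-oriented, testable data-processing step in Python. The record and class names (TradeRecord, EnrichmentStep) and the schema are hypothetical, not actual JPMorgan Chase code.

```python
# Hypothetical sketch: one object-oriented, testable step in a data-processing workflow.
from dataclasses import dataclass
from typing import Iterable, Iterator


@dataclass(frozen=True)
class TradeRecord:
    """Immutable input record for the workflow (illustrative schema)."""
    trade_id: str
    notional: float
    currency: str


class EnrichmentStep:
    """A single workflow step: drop invalid records and normalize currency codes."""

    def __init__(self, allowed_currencies: set[str]) -> None:
        self._allowed = {c.upper() for c in allowed_currencies}

    def run(self, records: Iterable[TradeRecord]) -> Iterator[TradeRecord]:
        for record in records:
            currency = record.currency.upper()
            if record.notional > 0 and currency in self._allowed:
                yield TradeRecord(record.trade_id, record.notional, currency)


if __name__ == "__main__":
    step = EnrichmentStep({"USD", "EUR"})
    sample = [TradeRecord("T1", 1_000_000.0, "usd"), TradeRecord("T2", -5.0, "EUR")]
    print(list(step.run(sample)))  # only T1 survives validation
```

Small, single-responsibility steps like this are straightforward to unit-test and to compose into larger data processing workflows.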
Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 3 years of applied experience
- Strong software engineering and object-oriented programming skills with expertise in languages such as Python and Java/Scala
- Familiarity with development tools such as Jenkins, Jira, and Git/Stash
- Experience integrating automated testing using JUnit, Cucumber, or similar frameworks (a Python analogue is sketched after this list)
- Strategic thinking and a passion for business strategy and business processes
- Excellent interpersonal skills and the ability to communicate with clarity and brevity, tailoring the message to a technical or business audience
- Strong attention to detail
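The testing tools named above (JUnit, Cucumber) are JVM-oriented; on the Python side of the stack the closest analogue would be pytest (with behave for BDD-style tests). Below is a minimal, hypothetical pytest example; the function under test is an illustrative placeholder, not part of any real codebase.

```python
# Hypothetical sketch: pytest as the Python analogue of JUnit-style automated testing.
import pytest


def normalize_currency(code: str) -> str:
    """Normalize a currency code to upper case; reject blank input."""
    cleaned = code.strip().upper()
    if not cleaned:
        raise ValueError("currency code must not be blank")
    return cleaned


@pytest.mark.parametrize("raw, expected", [("usd", "USD"), (" eur ", "EUR")])
def test_normalize_currency_uppercases_and_trims(raw: str, expected: str) -> None:
    assert normalize_currency(raw) == expected


def test_normalize_currency_rejects_blank_input() -> None:
    with pytest.raises(ValueError):
        normalize_currency("   ")
```

In a Jenkins pipeline, a suite like this would typically run on every commit, for example via `pytest --junitxml=report.xml`, so the build can publish a JUnit-format test report.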
Preferred qualifications, capabilities, and skills
- Experience working with AWS services such as Lambda, Step Functions, SQS, SNS, and API Gateway
- Experience with Databricks and Apache Spark for big data processing
- Hands-on experience with open-source frameworks/libraries such as Apache Airflow (a minimal workflow sketch follows this list)
- Hands-on experience with PySpark, Scala, and UI development
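By way of illustration, here is a minimal Apache Airflow DAG of the kind this role might build and operate. The DAG id, schedule, task names, and task logic are hypothetical placeholders, and the Airflow 2.x API is assumed.

```python
# Hypothetical sketch: a small Airflow DAG with two dependent Python tasks.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract() -> list[dict]:
    """Pretend to pull raw rows from an upstream source (illustrative data)."""
    return [{"trade_id": "T1", "notional": 1_000_000.0}]


def transform(**context) -> None:
    """Read the extract task's output from XCom and apply a simple validation rule."""
    rows = context["ti"].xcom_pull(task_ids="extract")
    valid = [row for row in rows if row["notional"] > 0]
    print(f"{len(valid)} valid rows")


with DAG(
    dag_id="example_trade_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task
```

In practice the transform step might hand off to a Databricks or PySpark job rather than running inline, but the dependency wiring (`extract_task >> transform_task`) stays the same.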