

Software Engineer III - PySpark/AWS

We have an exciting and rewarding opportunity for you to take your data engineering career to the next level.

As a Software Engineer III - PySpark/AWS at JPMorganChase within the Corporate Sector-Global Finance team, you will be a key member of an agile team, responsible for building and delivering AI-enabled data products that are secure, stable, and scalable.

In this role, you will develop data infrastructure, tool integrations, and retrieval systems that enable AI agents to access, interpret, and act on enterprise data in support of the firm's business goals.

You will work alongside senior engineers, grow your expertise in agentic AI data engineering, and contribute to a culture of engineering excellence.

Job Responsibilities


* Building and optimizing data pipelines and workflows that serve as the backbone for agentic AI systems, ensuring agents have reliable, real-time access to high-quality, structured and unstructured data


* Developing data retrieval and indexing layers that enable AI agents to autonomously search, query, and synthesize information across multiple data sources


* Building and maintaining tool-use infrastructure - APIs, data services, and function endpoints - that AI agents invoke to execute tasks, retrieve data, and interact with enterprise systems


* Implementing and enforcing best practices for data management, ensuring data quality, security, and compliance, including governance of data consumed and generated by autonomous AI agents


* Hands-on development of secure, high-quality production code following AWS best practices, and deploying efficiently using CI/CD pipelines


* Building orchestration and state management layers that support multi-step agent workflows, including memory, context persistence, and task chaining


* Writing and reviewing code daily, conducting thorough code reviews, and raising the technical bar across the team


* Mentoring and guiding junior and mid-level engineers through pairing, code reviews, and technical coaching


* Collaborating with product owners, data scientists, and business stakeholders to translate business requirements into working, production-ready agentic AI solutions


* Evaluating and adopting emerging agentic AI frameworks, tools, and data engineering practices to continuously improve the team's development capabilities

Required Qualifications, Capabilities, and Skills


* Formal training or certification on software engineering concepts and 3+ years applied experience


* Expert-level programming skills in Python/PySpark with a strong portfolio of production-grade code


* Extensive hands-on experience with Databricks and the AWS cloud ecosystem, including AWS Glue, S3, SQS/SNS, Lambda


* Deep expertise with Spark and SQL


* Strong hands-on experience with Lakehouse/Delta Lake architecture, application development, testing, and ensuring operational stability; experience with Snowflake, Terraform, and LLMs; Data Observabilit...



