Software Engineer III - Databricks, ETL
We have an exciting and rewarding opportunity for you to take your software engineering career to the next level.
As a Software Engineer III at JPMorgan Chase within the Commercial and Investment Bank, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way.
You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.
Job Responsibilities
* Design, develop, and maintain scalable data pipelines and ETL processes to support data integration and analytics.
* Use SQL frequently and understand NoSQL databases and their niche in the marketplace
* Implement best practices for data engineering, ensuring data quality, reliability, and performance
* Contribute to data modernization efforts by leveraging cloud solutions and optimizing data processing workflows
* Perform data extraction and implement complex data transformation logic to meet business requirements
* Monitor and execute data quality checks to proactively identify and address anomalies
* Ensure data availability and accuracy for analytical purposes
* Identify opportunities for process automation within data engineering workflows
* Deploy and manage containerized applications using Amazon ECS or Amazon EKS (Kubernetes)
* Implement data orchestration and workflow automation using AWS Step Functions and Amazon EventBridge
* Use Terraform for infrastructure provisioning and management, ensuring a robust and scalable data infrastructure.
Required qualifications, capabilities, and skills
* Formal training or certification on software/data engineering concepts and 3+ years of applied experience
* Experience across the data lifecycle
* Experience working with modern lakehouse platforms (e.g., Databricks, AWS Glue)
* Proficient in SQL coding (e.g., joins and aggregations)
* Experience building microservice-based components using ECS or EKS
* Experience building and optimizing data pipelines, architectures, and data sets (AWS Glue or Databricks ETL)
* Proficient in object-oriented and functional scripting languages (e.g., Python)
* Experience developing ETL processes and workflows for streaming data from heterogeneous data sources (e.g., Kafka)
* Experience building pipelines on AWS using Terraform and CI/CD pipelines
Preferred qualifications, capabilities, and skills
* Advanced knowledge of data stores such as Amazon Aurora and Amazon OpenSearch
* Experience with data pipeline and workflow management tools (Airflow, etc.)
* Strong analytical and problem-solving skills, with attention to detail.
* Ability to work independently and collaboratively in a team environment.
* A proactive approach to learning and adapting to new technologies and methodologies.
JPMorgan Chase, one of the oldest financial institutions, offers innova...
- Rate: Not Specified
- Location: Chicago, US-IL
- Type: Permanent
- Industry: Finance
- Recruiter: JPMorgan Chase Bank, N.A.
- Contact: Not Specified
- Email: Not Specified
- Reference: 210659910
- Posted: 2025-08-23 08:58:43