
Data Engineer, Python, AWS, Databricks

We have an exciting and rewarding opportunity for you to take your software engineering career to the next level.

As a Software Engineer III at JPMorgan Chase within the Commercial and Investment Bank, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way.

You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job Responsibilities


* Design, develop, and maintain scalable data pipelines and ETL processes to support data integration and analytics.


* Use SQL frequently and understand NoSQL databases and where they fit in the data landscape


* Implement best practices for data engineering, ensuring data quality, reliability, and performance


* Contribute to data modernization efforts by leveraging cloud solutions and optimizing data processing workflows


* Perform data extraction and implement complex data transformation logic to meet business requirements


* Monitor and execute data quality checks to proactively identify and address anomalies


* Ensure data availability and accuracy for analytical purposes


* Identify opportunities for process automation within data engineering workflows


* Deploy and manage containerized applications using Amazon ECS or Amazon EKS (Kubernetes).


* Implement data orchestration and workflow automation using AWS Step Functions and Amazon EventBridge


* Use Terraform for infrastructure provisioning and management, ensuring a robust and scalable data infrastructure.
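The data-quality checks described above might look like the following minimal Python sketch; the record schema, field names, and validation rules are hypothetical, purely for illustration:

```python
# Minimal sketch of a pipeline data-quality check (hypothetical schema).
# Separates valid records from anomalies before loading downstream.

def run_quality_checks(rows):
    """Return (clean_rows, anomalies) for a batch of trade records."""
    clean, anomalies = [], []
    for row in rows:
        if row.get("trade_id") is None:
            anomalies.append((row, "missing trade_id"))
        elif row.get("amount", 0) < 0:
            anomalies.append((row, "negative amount"))
        else:
            clean.append(row)
    return clean, anomalies

batch = [
    {"trade_id": 1, "amount": 100.0},
    {"trade_id": None, "amount": 50.0},
    {"trade_id": 2, "amount": -5.0},
]
clean, anomalies = run_quality_checks(batch)
print(len(clean), len(anomalies))  # 1 clean row, 2 anomalies
```

In a production pipeline the anomalies would typically be routed to a quarantine table or alerting channel rather than silently dropped.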

Required qualifications, capabilities, and skills


* Formal training or certification in software/data engineering concepts and 3+ years of applied experience


* Experience across the data lifecycle


* Experience working with modern lakehouse platforms (e.g., Databricks, AWS Glue)


* Proficient in SQL coding (e.g., joins and aggregations)


* Experience building microservice-based components deployed on ECS or EKS


* Experience in building and optimizing data pipelines, architectures, and data sets (e.g., ETL with AWS Glue or Databricks)


* Proficient in object-oriented and functional scripting languages (e.g., Python)


* Experience in developing ETL processes and workflows for streaming data from heterogeneous data sources (e.g., Kafka)


* Experience building pipelines on AWS using Terraform and deploying through CI/CD pipelines
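The SQL proficiency called for above (joins and aggregations) can be illustrated with a short, self-contained example using Python's built-in sqlite3 module; the table and column names are illustrative only:

```python
# Illustrative SQL join + aggregation using Python's built-in sqlite3.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE accounts (id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE trades (account_id INTEGER, amount REAL);
    INSERT INTO accounts VALUES (1, 'US'), (2, 'UK');
    INSERT INTO trades VALUES (1, 100.0), (1, 50.0), (2, 75.0);
""")

# Per-region trade count and total, via an inner join and GROUP BY.
rows = conn.execute("""
    SELECT a.region, COUNT(*) AS n_trades, SUM(t.amount) AS total
    FROM trades t
    JOIN accounts a ON a.id = t.account_id
    GROUP BY a.region
    ORDER BY a.region
""").fetchall()
print(rows)  # [('UK', 1, 75.0), ('US', 2, 150.0)]
```

The same join-and-aggregate pattern scales directly to Spark SQL on Databricks or Glue, where it would run against lakehouse tables instead of an in-memory database.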

Preferred qualifications, capabilities, and skills


* Advanced knowledge of data stores such as Amazon Aurora (RDBMS) and Amazon OpenSearch


* Experience with data pipeline and workflow management tools (Airflow, etc.)


* Strong analytical and problem-solving skills, with attention to detail.


* Ability to work independently and collaboratively in a team environment.


* A proactive approach to learning and adapting to new technologies and methodologies.

JPMorganChase, one of the oldest financial institutions, offers innova...
