Senior Data Engineer

We are your Energy Technology Partner.

We electrify, automate, and digitalize every industry, business, and home, driving efficiency and sustainability for all.

At Schneider Electric, our values - IMPACT (Inclusion, Mastery, Purpose, Action, Curiosity, Teamwork) - are the foundation of everything we do.

Becoming an Impact Maker means turning sustainability ambitions into actions at the intersection of automation, electrification, and digitization.

Are you ready to lead the digital transformation to create a more sustainable world?

If you are ready to challenge your creativity and make an impact, we are excited to welcome you!

Schneider Digital is the digital department of Schneider Electric, leading the digital transformation in the company by giving support globally to our internal teams and our clients.

Schneider Digital consists of 6 Digital Hubs worldwide, strategically located to ensure 24/7 support across the company (France, China, India, USA, Mexico, and Spain).

Our Digital Hub in Barcelona is made up of more than 450 employees working on strategic projects in roles such as Data, Cybersecurity, ERP, Cloud, Infrastructure, IT Project Management, and Digital Marketing.

The Barcelona Digital Technology Center is part of Schneider Digital, enabling Schneider Electric's digital transformation by delivering on business requirements.

We are looking for a skilled Data Integration & Data Warehousing Engineer to design, build, and manage scalable data pipelines and enterprise data warehouse solutions using AWS Glue, AWS Lambda, and Informatica.

This role will focus on enabling reliable data ingestion, transformation, modelling, and delivery across the organization while ensuring performance, quality, governance, and scalability.

The ideal candidate should have strong experience in ETL/ELT frameworks, data warehousing design, advanced SQL development, and cloud-based data engineering.

Key Responsibilities:

Data Integration & Pipeline Development

* Design, develop, and maintain scalable data pipelines using Python, AWS Glue, and Informatica PowerCenter
* Build robust ETL/ELT workflows for ingesting structured data from multiple sources (databases, APIs, files, SaaS systems)
* Develop reusable ingestion frameworks to support data migration from on-prem to cloud

Data Warehousing & Modelling

* Develop and maintain dimensional models (fact/dimension tables, SCD handling)
* Optimize data models for analytics and reporting performance

Cloud Data Engineering

* Develop and manage AWS Glue jobs using PySpark / Python
* Build and orchestrate workflows across the AWS ecosystem (Redshift, Lambda)
* Ensure efficient compute and storage utilization for performance and cost optimization

Monitoring & Operations

* Monitor ETL pipelines and job performance
* Troubleshoot data failures and performance bottlenecks
* Implement logging, alerting, and recovery mechanisms

Data Engineering & Process...
