
   

Data Cloud Expert

Job description

Schneider Electric is looking for an AWS Data Cloud Engineer with a minimum of 5 years of experience in AWS data lake implementation.

The role is responsible for creating and managing data ingestion and transformation, making data ready for consumption in the analytical layer of the data lake.

It also covers managing and monitoring the data quality of the data lake using Informatica PowerCenter, and creating dashboards from the analytical layer of the data lake using Tableau or Power BI.

Your Role

We are looking for strong AWS Data Engineers who are passionate about Cloud technology.

Your responsibilities are:


* Design and Develop Data Pipelines: Create robust pipelines to ingest, process, and transform data, ensuring it is ready for analytics and reporting.


* Implement ETL/ELT Processes: Develop Extract, Transform, Load (ETL) or Extract, Load, Transform (ELT) workflows to seamlessly move data from source systems to Data Warehouses, Data Lakes, and Lake Houses using Open Source and AWS tools.


* Ensure Data Quality: Implement data quality rules, perform data profiling to assess source data quality, identify data anomalies, and create data quality scorecards using Informatica PowerCenter.


* Design Data Solutions: Leverage your analytical skills to design innovative data solutions that address complex business requirements and drive decision-making.


* Collaborate with Product Owners: Interact with product owners to understand data ingestion needs and data quality rules.


* Adopt DevOps Practices: Utilize DevOps methodologies and tools for continuous integration and deployment (CI/CD), infrastructure as code (IaC), and automation to streamline and enhance our data engineering processes.

DevOps experience is considered an optional skill.

Qualifications

Your Skills and Experience


* A minimum of 3 to 5 years of experience in AWS Data Lake implementation.


* A minimum of 2 to 3 years of experience with Informatica PowerCenter.


* Proficiency with AWS Tools: Demonstrable experience using AWS Glue, AWS Lambda, Amazon Kinesis, Amazon EMR, Amazon Athena, Amazon DynamoDB, Amazon CloudWatch, Amazon SNS, and AWS Step Functions.


* Understanding of Spark, Hive, Kafka, Kinesis, Spark Streaming, and Airflow.


* Understanding of relational databases such as Oracle, SQL Server, and MySQL.


* Programming Skills: Strong experience with modern programming languages such as Python and Java.


* Expertise in Data Storage Technologies: In-depth knowledge of Data Warehouse, database, and Big Data ecosystem technologies such as Amazon Redshift, Amazon RDS, and Hadoop.


* Experience with AWS Data Lakes: Proven experience building data lakes on Amazon S3 to store and process both structured and unstructured data sets.


* Expertise in developing Business Intelligence dashboards in Tableau or Power BI is a plus.


* Good knowledge of project and portfolio management tool suites is a plus.


* Should be well versed in Agile implementation principles.


* Knowledge of SAFe (Scaled Agile) principles is a plus.
