
Data Quality Engineer

Your Job

Georgia-Pacific's Enterprise Data and Predictive Analytics Operations team is looking for a disciplined and motivated Data Engineer to drive stability and deliver solutions for analytics teams and remote engineers.

This is an exciting opportunity to be on the front line of innovative work and build systems to support a wide array of needs.

Location: Atlanta, GA (hybrid; onsite three days per week)

Our Team

Georgia-Pacific (GP) is among the world's leading manufacturers of bath tissue, paper towels, napkins, tableware, paper-based packaging, office papers, cellulose, specialty fibers, nonwoven fabrics, building products and related chemicals.

Our building products business makes DensGlass® gypsum board often seen in commercial construction, DryPly® plywood and RESI-MIX® wood adhesives, among others.

Our containerboard and packaging business offers products ranging from high-end graphic packaging to bulk bins, as well as Golden Isles fluff pulp.

You may also recognize consumer brands like Angel Soft®, Brawny®, and Dixie® on retail shelves and enMotion® towels, Compact® bath tissue and SmartStock® cutlery dispensers when you are away from home.

Our GP Harmon business is one of the world's largest recyclers of paper, metal and plastics.

As a Koch Company, we create long-term value using resources efficiently to provide innovative products and solutions that meet the needs of customers and society, while operating in a manner that is environmentally and socially responsible, and economically sound.

Headquartered in Atlanta, GA, we employ approximately 35,000 people.

For more information, visit www.gp.com.

To learn more about our culture, Market-Based Management (MBM®), click here:

http://www.gp.com/aboutus/MBM/index.html
http://www.kochind.com/MBM

What You Will Do



* Managing AWS infrastructure per the needs of the business, including servers for tools like SAS and Dynatrace, and databases such as Redshift and AWS RDS


* Troubleshooting infrastructure, network, and capacity issues.


* Providing daily support of the AWS data lake environment, including data pipeline management, troubleshooting, and root cause analysis using tools such as Python, Docker containers, and SQL


* Developing DevOps Terraform code to deploy pipelines


* Developing failure-mode detection capabilities for all analytics and data pipelines, then monitoring and maintaining a large number of data pipelines using tools such as Dynatrace and Splunk


* Performing security administration on AWS and advanced analytics platforms


* Managing serverless resources such as Lambda, ECS, and Glue, and containers on EKS and Fargate


* Working on pipelines using Dynatrace/Splunk


* Interfacing between analytics, engineering, and operations resources


* Working independently or as part of a team to create and execute transformational strategies


* Working on generative AI model evaluation jobs and automation of different use cases

Who You Are (Basic Qualifications)



* Bachelor's degree in Engineering ...



