AWS Big Data Lead Software Engineer

Job responsibilities


* Design & build new applications utilizing leading-edge technologies and modernize existing applications


* Implement batch & real-time software components consistent with architectural best practices for reliability, security, operational efficiency, cost-effectiveness, and performance


* Ensure quality of deployed code via automated unit, integration & acceptance testing


* Collaborate with multi-national agile development, support and business teams to meet sprint objectives


* Participate in all agile meetings & rituals, including daily standups, sprint planning, backlog reviews, demos, and retrospectives


* Provide level 2 support for production systems


* Learn and apply system processes, methodologies, and skills for the development of secure, stable code and systems


* Hands-on application development leveraging distributed compute engines such as Apache Flink or Apache Spark on very large datasets


* Design and develop applications that leverage AWS infrastructure, deploying software components using common compute and storage services such as EC2, EKS, Lambda, and S3


* Lead and deliver projects from concept to production across the PNI (Personalization and Insights) platform

Required qualifications, capabilities, and skills


* Formal training or certification on software engineering concepts and 5+ years applied experience


* Hands-on practical experience in frameworks, system design, application development, testing, and operational stability


* Experience with Apache Spark, Apache Flink, Ray, or similar large-scale data processing engines


* Experience with distributed datastores (e.g., Cassandra, Amazon Redshift)


* Experience designing, developing, and deploying software components on AWS using common compute and storage services such as EC2, EKS, Lambda, and S3


* Experience with big data, distributed, and cloud technologies, including AWS big data services (Lambda, Glue, EMR), performance tuning, streaming, Kafka, and entitlements


* Proficiency in automation and continuous delivery methods


* Proficient in all aspects of the Software Development Life Cycle


* Advanced understanding of agile methodologies and related practices such as CI/CD, application resiliency, and security


* Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, big data, artificial intelligence, machine learning, mobile)

Preferred qualifications, capabilities, and skills


* Experience building ETL/feature processing pipelines


* Experience using workflow orchestration tools such as Airflow or Kubeflow


* Experience using Terraform to deploy infrastructure-as-code to public cloud


* Experience with scripting on Linux using Bash, KSH, or Python


* AWS Certified Cloud Practitioner, Developer, or Solutions Architect certification strongly preferred

C...



