Description:
Our client is looking for a highly motivated and experienced AWS Data Engineer to join their Global Markets division. In this role, you will work closely with Traders, Quantitative Analysts, and Data Scientists to design, develop, and maintain data infrastructure and pipelines for their global markets business. Minimum Experience:- 5 years of experience in Python/C# development
- 3 years of experience in AWS data engineering
- Bachelor’s Degree in Computer Science, Information Systems, or a related field
- AWS Certified Machine Learning – Specialty certification
Duties and Responsibilities:
- Design and create data models that consolidate information from various sources and store it in a usable format
- Lead the design, implementation, and successful delivery of large-scale, complex data solutions
- Utilize expertise in SQL, ETL, and data modeling to create robust data pipelines
- Ingest data into AWS S3 and perform ETL into RDS or Redshift
- Use AWS Lambda (C# or Python) for event-driven data transformations (see the sketch after this list)
- Design and implement security measures to protect data from unauthorized access or misuse
- Ensure data integrity by designing backup and recovery procedures
- Automate the migration process in AWS from development to production
- Deliver actionable and digestible data content to support business decisions
- Be involved in the full spectrum of data engineering, from planning and estimation to architecture, pipeline design, delivery, and production implementation
- Design and implement complex data solutions, from batch to streaming and event-driven architectures, across cloud, on-premises, and hybrid technology landscapes
- Optimize cloud workloads for cost, scalability, availability, governance, and compliance
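As an illustration of the S3 ingestion and Lambda-based transformation work described above, the sketch below shows a minimal Python handler triggered by an S3 ObjectCreated event. It is indicative only: the bucket layout, the processed/ prefix, and the CSV clean-up rules are hypothetical placeholders, and in practice the cleaned output would be loaded into RDS or Redshift by a downstream step (for example a Redshift COPY).

```python
# Illustrative sketch only -- not the client's actual pipeline.
# The bucket layout, "processed/" prefix, and clean-up rules are
# hypothetical placeholders.
import csv
import io

import boto3

s3 = boto3.client("s3")


def lambda_handler(event, context):
    """Triggered by an S3 ObjectCreated event on a raw-data prefix.

    Cleans the uploaded CSV and writes it back under processed/,
    ready for a downstream load into RDS or Redshift.
    """
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")

        # Example transformation: normalise headers and drop blank rows.
        reader = csv.DictReader(io.StringIO(body))
        fieldnames = [name.strip().lower() for name in (reader.fieldnames or [])]
        out = io.StringIO()
        writer = csv.DictWriter(out, fieldnames=fieldnames)
        writer.writeheader()
        for row in reader:
            if any(value and value.strip() for value in row.values()):
                writer.writerow({k.strip().lower(): v for k, v in row.items()})

        # Write to a separate prefix; the S3 event notification should be
        # scoped to the raw prefix so this put does not re-trigger the function.
        s3.put_object(
            Bucket=bucket,
            Key="processed/" + key.rsplit("/", 1)[-1],
            Body=out.getvalue().encode("utf-8"),
        )
```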
Qualifications and Competencies:
- Experience with AWS Glue jobs using PySpark or Glue's Spark runtime (see the PySpark sketch after this list)
- Real-time ingestion using Kafka is an added advantage
- Strong SQL and C# or Python programming skills
- Solid grasp of object-oriented principles in C# or Python, including classes and inheritance
- Expert knowledge of data engineering packages and libraries in C# or Python
- AWS technical certifications (Developer Associate or Solutions Architect)
- Experience with the development and delivery of microservices using serverless AWS services (S3, RDS, Aurora, DynamoDB, Lambda, SNS, SQS, Kinesis, IAM)
- Ability to articulate technical and non-technical requirements to different audiences
- Experience with relational databases such as PostgreSQL, SQL Server, and MySQL
- Proficiency in scripting and automation using tools such as PowerShell, Python, Bash, Ruby, or Perl
- Strong stakeholder management and communication skills, including problem-solving and relationship-building
- Ability to troubleshoot data issues efficiently and effectively
- Extensive experience in SDLC delivery, including waterfall, hybrid, and Agile methodologies
- Experience in implementing and delivering data solutions and pipelines on the AWS cloud platform
- Strong understanding of data modeling, data structures, databases, and ETL processes
- In-depth understanding of large-scale data sets, including both structured and unstructured data
- Experience in delivering CI/CD and DevOps capabilities in a data environment
- Ability to communicate complex technical ideas clearly
- Experience in the financial industry is a plus
- AWS Certified Machine Learning – Specialty certification is an advantage
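For the Glue experience mentioned above, the following is a minimal PySpark Glue job of the kind the role involves. It is a sketch under assumed names: the Data Catalog database (markets_raw), table (trades), column mappings, and output path are hypothetical placeholders, not the client's actual setup.

```python
# Illustrative AWS Glue (PySpark) job sketch -- catalog names, mappings,
# and the output path are hypothetical placeholders.
import sys

from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])

sc = SparkContext()
glue_context = GlueContext(sc)
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw trade data from the Glue Data Catalog (hypothetical database/table).
source = glue_context.create_dynamic_frame.from_catalog(
    database="markets_raw", table_name="trades"
)

# Rename and retype columns before loading into the curated zone.
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[
        ("trade_id", "string", "trade_id", "string"),
        ("notional", "string", "notional", "double"),
        ("trade_date", "string", "trade_date", "date"),
    ],
)

# Write the curated output back to S3 as Parquet (hypothetical path).
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-curated-bucket/trades/"},
    format="parquet",
)

job.commit()
```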
For more IT jobs, please visit www.networkrecruitment.co.za
If you have not received a response within two weeks, please consider your application unsuccessful. Your profile will be kept on our database for other suitable roles.
For more information contact:
Reinie Du Preez
Senior Specialist Recruitment Consultant
E-mail: rdpreez@networkrecruitment.co.za