Description:
A client is looking for a highly skilled and experienced Senior Data Engineer to join their team in Johannesburg. The ideal candidate will have extensive experience with data engineering, big data technologies, and cloud platforms (particularly AWS), as well as a deep understanding of data modelling. This is an exciting opportunity to work on large-scale data infrastructure projects and drive innovation within the company.
Key Responsibilities:
- Architect and build highly scalable distributed systems using open-source tools.
- Design and implement efficient data models, with a deep understanding of various data structures and their benefits and limitations.
- Work with big data batch and streaming tools to process and analyze data at scale.
- Develop and optimize Extract, Transform, Load (ETL) processes for data pipelines.
- Utilize AWS technologies (EMR, EC2, S3) for data storage and processing.
- Write and maintain clean, efficient code in Python and PySpark (Apache Spark).
- Collaborate with cross-functional teams to define data engineering requirements and deliver technical solutions.
- Ensure the smooth operation, performance, and scalability of data infrastructures.
- Continuously improve the reliability and security of data systems.
Requirements:
- Bachelor’s Degree in Computer Science, Computer Engineering, or a related field, or equivalent experience.
- AWS Certification (e.g., AWS Certified Solutions Architect, AWS Certified Data Analytics).
- Extensive knowledge of programming or scripting languages (e.g., Python, PySpark).
- Expert knowledge of data modelling and strong understanding of various data structures.
- Proven ability to architect highly scalable distributed systems using open-source tools.
- 5+ years of experience in Data Engineering or Software Engineering.
- 2+ years of experience with Big Data technologies.
- 2+ years of experience with ETL processes.
- 2+ years of experience working with AWS services (EMR, EC2, S3).
- 5+ years of experience with object-oriented design, coding, and testing patterns.
- Hands-on experience with Talend for data integration.
- Strong experience with Big Data batch and streaming tools.
- Solid background in developing commercial or open-source software platforms and large-scale data infrastructures.
For more IT jobs, please visit www.networkrecruitment.co.za
If you have not received a response within two weeks, please consider your application unsuccessful. Your profile will be kept on our database for other suitable roles/positions.
For more information contact:
Reinie Du Preez
Senior Specialist Recruitment Consultant
E-mail: rdpreez@networkrecruitment.co.za
27 Mar 2025; from: careers24.com