Description:
The successful candidate will be responsible for building, optimizing, and maintaining complex data pipelines and databases. They will collaborate with data scientists and analysts to ensure seamless data accessibility and quality while developing ETL processes and scalable storage solutions. The role involves implementing best practices for data security, compliance, and performance optimization, as well as mentoring junior engineers and driving innovation in data architecture.
Skills & Experience:
Strong experience in Python, Java, or Scala for data engineering
Expertise in SQL and database optimization techniques
Experience with cloud-based data platforms (AWS, Azure, or GCP)
Proficiency in ETL tools and frameworks (e.g., Apache Airflow, Apache Spark)
Knowledge of data warehousing solutions (Snowflake, Redshift, BigQuery)
Experience working with big data technologies such as Hadoop and Flink
Strong problem-solving skills and ability to design large-scale data solutions
Qualification:
A bachelor's degree in Computer Science, Engineering, or a related field is advantageous
Contact Mmiselo Dlephu on
Posted: 14 Mar 2025
Source: gumtree.co.za