Description:
The ideal candidate will be responsible for building and optimizing our data infrastructure, designing and maintaining scalable data pipelines to support analytics, business intelligence, and operational systems.

Key Responsibilities:
- Develop and optimize data pipelines for seamless data flow.
- Implement and maintain scalable data architectures.
- Collaborate with cross-functional teams to ensure data availability and quality.

Requirements:
- Bachelor's degree in Computer Science, IT, Engineering, or a related field (or equivalent experience).
- 2-4 years of experience in data engineering, with hands-on experience in pipeline development and optimization.
- Strong proficiency in Python and SQL.
- Experience with ETL/ELT tools (e.g., Apache Airflow, Talend).
- Familiarity with cloud platforms (AWS, Azure, GCP).
- Knowledge of Big Data frameworks (Hadoop, Spark, ClickHouse).
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.

If you're passionate about data and enjoy tackling complex challenges, apply now!
02 Mar 2025; from: gumtree.co.za