Description:
Leadership:
• Demonstrated ability to act as a tech lead, guiding the design, development, and implementation of data solutions while ensuring alignment with best practices and the objectives of the company and its clients.
• Skilled in mentoring and supporting team members, fostering a culture of collaboration, innovation, and continuous improvement.
• Strong technical decision-making, task prioritisation, and resource management to meet project deadlines.
Collaboration Across Teams:
• Proven ability to build strong partnerships with product, technology, and business stakeholders, actively listening to understand their needs and challenges and adjusting approach accordingly.
Personal Growth Mindset, Problem-Solving & Adaptability:
• Adopts an AI-led development mindset, integrating AI tools and automation into the software development lifecycle and leveraging AI for code generation, testing, debugging, and optimisation to enhance productivity.
• Deep interest in learning and adopting emerging technologies, particularly in the rapidly evolving AI and data landscape.
• Proactive approach to problem solving, solution design, and the development process, including the challenges that arise along the way.
• Enjoys working in a fast-paced, changing environment, with the ability to react and adapt quickly to change.
Communication:
• Ability to convey technical insights to both technical and non-technical stakeholders clearly and effectively.
Technical Skills:
Data Architecture & Engineering:
• Data architecture experience designing scalable, high-performance data platforms to support data analytics, AI/ML data pipelines, and batch/real-time data processing.
• Expertise in defining data modelling strategies, data governance, and security best practices to ensure efficient data flow across systems.
Database Management and Optimisation:
• Proficiency in managing Snowflake, PostgreSQL, or similar databases, including schema design, query optimisation, and performance tuning.
• Experience with vector databases for storing and querying vector embeddings.
Data Pipelines and Integration:
• Expertise in creating and managing ETL/ELT pipelines to handle large-scale data ingestion and transformation.
• Familiarity with DBT or similar tools.
Automation, AI/ML Models, Agents:
• Hands-on experience deploying, fine-tuning, and integrating AI models (e.g., transformers, embeddings) into data platforms.
• Knowledge of LangChain and/or other frameworks.
• Experience implementing AI agents for data collection and process automation.
• Experience in retrieval-augmented generation (RAG) workflows.
Cloud and DevOps:
• Understanding of cloud services with a focus on d