Join our team as a DataOps Engineer (Volunteer) to manage and optimize data workflows, ensuring seamless data integration and supporting the analytical needs of the organization. As a key member of the Data Engineering division, you will focus on data pipeline engineering, cloud lakehouse management, and collaboration with the Data Governance team. With a strong technical foundation in SQL and Python, you will lead the engineering sub-division and mentor junior team members.
Job Description
Position: DataOps Engineer [Volunteer]
Location: South Africa
Role Type: Volunteering | Part-time (3 - 4 hours a week) | Fully Remote
Start Date: Immediately
Company Description
Dubs Mathematics (Non-Profit Company) is a dynamic and forward-thinking organization dedicated to tutoring and mentoring Grade 11 and 12 learners, with the aim of fostering conceptual understanding in the subjects taught. Our organization provides online tutoring services, in-person winter classes, and career expos to create a supportive learning environment. We are committed to revolutionizing education and empowering learners from underprivileged communities. Through our innovative free WhatsApp tutoring platform, we have addressed accessibility barriers by bringing quality education directly to learners' fingertips.
Since our inception in 2019, we have reached over 1500 learners from various provinces across South Africa. "The next step towards infinity" is our powerful slogan, encapsulating our goal to propel learners forward on their mathematical path. We believe that mathematics is a continuous journey, and with each step, learners can reach new heights of knowledge and achievement. Our programs and support are designed to guide learners on this journey, ensuring they progress towards infinity, where possibilities are limitless!
See below our Media featuring:
News24 article - https://www.news24.com/parent/learn/their-hope-to-escape-poverty-is-througheducation-how-a-local-npo-tutored-over-240-students-using-whatsapp-20210805
Role Description
This is a remote volunteer position for a DataOps Engineer. The role involves managing and optimizing data workflows, ensuring seamless data integration, and supporting the analytical needs of the organization. Responsibilities include maintaining data pipelines, enhancing database performance, automating workflows, and supporting collaborative efforts to meet data management goals. This role plays a vital part in improving and maintaining operational efficiency, fostering scalability, and enabling insights to support the Dubs Maths mission.
As a DataOps Engineer in the Data Engineering division within the D&A Team, you are the "engine room" of the organization. While the Architect designs the blueprint and the Engineer builds the pipes, you focus on the automation and reliability of the entire lifecycle. Working closely with the Engineering Team, you ensure that our data environment is stable, automated, and scalable.
Duties and Responsibilities:
1.    Data Pipeline Engineering (ELT/ETL):
·       Design, build, and maintain scalable data pipelines to ingest data from various sources (e-learning platform, social media, and internal cloud apps) into our data lakehouse.
·       Implement and version control automated data transformation workflows to convert raw data into analytics-ready models.
·       Ensure data lineage is well-documented, mapping the flow from production platforms to the BI tools.
·       Tools: SQL, Python, dbt, Airflow/Prefect (Orchestration), Cloud Storage (AWS S3/GCS).
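To give applicants a concrete feel for this kind of work, here is a minimal, hypothetical Python sketch of one ingest-and-transform step. All field names and values are invented for illustration and do not come from our actual platforms:

```python
import json
from datetime import datetime, timezone

def transform_session(raw: dict) -> dict:
    """Convert a raw e-learning session record into an analytics-ready row."""
    return {
        "learner_id": raw["learnerId"],
        "subject": raw["subject"].strip().lower(),          # normalise free-text subject names
        "duration_min": round(raw["durationSeconds"] / 60, 1),
        "ingested_at": datetime.now(timezone.utc).isoformat(),  # lineage: when we loaded it
    }

# A raw record as it might arrive from a source platform (hypothetical shape)
raw_record = json.loads('{"learnerId": "L-042", "subject": " Mathematics ", "durationSeconds": 1830}')
row = transform_session(raw_record)
print(row["subject"], row["duration_min"])  # → mathematics 30.5
```

In practice, steps like this would live in version-controlled dbt models or orchestrated Python tasks rather than a standalone script.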
2.    Cloud Lakehouse Management:
·       Support the governance and maintenance of our modern data lakehouse architecture.
·       Optimize storage and compute performance to ensure efficient data processing and cost-management.
·       Help manage role-based access controls (RBAC), ensuring that only authorized volunteers and staff can access sensitive learner information.
·       Tools: Cloud Lakehouse, DuckDB
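As a flavour of the RBAC work described above, here is a tiny illustrative sketch of least-privilege access checks. The role names and actions are hypothetical, not our real permission model:

```python
# Hypothetical role map: each role gets only the grants it needs (least privilege)
ROLE_GRANTS = {
    "bi_analyst": {"read:marts"},                            # analysts see curated marts only
    "data_engineer": {"read:raw", "read:marts", "write:marts"},
}

def can(role: str, action: str) -> bool:
    """Return True only if the role was explicitly granted the action."""
    return action in ROLE_GRANTS.get(role, set())

print(can("bi_analyst", "read:raw"))  # → False (analysts never touch raw learner data)
```

Real deployments would express this through the lakehouse platform's IAM rather than application code, but the principle is the same.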
3.    Collaboration & Governance:
·       Work closely with Data Governance (DG) to implement data quality rules, naming conventions, and access controls.
·       Ensure all engineering practices adhere to the Protection of Personal Information Act (POPIA).
·       Support the Business Intelligence (BI) team's data requests.
·       Tools: SQL, dbt tests, GitHub, service desk tools (Jira service management)
4.    Automated Orchestration & CI/CD:
·       Build and maintain automated deployment pipelines (CI/CD) to move data code (SQL/dbt/Python) from development to production safely. Define and apply the naming conventions and standards for all tables within the cloud lakehouse.
·       Ensure that every update to our source platforms' data flows is tested and validated before going live.
·       Tools: GitHub Actions, Python, Docker, dbt.
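One small, hypothetical example of the kind of check a CI pipeline might run before deployment: validating that table names follow a layer-prefix naming convention. The convention shown here is invented for illustration:

```python
import re

# Hypothetical convention: models are prefixed by layer (staging / intermediate / mart)
NAMING_RULE = re.compile(r"^(stg|int|mart)_[a-z0-9_]+$")

def check_model_names(names: list[str]) -> list[str]:
    """Return the names that violate the naming convention (empty list = pass)."""
    return [n for n in names if not NAMING_RULE.match(n)]

models = ["stg_whatsapp_sessions", "mart_learner_progress", "TempTable1"]
print(check_model_names(models))  # → ['TempTable1']
```

A GitHub Actions workflow would run a script like this on every pull request and fail the build if any name is rejected.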
5.    Infrastructure Reliability & Governance:
·       Access Management & Privileges: Serve as the primary System Admin for the data stack, managing user roles, permissions, and security groups to ensure the principle of least privilege.
·       Scaling & Performance: Implement automated scaling and performance monitoring to handle peak traffic periods during learner exams.
·       Documentation & Audit: Maintain detailed access changelog and technical documentation for infrastructure changes, ensuring all admin actions are traceable and compliant.
·       Tools: Cloud Lakehouse, DuckDB, dbt, IAM (Identity & Access Management), service desk tools (Jira service management)
6.    Data Quality Operations & Observability:
·       Automate data quality tests directly into the pipelines.
·       Build observability dashboards to track "data freshness"—ensuring that the BI team always has the latest tutoring stats.
·       Tools: dbt tests, Great Expectations, SQL, Monitoring Dashboards.
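A freshness check of the kind mentioned above can be sketched in a few lines. This is an illustrative stand-in for what dbt source freshness or Great Expectations would do in production; the 24-hour SLA is a made-up value:

```python
from datetime import datetime, timedelta, timezone

def is_fresh(last_loaded: datetime, max_age_hours: int = 24) -> bool:
    """Freshness test: was the table loaded within the SLA window?"""
    return datetime.now(timezone.utc) - last_loaded <= timedelta(hours=max_age_hours)

# A table last loaded 30 hours ago fails a 24-hour freshness SLA
stale = datetime.now(timezone.utc) - timedelta(hours=30)
print(is_fresh(stale))  # → False
```

An observability dashboard would run checks like this on a schedule and alert the team when a source goes stale.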
7.    Technical Leadership & Mentorship
·       Act as the lead for the engineering sub-division, providing guidance and code reviews for the team.
·       Mentor junior team members on best practices in version control, automation, and system reliability.
Required Skills & Competencies:
•       Leadership & Mentorship: A desire to lead a small team, share knowledge, and ensure technical excellence across the division.
•       Security & Admin Mindset: High level of responsibility regarding access privileges and maintaining audit trails/documentation.
•       Automation Mindset: A passion for "deleting manual work" through Python scripting and CI/CD.
•       Technical Foundation: Strong proficiency in SQL and Python for managing system workflows.
Required level of commitment
·       3 to 4 hours per week
·       Be willing to commit to the role for 12 months.
Minimum Requirements Qualifications:
- Grade 12
- A completed qualification in Data Engineering, Information Systems, Computer Science, Statistics, or a related field, or data engineering certifications (e.g. from Udemy, LinkedIn Learning, DataCamp)
- Experience in enterprise data engineering or similar experience in data pipeline development, database management, and workflow automation
- Strong skills in programming languages and frameworks such as Python, SQL, or similar tools
- Proficiency in cloud computing platforms and data storage solutions
- Excellent problem-solving and analytical skills
- Ability to work collaboratively within a remote team environment
- Familiarity with data security practices is a plus
- Interest in education and learning-focused initiatives is highly valued
- Access to a personal computer/laptop and reliable internet connection.
- Passion for using data responsibly to drive educational and social impact.
Join Our Team:
By joining the Data Engineering division of the D&A Team, you will gain hands-on exposure to real-world data architecture, covering data modeling, enterprise strategy, and cloud integration: skills increasingly in demand in analytics, engineering, and leadership roles.
You will help ensure that data used to support thousands of learners across South Africa is ethical, secure, and trusted.
How to Apply:
Please submit your CV to volunteer@dubsmaths.org and include "DataOps Engineer Application" in the subject line. We thank all applicants for their interest in joining our team.
Application Deadline:
11 February 2026