Government Pilot for AI Tutoring: Our Response

We welcome the government's focus on widening access to high-quality tutoring. It reflects what we who work directly with young people have long understood: disadvantage, not ability, remains the biggest barrier to attainment for too many children.

The evidence is clear. One-to-one support can be transformational. When delivered well, it doesn't just raise attainment; it rebuilds confidence, restores trust, and accelerates learning.

At Equal Education, we’ve spent more than a decade supporting pupils whose education has been disrupted by care experience, unmet SEND needs, exclusion, or instability. Used well, AI has the potential to strengthen this work at scale, helping identify gaps earlier, offering targeted practice, and freeing up teachers’ and tutors’ time to focus on what matters most. 

That opportunity is genuinely exciting, especially if we get the foundations right.

AI is a powerful tool, and like any powerful classroom technology before it, from iPads onwards, its impact depends on how it's designed and deployed. With the right safeguards in place, it can support scale, consistency, and insight. With the right oversight, it can enhance rather than dilute the quality of support for learners who need it most. As an engineer by background, I'm excited by what AI can do, but I'm also mindful of the risks. When it comes to children, particularly children facing disadvantage, careful, responsible rollout isn't optional.

Crucially, innovation cannot stand alone. Investment in new tools must go hand-in-hand with sustained investment in teachers, tutors, and the relationships that help young people feel safe, supported, and ready to learn. For many pupils, progress depends as much on trust, emotional safety, consistency, and skilled professional judgement as it does on academic input. 

Evidence shows that children's behaviour is strongly shaped by their environment, and that outcomes improve when positive choices are made easier to make. In the same way that healthier food environments lead to better habits, digital learning environments must be intentionally designed to reduce harm and support safe, focused, meaningful learning.

That's why it's encouraging to see plans for teacher-led co-creation of AI tutoring tools, with the government working alongside teachers, AI labs, and technology companies to robustly test these tools and to treat safety as non-negotiable. Like any powerful technology, AI carries both promise and risk. The strongest systems will be built where technologists work alongside those who understand classrooms, vulnerability, and complexity.

As an organisation with deep experience supporting under-resourced young people, we welcome the opportunity to engage constructively with this programme. We're keen to share what we've learned about what works, where technology adds value, and how innovation can genuinely narrow gaps rather than unintentionally widen them.

Paul Singh, CEO and Founder

