Why the use of AI in further education is a revolution we must shape, not surrender to

13 February 2025

Anthony Bravo, Principal and Chief Executive of Basingstoke College of Technology (BCoT)

The government is pushing for a digital revolution in education—and rightly so. Every major leap forward, from the steam engine of the industrial revolution onwards, has transformed the way we work, live, and learn. AI is no different.

This technology will have a massive effect on our colleges, but only if addressed and embraced in the right way. We cannot afford to be passive passengers on this journey. We must steer AI’s development in FE, ensuring it serves learners and staff—not just the interests of big tech companies.

Because let’s be honest: What are the tech companies’ real incentives? Are they focused on improving learning outcomes? Or are they more interested in capturing students’ attention and creating intimate connections with them?

As Charlie Munger said, "Show me the incentive, and I’ll show you the outcome." Right now, we as FE leaders must align our voices and demand transparency, accountability, and AI that truly enhances education.

The government recognises AI’s potential to reduce workload and improve teacher retention—and they are right. Teachers are our most valuable resource, and we must protect their wellbeing.

At Basingstoke College of Technology (BCoT), we’ve already seen AI:
- Reduce admin burden—saving staff an average of five hours or more per week in lesson preparation and marking.
- Enhance lesson planning—AI helps generate ideas, enhance resources, and adapt teaching to different learning needs.
- Improve recruitment—AI has potential in assessing teaching ability for new hires, ensuring the right people enter the profession.

The dream is that AI could significantly impact teacher workload, allowing them to spend more time focusing on the art of teaching.

That’s the vision we should all be working towards. But we must remember: AI should never replace the teacher, AI-generated content must always be fact-checked, and AI must be a tool that supports staff, not one that adds complexity to their jobs.

AI also has the potential to enhance the student experience, not just engagement. At BCoT, it is already improving the student journey from the moment learners apply:
- AI-powered student tools are provided on courses, giving learners safe sandboxes in which to learn alongside the machine.
- AI can break down concepts in ways tailored to students’ career aspirations and interests.
- AI-powered translation, voice-to-text, and accessibility tools support English for speakers of other languages (ESOL) students and those with special educational needs and disabilities (SEND), making learning more inclusive.

But here is the problem: Tech companies design AI to maximise engagement, not learning. And there is a big difference between genuine learning and simply keeping students hooked on a platform.

At BCoT, we see a lot of tech as being like junk food: easily available, highly appealing, but not always good for you. We must build in friction, teaching students to use tech, including AI, intentionally, not mindlessly. It is worth remembering that AI’s impact is like second-hand smoke: its unintended consequences affect everyone.

That is why we take digital wellbeing seriously. Every student at BCoT must complete an ‘AI Driver’s Licence’ before using AI tools, learning how to spot deepfakes and misinformation, how to use AI ethically and how to think critically about AI-generated content.

AI must empower students, not manipulate them. If we let tech giants dictate AI’s role in education, we risk creating a system designed to maximise screen time—not knowledge.

We must try to hold tech companies accountable. Right now, it feels like a Wild West, with AI advancing faster than our ability to regulate or even understand it.

We take a three-step approach before adopting any new AI tool:

  • Does it actually improve learning? There must be a real, measurable improvement in learning outcomes.
  • Does it meet our IT security requirements? We will not compromise on security.
  • Does it comply with GDPR? Is the data protected and used in a reasonable way that supports the first criterion?

We have standards for medicine and for electrical products, so why do we not have them for AI in education?

The DfE Connect platform is a good step forward—providing a central hub for AI resources. But we need more than just a library of information. We need:

  • A national AI framework for FE so we don’t all waste time reinventing the wheel.
  • Sector-wide agreements with tech companies, ensuring data transparency and ethical AI use.
  • A focus on AI safety, regulating AI in the same way we regulate other powerful technologies.

If we don’t set these terms now, AI in education could be shaped by commercial incentives, not educational ones.

AI is advancing at an unprecedented speed: faster than the industrial revolution, faster than the transition from steam to flight. I cannot imagine where we will be in another 10 years’ time. This is not just another wave of digital change; it is a fundamental shift in how we think about knowledge, teaching, and the role of educators.

We have an enormous responsibility to get this right.

The FE sector has the agility, the expertise, and the real-world focus to lead this transformation. But we cannot do it alone. The government must provide clear guidance and support, ensuring AI enhances, not undermines, further education. FE leaders must work together, applying aligned pressure on tech companies to demand AI that serves education, not engagement. And we must champion ethical AI use, ensuring students and staff develop critical AI literacy, not blind dependence.

AI should empower, not exploit. AI should free up teachers, not replace them. AI should enhance learning, not just capture attention.

If we get this right, AI could be the biggest enabler of inclusion, efficiency, and educational excellence we have ever seen. If we get it wrong, we risk becoming the tools of the tools we create.

Let’s make sure AI serves education—not the other way around.

What do you think? How do we ensure AI benefits all educators and learners—without compromising our values?