Some UK universities are using "AI teachers" to deliver lessons

Tuesday, 25 November 2025 at 22:44
At Staffordshire University, some students say their programming course left them feeling like they weren’t learning anything at all. What they thought would be a fresh start in tech turned into a class largely built, and even narrated, by AI; in effect, they were being taught by AI teachers. Instead of feeling inspired, many walked away disappointed.
James and Owen were part of a 41-student apprenticeship program funded by the government. They joined hoping to move into cybersecurity or software engineering.
But by the end of the first term, James said he had lost confidence in the course. To him, the lessons felt cheap and rushed. “We’d get in trouble if we turned in AI work,” he said, “but we’re supposed to sit through lessons made by AI? How does that make sense?”

A University Standing By Its Decision on AI Teachers

Students questioned the materials again and again, but the university kept using them this year. Staffordshire even posted a document explaining that the content fit into a plan for teachers to “automate tasks” with AI.
What bothered many students was that the university’s official policy takes a tough stance against their own use of AI, warning that passing off AI-generated work as your own counts as academic misconduct. For someone like James, who is midway through his life and career, that double standard felt especially unfair. He said he couldn’t just drop everything and start over.

AI in Classrooms Is Becoming Normal

This isn’t happening only in Staffordshire. More universities are turning to AI to make course content, grade papers, and give feedback. A UK education policy paper from August said generative AI could reshape how schools work. A survey by Jisc also found that nearly one in four university lecturers already use AI in class.
But many students aren’t seeing the upside yet. In the U.S., some say their professors lean too heavily on AI tools. In the UK, students on Reddit claim lecturers copy ChatGPT comments or use AI-made images without checking them.

Red Flags From Day One

James and Owen said they spotted problems almost immediately. Their lecturer played slides with an AI copy of his own voice reading off the text. The writing bounced between American and British English.
Some slides referenced U.S. laws for no reason. Even this year’s material had odd glitches, like a video where the narration suddenly switched to a Spanish accent before switching back.
When The Guardian checked the course materials with two AI-detection tools, both suggested that parts of the assignments and presentations were likely AI-generated.

Students Speak Out, But the Help Comes Too Late

James raised concerns early on. Later in November, he spoke up again in class and asked the lecturer not to use those slides. “Everyone knows this is AI-made,” he said. “I don’t want to be taught by GPT.”
A student representative said they had already complained, but the university responded that teachers were free to use different tools.
Another student estimated that maybe 5% of the content was useful. The rest felt repetitive. “If the good stuff is something we can get from ChatGPT anyway,” he said, “why bother with the rest?”
One lecturer even admitted that a tutorial had been thrown together at the last minute using ChatGPT. The course leader later promised that the final class would be taught fully by human lecturers.
Staffordshire University told The Guardian that the course’s academic standards had not been compromised and said AI was only meant to support, not replace, teaching.
But for James and Owen, this reassurance came far too late. James said he felt like “a part of his life was taken from him.” Owen said the whole experience left him frustrated, knowing his time could’ve gone into something far more worthwhile.