
AI in Further Education: A Critical Look at the Risks to Teacher Autonomy

Richard Foster-Fletcher

If FE were a grand orchestra—a diverse group of musicians with different skills, instruments, and styles—its educators would be the conductors, carefully harmonising each musician’s contribution to create something greater than the sum of its parts. AI, in this analogy, is the promise of a new, advanced conducting tool. It claims to keep everyone perfectly in time, ensure every note is flawless, and allow each musician to shine without human error.

But before placing all our faith in this tool, we must ask: will this automated conductor understand the unique timing, the nuanced emotion, or the subtle improvisations that each piece requires? Will it enhance the orchestra’s performance, or will it simply drown out the individuality of each musician, leading to a technically flawless but emotionally hollow piece?

This analogy frames the critical debate around AI in FE: can it truly support the complex, human-centred task of education in ways that celebrate diversity, adaptability, and personal connection? Or does it risk removing the nuance, creativity, and individuality that bring education to life? These are not abstract concerns—they are pivotal questions that demand urgent attention as we consider AI’s integration into the sector.

1. Skill Disparity in AI Usage: Will AI Bridge Gaps or Deepen Divides?

The integration of AI into FE is predicated on educators’ ability to use it effectively. But in a sector characterised by diversity—of teacher backgrounds, digital literacy, and professional experience—this ability is far from uniform. Prompt engineering, the art of crafting precise instructions to yield meaningful AI outputs, requires not just technical skill but pedagogical intuition and creativity.

Some educators will quickly adapt, using AI to enhance their teaching and provide personalised, engaging experiences. Others may struggle, particularly those with limited access to training or confidence in new technologies. This disparity could create a two-tier workforce: one group using AI to augment their practice in dynamic ways, and another reliant on superficial, unrefined outputs that fail to meet learners’ needs.

The promise of AI is its ability to democratise access to high-quality resources, yet this promise will only be fulfilled if all educators are equipped to harness it. Without equitable professional development, AI risks amplifying inequalities, privileging those who are already digitally fluent and further marginalising those who are not. Moreover, if AI tools are designed to be “easy” to use, does this simplification encourage shallow interactions with teaching content? Are we at risk of standardising education, reducing it to a transactional process of input and output?

AI could elevate teaching across FE, but only if its use is supported with the necessary training and resources. Will it fulfil its promise of inclusivity, or deepen the divides within an already stretched sector?

2. From Educator to AI Curator: The Risk of Diminishing Professional Expertise

FE educators do not merely deliver content; they design and adapt it. They draw on industry expertise, creativity, and a deep understanding of their students to shape meaningful learning experiences. AI tools that promise ready-made lesson plans, assessments, and activities threaten to shift this dynamic, turning educators from creators into curators.

If AI takes over the creative aspects of teaching, what happens to the distinctiveness of the educator’s role? A machine-generated lesson plan may be efficient, but it lacks the depth, personalisation, and contextual relevance that a teacher brings. Over time, educators who rely on AI may find their professional identity eroding, their expertise overshadowed by pre-packaged content. This risks de-professionalising teaching, reducing it to the facilitation of algorithmic outputs.

The implications for learners are equally significant. FE students often thrive on the relational and contextual aspects of teaching—moments when a teacher connects theory to practice with a personal story or adapts a concept on the fly to suit the group. Can AI, however sophisticated, replicate this nuance? Or do we risk replacing a dynamic, human-centred approach with something efficient but fundamentally hollow?

AI can reduce workload, but at what cost? How do we ensure educators remain at the centre of pedagogical design, preserving the creativity and personalisation that define effective teaching?

3. De-skilling: The Hidden Cost of Dependence

FE teachers are valued for their adaptability, problem-solving, and ability to meet diverse learners where they are. AI promises to take on routine tasks, freeing educators to focus more on student relationships and engagement. But this convenience brings a hidden cost: the potential erosion of key teaching skills.

As AI generates lesson content, adapts material, and automates feedback, educators may begin to rely on these tools to such an extent that their own capabilities diminish. Over time, the instinctive skills that underpin effective teaching—curriculum design, differentiation, and on-the-spot adaptability—may atrophy. This doesn’t only affect individual educators; it impacts the entire sector’s ability to respond to future challenges.

What happens when technology fails, or when nuanced, context-specific decisions are required that AI cannot address? Will educators, having grown accustomed to automation, still have the confidence and capacity to innovate independently? And more fundamentally, does dependence on AI risk turning a profession defined by creativity and adaptability into one of routine execution?

AI promises efficiency, but how do we prevent it from hollowing out the core competencies that make educators effective? Can the sector maintain resilience and adaptability in the face of increasing automation?

Broader Reflections: What Kind of Conductor Do We Want AI to Be?

AI in FE is not just a technological issue—it’s a question of values. Its integration must be guided by a clear vision of what education should achieve and the role of teachers within it. Are we using AI to empower educators and learners, or are we inadvertently redesigning the sector in ways that strip away its humanity and depth?

Institutional policies, professional development, and the culture within FE will shape how AI is used. Leaders must ask: How can we implement AI in ways that support teachers as creative, adaptive professionals? What kind of training is needed to ensure all educators can use AI effectively, and what safeguards must be in place to preserve the unique strengths of human teaching?

Equally important is the learner’s experience. FE students often come to the sector because they need a different kind of education—one that is personal, flexible, and engaging. If AI makes teaching less human and less connected, does it risk alienating the very students it is meant to serve?

At this critical juncture, we need to ask not just how AI can be used in FE, but whether its integration aligns with the values and aspirations of the sector. Only through rigorous reflection and debate can we shape an approach that enhances education—preserving its humanity while embracing the potential of technology to support, rather than replace, the role of the educator.

By Richard Foster-Fletcher, Executive Chair of MKAI
