AI: The Powerful Ally Needing to be Tamed

I am fascinated by AI. It is clearly a technology that will transform almost everything we do in life and at work. I reached a personal inflection point this week, with ChatGPT replacing Google as my go-to source of information. Need a bit of information? I no longer Google it; I ask ChatGPT. Almost every week I find a new use or function in an AI technology that fills my head with possibilities. However, it often feels like a word on the tip of the tongue: the idea is there, but I can't quite access it. I am almost certain I am not alone in this feeling.
It is in this context that the Association of Employment and Learning Providers hosted our annual AI & Digital Summit. Sponsored by Bud, this was a gathering of FE professionals eager to close the gap between the idea and the reality when it comes to AI. The overwhelming sense coming out of the summit was that in many cases providers are well on their way to integrating AI into their practice. This shouldn’t be a surprise; our sector is full of innovation and exploring new ways to focus on the learner.
I have lost count of the number of reports citing productivity gains from using AI. To name just two, Workday reported that uptake of AI could unlock £119 billion in productivity across the UK economy, while the Productivity Institute reports a potential 1-2% gain in productivity. Whatever the report, the answer always points in the same direction: more AI = more productive businesses.
Meanwhile, the further education sector feels pressure on several fronts: workforce pressures, limited funding and overbearing regulatory oversight, to name just three. Could AI be the key to allowing us to focus more on the learner and less on paperwork?
If you think creatively, there are uses for AI technology all the way through the learner journey. When recruiting new learners or new staff, why not use AI technologies to help create marketing materials and make sense of marketing analytics? Inductions could be tailored, and the time burden on senior staff reduced, through video creation technology. Training could be significantly enhanced by using AI tutors, something I know is already happening, and something I use myself when learning new concepts. Progress monitoring can be made more consistent and much more individualised. Completion and assessment can in large part be automated (ethics to follow!), and finally careers advice can again be unique to the individual learner rather than limited to the knowledge of a single adviser.
This is just a snapshot. Sounds great right? There are however two sides to AI, and the ethics and data protection discussion is never far away. Most obviously the ethical questions behind automating assessments are unavoidable.
AI is trained on data, and the biggest large language models are effectively trained on the data available online. Would you consider the internet an area free from bias and a home to accurate information? Didn't think so. This is why it is critically important, when implementing AI into any business or system, to keep a human in the loop. "Human in the loop" was the most repeated phrase at our summit. To take another example from the list above, you wouldn't let AI create marketing materials and distribute them without first having a highly qualified and competent professional check the material for quality and values. Humans need to interrupt and intervene where necessary.
Ultimately AI tools are there to support, not replace, humans. Common sense and decency need to be applied external to the technology.
Based on this I want to share my three practical takeaways from our AI & Digital summit for those who couldn’t make it, or for those who want to explore AI.
- Start small. AI is such a jungle of programmes, jargon and possibilities that it can be easy to get lost in the thicket.
- AI is not the solution for everything. Sometimes simpler digital tools will do; sometimes nothing beats the human touch. Before implementing AI, make sure it is the right tool for your process.
- Introduce internally first. Don’t use learners as AI guinea pigs, and be aware of the risks of bias and data protection.
The main theme of the summit was delivered at the beginning by our host Richard Parker from Firebrand Training, when he asked whether AI is a powerful ally or a beast needing to be tamed. How about a powerful ally needing to be tamed?
N.B. This article was NOT written using AI!
By Ciaran Roche, Public Affairs Manager at AELP