New, game-changing AI tools have been released over the past year, and they are already making waves. Known as generative AI, these tools are being used to create content, including text, images and computer programs. The most obvious and well-known tool is ChatGPT, but there are plenty more. They can be very effective learning tools, designed to engage users in conversation, respond to commands and create text that looks as if it was generated by a human. But they are not without their problems, and they will potentially require L&D to change how learning is assessed.
The tools have been developed using large sets of online text. This enables them to predict the next word in a sentence, to reproduce different genres and styles and to switch from one language to another. These tools are impressive and are gaining traction very quickly, but there are limitations, one of them being social bias. Because the dataset comes from text that is readily available online, it isn’t neutral – it reflects the opinions of the people who create text online. The tools can reproduce and amplify any biases in their training data, to the extent that OpenAI, the company behind ChatGPT, has acknowledged the problem. Hallucination – when the tools produce content that is factually incorrect or irrelevant – is another limitation.
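The “predict the next word” mechanism can be made concrete with a short sketch. This is a minimal illustration, assuming the open-source Hugging Face transformers library and the small public GPT-2 model as a stand-in – ChatGPT itself is a proprietary service and is not what is being invoked here:

from transformers import pipeline

# Load a small, openly available text-generation model (GPT-2).
generator = pipeline("text-generation", model="gpt2")

# Given a prompt, the model repeatedly predicts a likely next token.
result = generator("Learning and development teams can use AI to",
                   max_new_tokens=15, num_return_sequences=1)
print(result[0]["generated_text"])

Because the model only ever predicts one plausible token at a time, fluent output is no guarantee of factual accuracy – which is one reason hallucinations occur.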
Some established tech providers have also started to incorporate generative AI into their offerings – Duolingo, for example – and Microsoft plans to include it across its tools.
According to Mike Sharples, a former Innovating Pedagogy author, generative AI will be used to support learning in several ways.
Because the tools generate text that looks so authentically human, L&D needs to consider how assessment is conducted in future. Assessment might take the form of real-world tasks, reflections on task completion or critiques of AI responses, for example.
Dr Mhairi Aitken is Ethics Fellow in the Public Policy Programme at The Alan Turing Institute and Honorary Senior Fellow at the Australian Centre for Health Engagement, Evidence and Values at the University of Wollongong. She was named in the 2023 list of “100 Brilliant Women in AI Ethics”.
Generative AI, particularly tools such as ChatGPT, has been integrated across all functions, including L&D. There has been much experimentation around how these tools can be applied, driven by the rapid pace of innovation. When ChatGPT came out in late 2022, there was a flurry of excitement, with organisations rushing to demonstrate that they were using these technologies. A lot of FOMO (fear of missing out) drove much of that experimentation, and as a result there was a great deal of unrealistic hype. We are now coming out of that hype: there is still excitement around the technology, but also more understanding of its limitations.
The tools enable the creation of more personalised and adaptive content, tailored to an individual’s learning preferences and abilities. They can be used to analyse learners’ progress, creating data around learner profiles, existing skills and preferred ways of learning. There are real benefits to that. However, there are also concerns about how these tools are being used. In some cases, using generative AI to create learning content is seen as a shortcut – a way of generating content quickly. But the tools can perpetuate biases and stereotypes, so organisations need to be very careful about checking content to ensure that it is appropriate and free of bias and stereotype. When used responsibly and properly, with time taken to review content and ensure it is appropriate, they aren’t a shortcut at all.
I think we will see AI used effectively for more mundane tasks – summarising text, for example. L&D and learners could use it to summarise information and to reformat and rewrite texts, but it’s really important that people check that content has been summarised accurately. We can’t rely on tools like ChatGPT for factual content. It’s really a stylistic tool – one that can support learners and L&D, but it can’t replace creativity or intellectual processes.
Another benefit is using the tools as virtual tutors or online, accessible support, giving students 24/7 help. But these tools should complement established learning methods, rather than replace them. And while the tools can widen access to learning, it’s important to think about digital divides so that they don’t exacerbate existing inequalities – disadvantaging those who don’t have a reliable internet connection or lack confidence with technology, for example.
Top tips:
These tools also make us rethink the role of learning professionals, and of learning in general. What is it that the tools can do? AI will replace some of the functions of learning professionals, so we need to think about how they can work hand in hand with AI to enhance learning.
Professor Agnes Kukulska-Hulme
Institute of Educational Technology, The Open University
Resources:
The Innovating Pedagogy report is an annual report co-authored by academics at the OU’s Institute of Educational Technology – this year together with researchers from the LIVE Learning Innovation Incubator at Vanderbilt University in the US.