Treading the tightrope: AI in education
How can generative AI be used in education? And how can you support your students to use it responsibly?
Created in partnership with
Since the rapid rise of ChatGPT, educators have increasingly grappled with difficult questions over the use of generative AI. How will this impact assessment? Does the use of AI reduce learning? And what does this mean for the future of education?
The Covid-19 pandemic drove a sharp increase in online assessment, and the arrival of ChatGPT soon afterwards created a perfect storm. The need to assure the integrity of qualifications has since led to a return to invigilated, in-person exams to combat the use of generative AI. While this move has been very effective, it comes at a cost.
But just as the invention of the electronic calculator didn't stop the teaching of arithmetic or put an end to maths tests, generative AI could benefit learning and assessment and enhance the student experience. UCL's president and provost, Michael Spence, has described how we are embracing the use of AI at UCL, from developing AI-proof assessment strategies to encouraging students to experiment with AI to debug code or improve their writing structure.
To help explore how we can walk the tightrope between incorporating and curbing the use of AI in education, the Sainsbury Wellcome Centre (SWC) at UCL has been running workshops that take the lid off the black box of generative AI. The aim is to empower both teachers and students to understand, and thereby take back control of, this technology.
What can generative AI be used for in education?
Following the inaugural workshop for teachers at SWC, attendees have been experimenting with using generative AI as a revision tool. By using specific prompts such as "Produce a 10-question assessment for the OCR A-level physics topic, 'Waves', with a mark scheme – it must have at least one six-mark question", teachers have been able to refine the results for use in the classroom.
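Prompts like the one above follow a fixed pattern, so they lend themselves to simple templating: swap in a different exam board, topic or question count and reuse the same structure. The short Python sketch below illustrates the idea; the function name and parameters are purely illustrative and not part of any UCL or SWC tool.

```python
def build_quiz_prompt(board: str, subject: str, topic: str,
                      n_questions: int = 10, long_mark: int = 6) -> str:
    """Assemble a revision-quiz prompt in the style quoted above."""
    return (
        f'Produce a {n_questions}-question assessment for the {board} '
        f'{subject} topic, "{topic}", with a mark scheme - it must '
        f'have at least one {long_mark}-mark question.'
    )

# Example: the workshop prompt, ready to paste into a chat model.
print(build_quiz_prompt("OCR A-level", "physics", "Waves"))
```

The resulting text would then be given to a generative AI chat tool, with the output reviewed by the teacher before classroom use.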
Interestingly, some teachers have also encouraged their students to use the technology in this way to create their own revision resources. However, they have seen some reluctance, because students might not always be able to spot a mistake in the output.
Similarly, at university level, some educators who attended the workshop have encouraged their students to use generative AI to practise possible questions ahead of PhD vivas and job interviews. But again, it is crucial to highlight the importance of fact-checking.
All UCL students are encouraged to take our course to ensure they understand how to use AI tools effectively, ethically and transparently. UCL's guidance emphasises the need for students to critically evaluate outputs, and to be mindful of the limitations and risks of generative AI.
Watching out for pitfalls
Because of the way large language models (LLMs), such as ChatGPT, work, they do not have to stick to the truth. They are trained on vast bodies of text to predict the next word in a sequence, with no built-in guarantee of accuracy. Sometimes this leads to "hallucinations", where false, but often seemingly plausible, information is presented as fact.
Generative AI can also sometimes produce text or images that are very close to an original source in its training data, which can potentially result in plagiarism or even copyright infringement. In addition to these practical challenges, there is also the wider problem of becoming over-reliant on the tools. It is important that students don't use them to take shortcuts that negatively impact their learning.
At UCL, our guidance helps staff and students determine how generative AI tools can be used in assessment, from banning the use of such tools (where failure to comply could result in an academic misconduct procedure) to creating assessments where AI is integral. AI-centred assessment tasks could involve asking students to critique AI responses to essay prompts, developing code, or prompting AI to act as a "Socratic opponent", as spotlighted during the 2023 UCL Education Conference.
Testing understanding in the era of generative AI
Although generative AI can be extremely useful for producing comprehensive summaries that help gather insights and nurture curiosity in a new field of study, it is crucial that students don't rely too heavily on it. After all, the primary purpose of using ChatGPT and other LLMs should be to aid understanding, not to serve solely as a shortcut to complete assignments.
But how can we test understanding in the era of generative AI?
In summer 2023, UCL ran an AI Co-Creators project, which enabled staff and students to work collaboratively on the possibility of building an AI assistant that could generate educational questions from lecture transcripts. These questions could then be used by lecturers to create quick quizzes to review comprehension, and by students for self-assessment.
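Lecture transcripts are usually far longer than a single model prompt can comfortably handle, so an assistant of this kind would plausibly split the transcript into chunks and request questions for each. The sketch below shows only that preprocessing step; the helper names are hypothetical and assume nothing about the actual Co-Creators implementation.

```python
def chunk_transcript(transcript: str, max_words: int = 400) -> list[str]:
    """Split a lecture transcript into word-limited chunks."""
    words = transcript.split()
    return [" ".join(words[i:i + max_words])
            for i in range(0, len(words), max_words)]

def question_prompts(transcript: str) -> list[str]:
    """Turn each chunk into a quiz-generation prompt for a chat model."""
    return [
        "Write two short comprehension questions, with answers, "
        "based only on this lecture excerpt:\n\n" + chunk
        for chunk in chunk_transcript(transcript)
    ]
```

Each prompt would then be sent to a chat model, and the returned questions reviewed by the lecturer before use: exactly the fact-checking step stressed above.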
An innovative boot camp for neuroscience MSc students at UCL, which openly allowed the use of generative AI and other tools, trialled asking students to record a short video each day that focussed on their project progress and how they overcame challenges.
This shift to concentrating on how students solved problems allowed educators to evaluate understanding.
While it will be important to validate this more freeform approach and take account of biases, it is an interesting example of how assessments can be made more focused on critical thinking skills. The need for such analytical skills and information literacy is evident today: information is freely available online, but it is increasingly vital that we all critically analyse what we consume to avoid being taken in by "fake news".
Future of generative AI in education
With the continuous advances in generative AI, it is difficult to predict what the future holds for education. One hope from scientists at SWC is that generative AI could be used to customise learning. In the same way that personal tutors tailor tuition based on what their student does and doesn't understand, generative AI could potentially democratise access to personalised learning by finding and defining gaps in knowledge and helping fill them.
With knowledge no longer a bottleneck, traditional education could become extremely human, focusing on the critical thinking skills and unique insights that give rise to new understanding. These skills appear harder for generative AI to tackle, for now at least…