

What’s next for AI in higher education?

From assessment to ethics and job security, a new Jisc report highlights AI’s challenges and successes and provides insight into upcoming developments

Michael Webb
Jisc
4 Aug 2022


It’s clear that artificial intelligence is already adding value in the higher education sector. However, adopting AI can be daunting for institutions lacking the time, expertise and resources to explore its many uses.

So, while today’s teachers may well be marking certain subjects with AI and learners may be using it to find out when their next assignment is due, how will AI help them in the future? And what issues do we need to start considering now?

AI will never replace teaching staff

The first thing to emphasise is that AI will never replace teaching staff – nor would we want it to. But it will allow them to spend more time connecting with their students, whether that’s in person or virtually. With a reduction in educators’ workload, their time can be freed up to concentrate on higher-impact tasks, leading to better outcomes and a more fulfilling, effective experience for learners. 

AI can provide personalised experiences

We see personalisation at work in our daily lives, and students are already benefiting from it in areas such as adaptive learning tools. The challenge for the next few years is to work out exactly how to continue harnessing this capability to provide the most benefit.

Enrolment and induction for students can often be manual and complicated and can take a long time to complete successfully. Digital assistants or chatbots can support students and staff alike. Jisc's "deep dive" sessions with organisations across the sector have highlighted the opportunity to extend the use of AI to support the student from application all the way through to graduation.

Having a smooth and clear journey will help support students and prevent early dropouts. 
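To make the idea concrete, here is a minimal, rule-based sketch of the kind of enrolment and induction chatbot described above. It is illustrative only: the intents, keywords and answers are hypothetical, and production digital assistants would normally sit on top of a full natural-language or intent-recognition service rather than simple keyword matching.

```python
# Minimal sketch of a rule-based enrolment chatbot (illustrative only).
# All intents, keywords and answers here are hypothetical examples.

INTENTS = {
    "enrolment": (["enrol", "register", "sign up"],
                  "You can complete enrolment online via the student portal."),
    "induction": (["induction", "welcome week", "orientation"],
                  "Induction sessions run during the first week of term."),
    "deadline": (["deadline", "due", "assignment"],
                 "Your next assignment deadlines are listed in the VLE calendar."),
}

FALLBACK = "Sorry, I don't know that yet - I'll pass your question to a member of staff."


def answer(question: str) -> str:
    """Return a canned answer for the first intent whose keywords match."""
    text = question.lower()
    for keywords, reply in INTENTS.values():
        if any(keyword in text for keyword in keywords):
            return reply
    return FALLBACK


if __name__ == "__main__":
    print(answer("How do I enrol for my course?"))
    print(answer("When is my essay due?"))
```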

AI will make learning more inclusive and accessible

AI will also make learning more inclusive and accessible for those with language barriers or disabilities. We will increasingly be able to use AI to provide access to learning in a format that is personalised and formatted according to the needs of the learner – for example, the same content could be provided as audio, video or text, all communicated at the right level. 
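As a rough illustration of format-level personalisation, the sketch below routes the same piece of content to different renderings based on a learner profile. The converter functions are hypothetical placeholders standing in for real text-to-speech, captioning or summarisation services.

```python
# Sketch of routing one piece of content into the format a learner needs.
# The converters are hypothetical placeholders, not real services.
from dataclasses import dataclass


@dataclass
class LearnerProfile:
    preferred_format: str   # "text", "audio" or "easy-read"
    reading_level: str      # e.g. "introductory", "advanced"


def to_audio(text: str) -> str:
    return f"[audio narration of: {text[:40]}...]"      # placeholder for a TTS call


def to_easy_read(text: str) -> str:
    return f"[simplified summary of: {text[:40]}...]"   # placeholder for a summariser


def personalise(content: str, profile: LearnerProfile) -> str:
    """Return the same content in the format this learner needs."""
    if profile.preferred_format == "audio":
        return to_audio(content)
    if profile.preferred_format == "easy-read":
        return to_easy_read(content)
    return content  # default: the original text


lecture_notes = "Photosynthesis converts light energy into chemical energy..."
print(personalise(lecture_notes, LearnerProfile("audio", "introductory")))
```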

AI is also allowing the creation of tools such as virtual assistants that provide additional guidance and support to students with specific needs. 

AI assistants that provide support to students at any time of the day show great promise, as does using AI to tailor students’ learning journey – although we need to make sure this kind of development involves all stakeholders and isn’t driven by technology.

There are questions to ask, though: for example, how we balance personalising learning materials with maintaining or enhancing learner agency.

Using data to drive the right outcomes

Advances in AI, along with an increase in available data, provide new opportunities in areas such as learning analytics solutions, which provide indications of students at risk of failure or dropping out. Currently, acting on outputs of predictive models is a challenge, because these have very much been “black box” technologies – we can see that a conclusion has been reached by an AI model but not why. We are now seeing advances in AI explainability, which means much greater insight into the factors affecting any particular situation.  
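As a simplified illustration of the explainability point, the sketch below trains a dropout-risk classifier on synthetic data and uses permutation importance, one common model-agnostic explainability technique (not necessarily the one any given learning analytics product uses), to surface which factors most influence its predictions. The feature names and data are invented for the example.

```python
# Minimal sketch: a dropout-risk classifier plus one common explainability
# technique (permutation importance). Data and feature names are synthetic;
# real learning analytics services use richer data and other explainers.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
features = ["vle_logins_per_week", "attendance_rate", "avg_assignment_score"]
X = np.column_stack([
    rng.poisson(5, n),          # VLE logins per week
    rng.uniform(0.3, 1.0, n),   # attendance rate
    rng.uniform(30, 90, n),     # average assignment score
])
# Synthetic "at risk" label: low engagement or low scores raise the risk.
y = ((X[:, 0] < 3) | (X[:, 1] < 0.5) | (X[:, 2] < 45)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)

# Which factors drive the model's predictions overall?
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
for name, importance in sorted(zip(features, result.importances_mean),
                               key=lambda pair: -pair[1]):
    print(f"{name}: {importance:.3f}")
```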

The value of these insights means that improving data and analytics will continue to be top of the list for universities.

Advice on ethical use of AI

Tools and techniques are emerging that provide particular ethical challenges. For example, there are several AI techniques that aim to understand human emotion, and organisations may want to start thinking and talking about what place (if any) these kinds of tools have in higher education.   

For example, AI systems can be trained to detect emotion in human expressions and speech, and it is likely that these kinds of tools will be built into videoconferencing solutions in the future. AI could be used to identify the sentiment and emotional dimension of student communications from survey results, social media channels, forums and student group work. These data could then be used to ascertain performance and identify areas for improvement – a task that is often undertaken manually at the moment.
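In its simplest form, automated sentiment scoring of student comments might look something like the sketch below, which runs NLTK's off-the-shelf VADER analyser over invented forum comments. Dedicated emotion-recognition systems go considerably further than this, which is precisely where the ethical questions arise.

```python
# Sketch of automated sentiment scoring of (hypothetical) student comments,
# using NLTK's VADER analyser. Real emotion-detection systems go much further.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-off lexicon download
analyser = SentimentIntensityAnalyzer()

comments = [
    "The group project guidance was really clear, thanks!",
    "I still have no idea how this module is assessed and it's stressful.",
    "Lectures are fine but the feedback on assignments arrives far too late.",
]

for comment in comments:
    score = analyser.polarity_scores(comment)["compound"]  # -1 (negative) to +1 (positive)
    print(f"{score:+.2f}  {comment}")
```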

Should we do this, though?  

Jisc provides tools to help universities start to consider these sorts of questions.

Be prepared

We need to begin preparing our systems and data now to support integration and allow development of more of these kinds of systems in the future. Jisc is uniquely placed to work together with the higher education sector to enable institutions to plan how they will use AI efficiently, effectively and ethically. 

To find out more, take a look at Jisc's latest report on AI in tertiary education.

Michael Webb is director of technology and analytics at Jisc.

This is an edited version of a blog post originally published on the Jisc website in June 2022.

