AI in higher education: dystopia, utopia or something in between?

To understand how HE can incorporate AI successfully, we need to think about how humans will interact with the technology and change their behaviour, says Ben Swift

Ben Swift
13 Oct 2022

Created in partnership with

Australian National University


AI applications are already part of the higher education experience for students, instructors and administrators. Some of them are chatbots and intelligent tutoring systems, some are auto-grading and feedback apps, and some are used in academic integrity breach detection and exam proctoring.

We’re also on the crest of a wave of new AI tools for text/image synthesis, where you give the AI a prompt such as “What role did Sir John Kerr play in the 1975 Australian constitutional crisis?” or “Draw a picture of a red unicorn playing a Fender Stratocaster” and it will spit out a “response” which, while not always perfect, could in most cases pass for something hacked together by a harried student in the few hours before an assessment deadline.

I’m writing as someone with 10 years’ experience as a lecturer and course convener. I’ve taught both large (400-student) compulsory courses and 10-student courses. I’ve also built software tools for automating some aspects of these courses.

However, as an AI researcher, I also build AI-powered tools, and I can certainly see the convergence between the “AI research and tool-building” part of my job and the teaching part.

To understand the way in which AI will transform higher education, it’s useful to consider the interactions between human and AI parts of the system, rather than focusing on individual AI tools in isolation. For example, will the AI essay generators stay ahead of the AI plagiarism-detection bots? Will the AI tutoring apps lighten the workload of our teaching assistants, or will the workload just shift to helping the students use the AI tutoring apps?

To understand what is happening with the introduction of AI into the higher education experience, it’s crucial to realise that so much of the student and instructor experience in a course is about flows of information. For example, an instructor creates an assignment spec, which is sent to the student. In response, the student (synthesising many sources of information, from both the course curriculum and elsewhere) produces an assignment artefact (such as an essay). This artefact is graded by an instructor, and both a numerical mark and qualitative feedback are sent back to the student: another information flow, which will inform the student’s work in subsequent assignments.

Don’t get me wrong, I’m not saying that this is all there is to participating in a university course; crucially, the human community aspect is missing from the above description, for a start. However, thinking about the above information flow gives us a helpful perspective for considering where AI might amplify or dampen the different information flows within the system, or where it may give rise to new ones.

There are three potential “system dynamics” I’m on the lookout for as AI becomes more deeply integrated into higher education.

First, while it’s less clear whether the aforementioned AI text and image synthesis tools will make the best student work even better, it’s pretty clear that they will allow students who only care about passing, without actually attaining the course learning outcomes, to do so with much less effort. The implication for instructors is that if you’re grading a text/image artefact, it’s now much harder to tell whether the artefact is the work “only” of the student or whether they had the help of an AI tool to create it. In other words, if the question of whether AI was involved in the creation of an artefact really matters, it will be increasingly hard to give a definitive answer, especially without specialised expertise and under the time pressures that instructors face to complete grading.

Second, there are going to be feedback loops involved. For example, a big selling point of AI chatbot products is that you can teach larger classes (or create new classes) than you would otherwise have the instructors to support. AI text summarisation tools could also help with grading/triaging, especially given the limits on budgets for teaching assistant hours. One potential endgame for this dynamic is that class sizes could grow until student demand is satisfied.

The risk here is that such a class would become incredibly reliant on those AI tools to handle its teaching workload without burning out all the humans involved in the process. And humans will still be involved, since (almost) nobody is proposing that we have purely automated classes in higher education.

Third, human-AI co-creation isn’t going anywhere, so make it part of your assessments. Get students to design new front ends and workflows that other students can try out. How about an essay-writing assignment where the students are encouraged to write the topic sentences for each paragraph and use AI to complete the rest? The students could then critically reflect (and be assessed) on their process of iteratively poking the AI (via the topic-sentence prompts) to ensure a coherent overall argument for the essay. Alternatively, using the “reverse assignment” approach, the instructor could enlist the help of AI to write an assignment spec and have the students come up with a rubric and suggested improvements to that assignment spec as their deliverable.

Finally, I do wonder whether (and hope that) some of these AI tools might make contract cheating less profitable as a business, because the humans who provide those services will be automated away as well. Admittedly, though, the cheating-industrial complex is well positioned to take advantage of the AI-enabled future of higher education, as those involved probably have the best databases of instructor- and student-created content on the planet.

The main takeaway here is that AI tools in higher education won’t operate in isolation; they’ll become part of the system, where students can churn out passable essays faster, but instructors can also grade them faster. It’s unclear which “side” of this transaction will win out, or which balancing mechanisms (natural or regulatory) will be required in response, so it’s important to design your class so that such “AI content arms races” are less likely.

There’s a dystopian “future of AI in education” scenario in which AI-generated assignments are graded by AI grading and feedback bots, with dull-eyed human teachers and students who are largely disconnected and disenfranchised. But I’m not in this dystopian camp. I am, however, keeping an eye out for how human students and instructors change their behaviour in response to the changes in the information ecosystem in which we exist.

Ben Swift is educational experiences lead and associate director (education) at the ANU School of Cybernetics. The ANU School of Cybernetics is activating cybernetics as an important tool for navigating major societal transformations through capability building, policy development and safe, sustainable and responsible approaches to new systems.

