
If you can’t beat GPT3, join it

We need to prepare students to thrive in a world where they use AI but are not dependent on it, says Mike Groves
December 16, 2022

With the continuing development and accessibility of GPT3, the AI language engine, there has been understandable concern that it will undermine the process of essay writing and student learning. This reflects a debate that has been happening for several years in the field of English for academic purposes (EAP), which seeks to prepare and support students whose first language is not English in anglophone universities.

Online translation tools, such as Google Translate, have become spectacularly proficient in recent years at producing grammatically accurate output. Using them can allow students to appear to demonstrate a proficiency in the language that they cannot produce unassisted. This is especially relevant in areas where EAP plays a gatekeeping role, since students are able to misrepresent their own understanding and competence. But the situation is not as bleak as some might fear, and some of the lessons that the EAP community has learned are directly relevant to GPT3.


First of all, it is important for all involved to recognise the limits of the systems being used. Online translation has become impressive at writing at the level of the sentence, with consistent grammatical accuracy. However, for an academic writer, this is far from sufficient. If the text fed into a translation engine follows the rhetorical conventions of an overseas intellectual tradition – being heavy with simile, for example, or full of confidence markers – those features are retained in translation. As a result, hallmarks of anglosphere academic writing such as caution and impersonal language may be missing. Nor do translation engines help with organisation or the construction of an argument. In effect, they do only part of the job.

When I played around with the GPT3 interface, I asked it to write an essay on avoiding plagiarism. The essay seemed informed, but it also missed some of the key features of academic writing. There were no citations or attributions. There was description but no analysis. Where you might expect a student to compare ideas, make connections or introduce drawbacks, the AI engine simply gave more description. It bounced along at the bottom of Bloom’s taxonomy, providing just-passable writing but without anything that could be defined as demonstrative of disciplined critical or analytical thought.

In EAP, we are now talking about “machine-translation literacies”. This involves an honest and open conversation with students about how the technology can help them, and where it cannot. It becomes one of a number of IT literacies that students develop in their time at university – and it would be fair to argue that AI literacies are also ripe for exploration.

Assessment is also affected. To me, it seems anachronistic to prepare students for an academic world where online translation does not exist. If we are preparing them to write essays and reports that can be supported by online translation, we should allow them to develop these competencies as part of the assessment process.

However, we should also be aware that there are frequent times at university when students need to produce language unassisted – from written exams to seminars to the lunch queue. Therefore, EAP assessment needs to make the distinction between supported and unsupported use. We need to assess students in contexts both where they can be assisted by the technology (such as coursework essays and presentations) and, crucially, where they cannot.

Similarly, we need to prepare students to thrive in a world where this AI exists, but we also need to ensure that they do not become dependent on it. Therefore, adaptations to assessment that recognise when they are and are not supported by AI are needed. Coursework essays can still have value, but they need to be supplemented by forms of assessment that cannot be enhanced by AI, such as assessed seminar discussions, critiques and reflections.

The final lesson that EAP has learned from this technology is that it is very hard to legislate against it. There is little consistent policy, and this has led to a patchwork of inconsistent regulations and advice that may differ from department to department or even within departments themselves. Moreover, whatever the regulations may be, how do we police them?

More to the point, should we regulate translation engine use? Research that colleagues and I conducted with students in China suggests that while some are simply using translation engines to avoid effort, many others are using them in a strategic and nuanced way. They are finding new ways to express complex ideas and checking that their output is well expressed.

The EAP profession, then, can help students exploit this emergent technology to reach their goals rather than try to limit and control its use. I see no reason why this approach should not be extended to the use of GPT3.

AI engines will only improve. As they do so, the potential for them to be misused by students grows. But, at the same time, the potential for this to support and enhance the development of students into global citizens is apparent. Adaptation is possible, but it needs careful consideration, realistic thinking and an understanding of what AI literacies can bring to the academic world.

In this way, we can prepare citizens who work with technology to enhance their intellectual skills rather than becoming diminished by their dependence on it.

Mike Groves is director of the Centre for Academic English Studies at the Surrey International Institute, a joint initiative between the University of Surrey and Dongbei University of Finance and Economics, China.

Reader's comments (3)
We cannot respond to the grossly exaggerated threat of AI "writing" without understanding the complexities, contradictions, and multiple interrelationships of reading and writing. See, for example, Searching for Literacy: The Social and Intellectual Origins of Literacy Studies (2022).
In the computer science department where I work, we have been experimenting with ChatGPT - we asked it to produce questions on a given topic, which it did quite well with some prompting, & then got it to write model answers to the questions it had come up with. We are now thinking of getting it to write a grant proposal...
I think the author of this article makes a number of good points here. Writing tools exist on a continuum from traditional dictionaries to collocation software such as just-the-word.com to spelling & grammar check tools to machine translation software. AI is another step along that continuum and students still need the same support & guidance in using this new meta-cognitive strategy as they have always done.