
Cautious experiment with AI in student counselling gets under way

US university tests use of technology in training mental health advisers, but experts foresee – and fear – wider application
April 24, 2024
Visitors look at the installation called Doing Nothing With AI (2019) by Emanuel Gollob during a media preview
Source: Suhaimi Abdullah/NurPhoto/Getty Images

A US university is experimenting with the use of artificial intelligence tools in training mental health counsellors, amid warnings that the mounting crisis in student well-being could force the field into greater embrace of technological solutions.

Arizona State University’s partnership with ChatGPT creator OpenAI – the company’s first with a US higher education institution – includes an effort to create AI-based subjects that counselling students can use to test their skills.

The encroachment of AI into the sensitive student mental health space is widely seen as likely to raise ethical concerns. ASU’s famously pioneering president, Michael Crow – a clear advocate of the overall OpenAI relationship – said that he had seen the problems with AI, including its fabrication of facts and its difficulties in social and cultural matters.

“If it doesn’t solve the problem better, enhance the outcome better, help the student better, we’re not going to do it,” Dr Crow said of counselling-based applications.

Outside experts expressed similar concerns, but said that the nationwide crisis in mental health – including among college students – might be changing the underlying calculations of value.

Sarah Ketchen Lipson, an associate professor of health law, policy and management at Boston University, said she was yet to be convinced that bringing AI into human mental health counselling was worth the risks. But the need has reached the point, Dr Lipson said, “that I understand the motivation for considering new approaches to meet mental health needs”.

Victor Schwartz, the senior associate dean of medicine for wellness and student life at the City University of New York, said that some use of AI tools in therapy programmes seemed inevitable, given that “there are a ton of companies out there trying” to make that happen.

ASU, at least for now, appears to be limiting its ambitions to the training of future mental health counsellors, by giving such trainees a series of verbal prompts – such as presenting the kinds of comments that an anxious person or a depressed person might make – and leaving the student to make a diagnosis.

That’s different from actually using AI tools on patients, Dr Schwartz said. Such a limited experiment, he said, “is a little bit different and, in some ways, if it’s handled correctly, not a terrible idea”.

Various studies and surveys have concluded that US higher education students have been suffering for several years with record rates of poor mental health, with conditions worsening significantly during the pandemic, overwhelming the capacity of institutions to respond.

Yet part of the reality, Dr Schwartz said, is that students can probably get a counselling appointment on most US campuses faster than Americans can elsewhere.

But while college campuses may provide plenty of potential test subjects should a place like ASU choose to expand its experiments with AI-based mental health innovations, that’s probably not a good future direction, he said.

The college population of young people “is so volatile”, Dr Schwartz said. Given the importance of acting quickly to help students with mental health challenges stay enrolled, he said, “you really want to have the most robust, nimble and thorough services you can, that are responsive, and not have people waiting and talking to a machine.”

paul.basken@timeshighereducation.com
