
Korean university 'killer robots' boycott called off

Robotics experts cancel boycott after KAIST denies developing autonomous weapons
April 9, 2018
A robot wearing a sign saying 'Campaign to Stop Killer Robots'
Source: Getty

Artificial intelligence experts have called off a threatened boycott of one of Korea's top universities, after it undertook not to use a new research centre to develop so-called killer robots.

Fifty-six AI and robotics researchers said they would maintain academic contacts with the Korea Advanced Institute of Science and Technology (KAIST) after its president, Sung-Chul Shin, gave public assurances about the activities of its Research Centre for the Convergence of National Defence and Artificial Intelligence.

Korean media reports had suggested that the centre was working on autonomous weapon projects including "AI-equipped unmanned submarines and armed quadcopters".

Researchers also had concerns about KAIST's partner in the centre, Korean arms company Hanwha Systems, which they said had developed cluster munitions and an autonomous "sentry" robot called the SGR-A1.


The experts aired their concerns in an open letter scheduled for release on 4 April. However, KAIST vehemently denied any intention to work on autonomous weapons systems after it was contacted by Times Higher Education.

The researchers said they had now accepted KAIST's guarantees. "Given this clear commitment, the signatories to the boycott have rescinded the action," organiser Toby Walsh, Scientia Professor of Artificial Intelligence at the University of New South Wales, said in a statement.


"They will once again visit and host researchers from KAIST and collaborate on scientific projects."

KAIST was ranked 95th in the world, and the second-placed Korean institution behind Seoul National University, in this year's THE World University Rankings. Professor Walsh described it as "the MIT of Korea".

He added that AI had legitimate military applications. "No one, for instance, should risk life or limb clearing a minefield – this is a perfect job for a robot," he said.

"But we should not hand over the decision of who lives or who dies to a machine. This crosses an ethical red line and will result in new weapons of mass destruction."


A United Nations meeting is taking place in Geneva this week to discuss the humanitarian and international security challenges posed by lethal autonomous weapons systems. Twenty-two participating nations have backed a call for a pre-emptive ban on such weapons.

john.ross@timeshighereducation.com
