More than 50 of the world's top artificial intelligence experts have threatened to boycott Korea's second highest-ranked university over its alleged involvement in developing "killer robots".
AI and robotics researchers from the University of Cambridge, Cornell University, the University of California, Berkeley and 52 other institutions have hatched plans to stop all contact with the Korea Advanced Institute of Science and Technology (KAIST) over a new research centre.
In an open letter, the group cited media reports that the "Research Centre for the Convergence of National Defence and Artificial Intelligence" would be involved in the development of autonomous armaments.
"At a time when the United Nations is discussing how to contain the threat posed to international security by autonomous weapons, it is regrettable that a prestigious institution like KAIST looks to accelerate the arms race to develop such weapons," the letter says.
"We publicly declare that we will boycott all collaborations with any part of KAIST until such time as the president of KAIST provides assurances – which we have sought but not received – that the centre will not develop autonomous weapons lacking meaningful human control."
In a statement, KAIST president Sung-Chul Shin said that he was "saddened" by the threatened boycott and denied that the institution had any intention to work on lethal autonomous weapons systems.
"The centre aims to develop algorithms on efficient logistical systems, unmanned navigation [and an] aviation training system," he said. "KAIST will be responsible for educating the researchers and providing consultation.
"As an academic institution, we value human rights and ethical standards to a very high degree. KAIST will not conduct any research activities counter to human dignity, including autonomous weapons lacking meaningful human control."
The boycott's organiser, Toby Walsh, Scientia professor of artificial intelligence at the University of New South Wales, said that he would consult his co-signatories about KAIST's statement. He told Times Higher Education that he had been seeking clarity on the centre's activities since early March and that, while KAIST had finally responded, it had left "some questions unanswered".
He said that he had been advised that the centre was working on four autonomous weapons projects, including a submarine.
Professor Walsh said that KAIST's partner in the centre, Korean arms company Hanwha Systems, also raised "serious warning signs" about the initiative. He said that Hanwha had been blacklisted for producing "cluster munitions", which are prohibited under a United Nations convention, although South Korea is not a signatory.
The clash reflects broader disagreement over the danger posed by robots. More than 100 specialists, including Tesla founder Elon Musk, last year demanded an outright ban on autonomous weapons, warning that the technology threatened to spawn a "third revolution in warfare" after gunpowder and nuclear arms.
In November, the UN began formal discussions on a possible ban on autonomous weapons; the talks resume in Geneva on 9 April.
Professor Walsh said that while AI had "useful" military applications, autonomous weapons would be catastrophic. He said that the main arguments in their favour – that robots could be more ethical and more discriminating than humans – had fundamental flaws.
He said that while it might be possible to program computers to behave ethically, "there's no program that can't be hacked. Whatever safeguards we put in will be removed by bad actors, North Korea being one of them."