University researchers can still play a vital role in putting artificial intelligence on a more ethical track despite concerns that academia has been left behind by corporate giants in the “out-of-control race” to create and roll out potentially era-changing technology, experts have argued.
With industry racing ahead of academia on AI thanks to unprecedented research spending (estimated at $91.9 billion, or £73 billion, globally in 2022, according to Stanford University’s AI Index report), questions have been raised about whether universities will exert any meaningful influence on powerful new products developed largely by private companies. Anxieties about the technology were further fuelled when the British-born computer scientist Geoffrey Hinton, often described as the “godfather of AI”, announced that he had quit Google, saying he regretted his life’s work as AI made it hard to stop “bad actors from doing bad things”.
Despite being vastly outgunned on research spending, academics were needed to help the public understand AI, to imagine where the technology might lead and to shape regulation to counter its negative effects, said Anil Seth, professor of cognitive and computational neuroscience at the University of Sussex.
“There are many critical questions about the impact of this technology – on jobs, our political system or in terms of human psychology – that we’ve only just started thinking about,” said Professor Seth, a signatory to an open letter calling for a six-month pause on the roll-out of advanced AI systems, citing the “profound risks to society and humanity” posed by the “out-of-control race” to deploy new machine learning systems.
“That fundamental question of how to interact with something that seems conscious but isn’t is huge, as it could be socially disruptive in ways we haven’t properly considered,” added Professor Seth.
“We don’t have an idea of where it might lead, so universities are good places for social scientists, psychologists and lawyers to come together and better understand its effects,” he continued. “I don’t see anywhere else that can do that – if we don’t do this, policy will emerge from focus groups, government departments or simply come from arbitrary decisions of corporate entities.”
While universities worldwide have struggled to create their own machine learning models (just three significant models were produced by academia in 2022, compared with 32 from industry, according to Stanford’s AI Index), smaller “university-scale” research on refining existing systems might ensure that “AI acts according to human values” rather than pursuing “potentially dangerous” self-created objectives, he added.
Carissa Véliz, a philosopher at the Institute for Ethics in AI at the University of Oxford, said academic input into regulation was essential because, unlike other technologies such as gene editing, AI’s applications were being decided entirely by corporations.
“We’re seeing technology companies roll out incredibly powerful products in whatever way they choose, leading to potentially huge consequences, but they don’t have to pick up the bill. It’s outrageous,” she said.
“Researchers who are independent experts can speak truth to power in a way that those with commercial interests can’t,” added Dr Véliz, whose institute was founded thanks to a £150 million donation in 2019 by the billionaire Blackstone financier Stephen Schwarzman.
That funding ensured that Oxford’s AI research would not be compromised by financial ties to AI companies, which now routinely sponsor academic work, said Dr Véliz. “I really value that [independence] as I’m concerned that more and more research in AI, particularly around ethics, is funded by big tech.”
Alan Winfield, professor of robot ethics at the University of the West of England, said it was not surprising to see university-based AI fall behind industry given the “vast resources” deployed by private companies. “The energy costs for running a big natural language system can run into several million dollars – for just one training round. How many universities can afford that?” he said.
“I have friends at DeepMind [Google’s AI research laboratory], and in practical terms, they have an infinite research budget,” continued Professor Winfield.
Nonetheless, academics can play a useful role in formulating industry standards, such as those created by the Institute of Electrical and Electronics Engineers for wi-fi or those overseen by the British Standards Institution, Professor Winfield said.
“Regulation is hard unless you have standards and can see what’s happening – that’s how the Civil Aviation Authority can certify an aircraft as fit to fly, because the manufacturer can demonstrate that the parts and processes meet certain standards. For AI, there are standards and methods to judge them that already exist, but the level of opacity is shocking – ethical research assessment could happen, so it’s frustrating that it doesn’t,” he said.