Three-quarters of researchers routinely use artificial intelligence, but only a tiny proportion trust technology companies not to reuse their data without permission, a major poll of scholars has revealed.
New research by Oxford University Press (OUP), which surveyed more than 2,300 researchers, found that 76 per cent use some form of AI tool in their research, with machine translation and chatbots cited as the most popular, followed by AI-powered search engines and research tools. AI is most used for discovering, editing and summarising existing research, the report found.
However, only 8 per cent of researchers trust that AI companies will not use their research data without permission, while just 6 per cent believe AI companies will meet data privacy and security needs, says the study published on 23 May.
The study comes amid widespread concern in the publishing industry that AI tools are lifting and reproducing academic texts in a different format without proper attribution, a situation that will undermine long-established copyright and intellectual property norms for journals and scholars.
According to the OUP study, about three in five researchers feel that the use of AI in research could undermine intellectual property and result in authors not being recognised appropriately for the use of their work.
The poll, which drew responses from different disciplines, career stages and from countries across the world, found that 25 per cent of researchers believe AI will reduce the need for critical thinking and could damage the development of these fundamental skills for the future.
David Clark, managing director of OUP's academic division, said the research will help the university publisher to "understand how researchers are thinking about generative AI and its use in their work".
"As these technologies continue to rapidly develop, our priority is in working with research authors and the broader research community to set clear standards for how that evolution should take place," said Mr Clark on what he called a "fast-moving, complex area".
"We are actively working with companies developing LLMs [large language models], exploring options for both responsible development and usage that will not only improve research outcomes, but also recognise the vital role that researchers have ¨C and must continue to have ¨C in an AI-enabled world,¡± he added.
The poll also asked researchers about institutional attitudes towards AI, with almost half (46 per cent) saying that the institution where they work has no AI policy.