The UK’s research excellence framework (REF) should replace peer review in some scientific disciplines with citation-based assessments, the architect of the country’s first national research audit has recommended.
As hundreds of expert assessors across 34 sub-panels begin the year-long task of grading tens of thousands of research outputs submitted to the 2021 exercise, the question of whether this onerous task of peer review – which cost £19 million in panellists’ time in 2014 – could be replaced with a less bureaucratic, costly and time-consuming process has again been raised.
It follows several successful attempts by researchers to replicate the assessment, held every six to seven years to determine university budgets, using only bibliometric data: one 2018 study, which analysed the 6.95 million citations attached to the 190,000 outputs submitted to the 2014 REF, claimed to predict the top-ranked universities in 10 mainly science-based units of assessment with 80 per cent accuracy.
Using this kind of bibliometric analysis would save countless hours of academic labour, the study said: members of the REF’s expert panel for physics in 2014 had to read at least two papers a day, every day, for 10 months to get through the 6,446 outputs submitted in that discipline. Members of other panels faced an even higher number of outputs, which now account for 60 per cent of the overall assessment, it added.
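As a rough illustration of what such a citation-based replication involves, the sketch below ranks institutions by citations per submitted output and checks how far the result agrees with a peer-review ranking. Every name and figure in it is invented for illustration; it is a toy example of the general approach, not the 2018 study’s actual method or data.

```python
# Hypothetical sketch of a citation-based ranking of the kind described above:
# rank institutions by average citations per submitted output and compare the
# result with a peer-review ranking. All data are invented for illustration.

citations_per_output = {          # institution -> average citations per output
    "Uni A": 42.1, "Uni B": 35.7, "Uni C": 28.3, "Uni D": 19.9, "Uni E": 12.4,
}
peer_review_rank = ["Uni A", "Uni C", "Uni B", "Uni D", "Uni E"]  # hypothetical panel result

# Rank institutions by the bibliometric indicator, highest first.
metric_rank = sorted(citations_per_output, key=citations_per_output.get, reverse=True)

# Simple agreement measure: what fraction of the top-3 institutions match?
top_n = 3
overlap = len(set(metric_rank[:top_n]) & set(peer_review_rank[:top_n])) / top_n
print(f"Bibliometric ranking: {metric_rank}")
print(f"Top-{top_n} agreement with peer review: {overlap:.0%}")
```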
Rama Thirunamachandran, vice-chancellor of Canterbury Christ Church University, who developed the 2008 research assessment exercise – the forerunner of the REF – while he was director of research, innovation and skills at the Higher Education Funding Council for England, told Times Higher Education that he believed future incarnations of the REF could successfully use metrics in place of peer-review panels.
“For some disciplines, a more mechanistic approach looking at bibliometric information might allow us to make valid assessments of outcomes,” said Professor Thirunamachandran, who added that these “studies show this broad-brush approach can work quite well”.
“In biosciences or chemistry, bibliometrics could act as a proxy for peer review, though for arts and humanities, and social sciences, it would be quite difficult to do as [metrics] are not robust enough.”
With a government-commissioned review of research bureaucracy under way following criticisms by the prime minister’s former chief of staff, Dominic Cummings, that universities are a “massive source of bureaucracy”, a move to metrics-based assessments in some disciplines has long been seen as a potential way to reduce red-tape costs, with the 2014 framework costing universities and funding bodies an estimated £246 million.
But Professor Thirunamachandran said he believed other areas of research bureaucracy could yield more substantial savings than the REF.
“It’s an exercise that takes place every six to seven years, whereas the bureaucratic burden is much higher for those constantly bidding for research funding – that is quite significant, particularly when the level of applications not getting funding is quite high,” he said, adding he would like to see longer grants awarded to successful applicants to ease this strain.
Dorothy Bishop, professor of developmental neuropsychology at the University of Oxford, who has argued that departmental h-indexes could be used instead of expert panels in some subjects, told THE that it was time to “ditch the ridiculous current system”.
“It’s highly stressful and a ridiculously inefficient system: mountains of effort for very little, if any, marginal gain over a simpler approach,” she said, adding that “hours of time have been spent on mock REFs before we even got to the real thing”.
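For readers unfamiliar with the measure Professor Bishop has floated, a departmental h-index is the largest number h such that a department has h outputs with at least h citations each. The short sketch below computes it for a made-up list of citation counts; the figures are purely illustrative and not drawn from any real submission.

```python
# Minimal sketch of a departmental h-index: the largest h such that the
# department has h outputs with at least h citations each. Citation counts
# here are invented for illustration only.

def departmental_h_index(citation_counts: list[int]) -> int:
    """Return the largest h such that h outputs each have at least h citations."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(departmental_h_index([120, 48, 33, 20, 9, 4, 1]))  # -> 5
```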