
Academic estimates ‘real’ cost of REF exceeds £1bn

Sector challenged to share data on REF costs and come up with less ‘disingenuous’ figure than official estimates
February 12, 2015

Official estimates of the cost of research assessment to universities are “disingenuous” and the real bill could exceed £1 billion, a senior academic has estimated.

An independent report by PA Consulting puts the total cost to higher education institutions of the 2008 research assessment exercise at £47 million, amounting to an average of £613,000 for each institution, or £88,000 a year over seven years.

But according to Robert Bowman, director of the Centre for Nanostructured Media at Queen’s University Belfast, that figure amounted to just one full-time senior-level salary a year, which was “not even at the races” for an institution such as his.

Based on conversations with colleagues across the sector, Professor Bowman has produced a “guesstimate” of the real cost of the exercise.

Download a full spreadsheet of Robert Bowman’s calculations


paul.jump@tesglobal.com

Hit and miss metrics: ‘Throw of dice would give more accurate REF prediction’

An examination of university departments¡¯ ¡°h-indices¡± failed to accurately predict the research excellence framework results, a study has concluded.

Prior to the publication of the results in December, a team of physicists published their predictions in four subjects – physics, chemistry, biology and sociology – based on observation of departments’ h-indices during the assessment period.


Broadly, the h-index measures the number of papers that garnered a significant number of citations. Although the physicists were opponents of metrics, they wanted to examine, in a “neutral” way, whether a department’s h-index correlated with its REF score. Some advocates of replacing peer review with metrics, as a way to save universities money and effort, have pointed to supposed correlations between h-indices and results of the 2008 research assessment exercise.

The group’s paper, Predicting Results of the Research Excellence Framework Using Departmental h-index – Revisited, published on the arXiv preprint server, concludes that although correlations exist, they are not nearly strong enough to justify replacing peer review with metrics.

Most worryingly for proponents of metrics, in three of the subjects examined, the h-index correlates more closely with overall REF scores – which include impact and environment elements – than it does with the outputs element in isolation.

They also found that h-indices were unable to predict whether a department would rise or fall in the REF ranking. One of the paper’s authors, Ralph Kenna, reader in mathematical physics at Coventry University, said: “Managers would get more accurate predictions by tossing dice.”


Paul Jump

Reader's comments (4)
I’ll make a few comments in addition to the article.

1. I put the spreadsheet together over the equivalent of a couple of coffee breaks. No doubt it could be refined and improved upon. Indeed, it is mildly diverting, as a physicist, to play around with the percentages and see which elements rise and fall. To my mind, as calculated, impact statements seem like VFM given their power in convincing BIS/Treasury of the utility of the UK research effort.
2. I was trying to get a feel for the opportunity cost of the whole REF exercise. Many people I have spoken to shake their heads at the citing of the previous PA report figure for RAE 2008.
3. As quoted, I believe REF serves a purpose, but I would hope the sector collectively can inform the costing from the bottom up. REF will award circa £1.75B p.a. for the next 5-6 years… £8B+. Is the cost <£0.1B, £0.5B or more?
4. The impact studies are a good illustration of the order-of-magnitude methodology used in the spreadsheet… if you contend that they might be a tenth of the current figure used in my spreadsheet estimate, then they would cost ~£10M of the purported £80-120M cost of REF. But was each impact statement, and all each entailed, delivered in just 16 hours of effort? Really? OK, so maybe a week’s work… or more? Over a couple of years, 160 hours doesn’t seem outrageous, and that’s before we recognise I omitted the cost of the impact statements that never saw the light of day as UoAs optimised staff numbers around their best impact statements.
5. Everyone has a view: from the single academic (just how much time did you have to expend at various stages thinking about, selecting, justifying and checking your outputs, income etc.?) to the whole effort of producing impact statements (paid writers, engaging external readers, purportedly). Likewise for all the preparation and validation of the UoA submission in its entirety by someone like a ‘champion’ and/or UoA team. And then there is the harder-to-estimate institutional cost (from a small percentage of the time of a VC and PVCs right down through research support offices to the back-office computer support needed to ensure databases and documents all load into the REF submission software).
6. All these figures will vary across institutions large and small, and Main Panel by Main Panel, but to counter the comment from Graeme Rosenberg in the article, my guesses are not based solely on a) my own experience or even b) that of my own institution, but on common-sense guesstimates (not just my own) and the comment and chatter we all pick up in the community about how institutions do things differently: from the relatively hands-off, to those with UoA reading teams, to those where people tell me they were individually interviewed on their output choices. There is a wide diversity of experience out in the community and it would be good if the sector might start to capture and enumerate the cost. Happy calculating!

p.s. there are lots of arguments for/against using TRAC, choosing salary levels, proportions of professorial staff, equating admin grades to academic salary (single pay spine) etc. etc. – all variables for people to play with.
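The back-of-envelope method in point 4 can be sketched in a few lines. The £1.75B p.a. and 160-hour figures come from the comment itself; the hourly staff rate and the sector-wide count of impact statements are hypothetical placeholders for illustration, not figures from the article or the spreadsheet:

```python
# Order-of-magnitude REF cost sketch, in the spirit of the commenter's
# spreadsheet. Only QR_AWARD_PER_YEAR, AWARD_YEARS and HOURS_PER_STATEMENT
# come from the comment; the rest are illustrative assumptions.

QR_AWARD_PER_YEAR = 1.75e9   # ~£1.75B p.a. of REF-driven funding (from the comment)
AWARD_YEARS = 5              # "the next 5-6 years"; lower bound gives the £8B+ figure

total_award = QR_AWARD_PER_YEAR * AWARD_YEARS

HOURS_PER_STATEMENT = 160    # comment's "not outrageous" effort per impact statement
STAFF_COST_PER_HOUR = 60.0   # assumed loaded senior-academic rate in £ (hypothetical)
NUM_STATEMENTS = 7000        # assumed sector-wide statement count (hypothetical)

impact_cost = HOURS_PER_STATEMENT * STAFF_COST_PER_HOUR * NUM_STATEMENTS

print(f"Funding at stake over {AWARD_YEARS} years: £{total_award / 1e9:.2f}B")
print(f"Illustrative impact-statement cost: £{impact_cost / 1e6:.1f}M")
print(f"As a share of the award: {impact_cost / total_award:.2%}")
```

Plugging in different guesses for the two hypothetical inputs is exactly the "play around with the percentages" exercise the commenter invites: even tenfold changes leave the cost small relative to the funding at stake, which is the point of an order-of-magnitude model.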
I am not persuaded the entire cost of impact statements should be assigned to the REF. If they are good, they can be reused for many other purposes.
Audit
An interesting analysis which challenges the assumption that it's a near zero-sum activity. Even if the costs are at the lower end, which I doubt, there is surely a powerful case for re-examining the value of the REF, as we can now see that pretty well all the objectives set out in the Roberts Review have been missed while UK universities outside the Golden Triangle continue to drop down international rankings. The latest THES ranking did not have an English university outside Oxbridge and London in the top 50. Equally importantly, the UK's share of international research funding from the private and public sector continues to drop. We are seeing a serious US-style division emerge in higher education, where a tiny elite of "Ivy League"-type institutions suck in endowments and research funding while pressing for the power to charge premium fee levels to reinforce their advantages – and the devil take the hindmost.