The suggestion to rank institutions according to quality-related funding per staff member submitted to the research excellence framework is problematic, and not only because some research fields are much more expensive than others ("Can you win by fielding the whole team?", Research intelligence, 7 May). Such a ranking is also distorted by institutional differences in staff selectivity: staff excluded from the REF attract no QR funding, but they are also left out of the per-staff denominator, which flatters institutions that submit selectively.
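To see the distortion, here is a hedged numerical sketch (the figures are hypothetical and chosen only for illustration): suppose institution A submits all 100 of its REF-eligible staff and attracts £1.0 million of QR funding, while institution B submits only its strongest 50 of 100 eligible staff and attracts £0.6 million. Then

\[
\text{A: } \frac{\pounds 1{,}000{,}000}{100} = \pounds 10{,}000 \text{ per submitted member}, \qquad
\text{B: } \frac{\pounds 600{,}000}{50} = \pounds 12{,}000 \text{ per submitted member},
\]

so B outranks A on funding per submitted staff member, even though across all eligible staff B earns only £6,000 per head against A's £10,000.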
An alternative, which would lead to more informative rankings and would also reduce the institutional tension between league-table position and QR funding, is a "quality per researcher" measure, in which every REF-eligible researcher is counted and the "quality" score resembles a grade point average, but with the GPA weights of 4, 3, 2, 1 and 0 for the REF quality levels replaced by the weights that determine QR funding. This is one of the conclusions reached by a working party set up by the Royal Statistical Society to consider how rankings based on REF results could be improved.
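As a hedged sketch of the proposed measure (the symbols are placeholders; the actual QR weights are set by the funding councils and are not reproduced here), write $E$ for the number of REF-eligible researchers in a unit, $n_k$ for the volume of submitted work judged to be at quality level $k$, and $w_k$ for the QR funding weight attached to level $k$. Then

\[
\text{quality per researcher} \;=\; \frac{1}{E}\sum_{k} w_k\, n_k ,
\]

which coincides with the familiar GPA when the weights $w_k$ are taken to be 4, 3, 2, 1 and 0 and the denominator is restricted to submitted staff.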
The report of the society's working party also makes some related suggestions for the funding councils themselves about REF submission rules and reporting; these include elimination of the distorting "threshold" effect of the REF's formula for the number of impact case studies in each submission.
Peter J. Diggle
President
Royal Statistical Society