
Regular diet of metrics 'lite' may make full REF more palatable

Hefce submissions call for less onerous approach to research assessment
13 November 2014


The next research excellence framework should be used to test the potential for a light-touch metric-based assessment exercise to complement a less frequent version that also incorporates peer review.

That is the suggestion included in one of the submissions to the Higher Education Funding Council for England’s independent review of the role of metrics in research assessment.

The majority of the 153 submissions – including 67 from universities or departments – are hostile to the idea that metrics could play a larger role in research assessment than they currently do.

According to Hefce’s summary of the responses, many respondents, including the universities of Oxford and Cambridge, raised concerns about metrics’ robustness, relevance outside the sciences and potential negative effect on early career researchers and women.

Many also worried that citation counts were easily “gamed” by techniques such as excessive self-citation.

However, some institutions were more open to metrics. Imperial College London suggested that, provided the equality issues were dealt with, a “lighter touch” REF could make use of a “basket of metrics” normalised for disciplines and “contextualised through expert peer review”.

Imperial’s submission also suggests that analysis of the impact element of the 2014 REF – which it describes as a “particularly significant burden on the academic community” – could “reveal to what extent the information could have been captured by metrics”.

“A metrics-based exercise has the additional benefit of assessing the whole research output of an institution or of a subject area within an institution, which also makes it easier to identify ‘gaming’,” it says.

The University of Southampton goes even further, suggesting that while the current REF “adds value”, it does not do so “in a commensurate way with the effort expended”. Southampton’s submission suggests greater use of metrics could both improve “cost-effectiveness” and be seen as “a fairer and more objective method of assessment” than the current reliance on peer review.

Metrics relating to research income, PhD numbers and awards of doctoral training centres and scholarships “could readily replace most of the qualitative assessment of the research environment element of the REF, which currently requires a disproportionately large element of time for preparation by institutions”, it says.

Meanwhile, the use of bibliometrics to assess the quality of outputs should also be increased where analysis suggests that metrics correlate closely with the quality profiles determined by the peer review panels – likely to be in “the majority of STEM subjects”. The submission says that where large volumes of outputs are examined, the effects of gaming, age and gender profiles and intra-disciplinary differences in citation patterns are reduced.

Southampton suggests that a “metrics-based assessment, say every five years, at institutional or departmental level…could provide an interim check on performance, with a more balanced quantitative and qualitative assessment [taking place] at longer intervals”.

Running such a metrics-based exercise instead of another full REF in 2020 would “allow for the re-evaluation of the use of bibliometrics in the future, when data are likely to be more reliable across a larger number of disciplines”.

paul.jump@tesglobal.com

Reader's comments (1)
Before going for metrics, please imagine the stream of emails pressurizing you to increase your number of citations (or even, god forbid, your altmetric score), by fair means or foul.