
Crisis of confidence: scholars unconvinced by REF impact pilot

Work that is controversial or critical of government may be stifled, critics fear. Matthew Reisz reports
June 10, 2010

A pilot project to measure the economic and social "impact" of research in the forthcoming research excellence framework has exposed a host of problems and raised fears that the measure could stifle controversial work or research that is critical of the government.

Participants in one of several official trials said they were "far from confident that the assessment process would be able to command the confidence of academic communities" in its current form.

Their concerns echo those expressed across the sector about plans to give assessment of the impact of research a weighting of up to 25 per cent of departments' overall scores in the REF. The results of the REF will be used to allocate about £1.5 billion a year in research funding in England.

The latest concerns emerged from the pilot of the REF methodology in social policy and social work, which involved 11 universities. This was one of several subject-specific pilots set up by the Higher Education Funding Council for England.

The findings were based on a seminar for participants in the pilot held at the University of Leeds, organised by the Social Policy Association and the Joint University Council Social Work Education Committee.

Participants reported that pilot assessments had been "very labour-intensive, especially when it came to contacting research end-users to verify claims of impact ... Records of some end-users ... were simply unavailable". They predicted that this would prove to be a severe problem for projects bringing together large cross-disciplinary teams, warning that the REF may discourage such activity.

Some researchers reported that their universities were wary of submitting research that had an undeniable impact, either because it was "carried out for business" - and therefore commercially sensitive - or because it was "controversial or critical of government".

More general criticisms were also voiced, for example about the assumption of "a simple, mechanistic causal process" linking a research project with its impact. Participants were sceptical about how far researchers, universities or departments were "able to influence or have any control over the influence or impact of their research".

Worries were also expressed about the coherence of the concept of "impact". Participants felt they needed guidance on the precise distinction between "scientific quality" and "research impact"; suspected that end-users adopted different definitions of impact from "those used by academics, Hefce or the research councils"; and were not even clear whether "assessment of research impact under the REF is entirely compatible with (research council) requirements".

Bahram Bekhradnia, director of the Higher Education Policy Institute, said it was "absolutely essential that whatever measures are introduced carry the confidence of the academic community".

David Willetts, the universities and science minister, has said that he would consider delaying the framework's implementation as a result of researchers' concerns.

Mr Bekhradnia said: "One of the great strengths of the research assessment exercise was ... academics at least recognised that the process by and large identified high-quality research. If the new REF does not carry that conviction, then it will fail."

Yet he also suggested that "one of the strengths of the proposed approach is that it is non-prescriptive, and leaves it up to universities and academics themselves to decide how they will persuade panels of the impact of their work. The fact that there is no single approach may not be a decisive problem."

matthew.reisz@tsleducation.com

READY FOR THE BIG DAY? HERE COMES THE BRIDE

As researchers struggle to work out how "impact" can be defined and measured, a team at Brunel University has developed what it describes as "a broad and flexible tool for research-impact evaluation".

Known as Bride (Brunel Research Impact Device for Evaluation), it is designed to assess impact beyond the academy and has already been piloted on research Brunel submitted to the 2008 research assessment exercise.

It uses a four-point scale to assess impact on the two dimensions of "depth" and "spread" - parallel to the criteria of "significance" and "reach" laid down by the Higher Education Funding Council for England.

Bride then generates a rating by a simple process of multiplication.
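As a rough illustration of the arithmetic described above - a four-point score on each of the two dimensions, combined by multiplication - the following Python sketch shows how such a rating could be computed. The function name bride_rating and the input checks are assumptions for illustration only; the article does not publish the tool itself.

```python
# Illustrative sketch of the scoring scheme the article attributes to Bride:
# "depth" and "spread" are each scored on a four-point scale, and the overall
# rating is the product of the two scores. Names and validation are assumed.

def bride_rating(depth: int, spread: int) -> int:
    """Combine a depth score and a spread score (each 1-4) into a single rating (1-16)."""
    for name, score in (("depth", depth), ("spread", spread)):
        if score not in (1, 2, 3, 4):
            raise ValueError(f"{name} must be on the four-point scale 1-4, got {score}")
    return depth * spread  # simple multiplication, as described in the article


if __name__ == "__main__":
    # For example, research judged very deep but narrowly spread
    # would score 4 * 2 = 8 on the resulting 1-16 scale.
    print(bride_rating(depth=4, spread=2))
```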

Geoff Rodgers, pro vice-chancellor for research at Brunel, said: "Hefce may not care about the details (or consistency across disciplines), provided that impact is clearly and openly assessed.

"We have shown one way to do it. As no one has yet proposed how it should be done, it could be adopted by default."
