A number of top research departments performed unexpectedly poorly in the first official attempt to measure the impact of academics' research, new data reveal.
Some lower-rated departments also did conspicuously well in the Higher Education Funding Council for England's pilot impact assessment exercise.
Reports summarising the reactions of the institutions and the judging panels to the exercise were released in November, but the results themselves were held back because, according to Hefce, they were "not relevant" to the general conclusion that a case study-based approach to assessing impact was "workable".
In the exercise - held to test controversial plans to include an impact rating in 2014's inaugural research excellence framework, which could be worth up to 25 per cent of the marks - 29 universities were asked to submit a case study for every 10 scholars working in two out of the five subjects being assessed.
In social work and social policy, the London School of Economics achieved the best result, with 70 per cent of the material it submitted rated 4*, the highest grade, defined as "exceptional". The LSE was also the top performer in the subject in the 2008 research assessment exercise, according to Times Higher Education's analysis of the results.
In English language and literature, Queen Mary, University of London (rated second for the subject in the RAE 2008) also performed strongly in the impact pilot, with 40 per cent of its submission rated 4* and 60 per cent 3*, defined as "excellent".
But the University of Manchester, judged the fourth-best department for research in English in the final RAE, saw 80 per cent of its impact submission rated only 1*, or "good".
A Manchester spokesman said the focus of its English submission was on "learning about the mechanisms of assessment ... Although the results can be seen as disappointing, they have given us an opportunity to learn far more about what is expected from an impact statement."
Lancaster University's English department, ranked in the middle of the RAE 2008, achieved a notably strong result, with 35 per cent of its impact submission rated 4* and 50 per cent 3*. But its top-rated physics department did relatively poorly, with 95 per cent of its impact submission rated only 2*, defined as "very good". A spokesman for the university said Lancaster had also approached the pilot exercise as "an opportunity to test the system".
The University of Cambridge's physics department, rated second in the RAE, fared better, with 30 per cent of its impact submission in physics deemed 4*. The strongest performance in physics came from Liverpool John Moores University, which achieved a 40 per cent 4* and 45 per cent 3* impact profile.
David Carter, professor of observational astronomy at Liverpool John Moores, said he believed Hefce had found a sensible and proportionate approach to the agenda, but warned that it was difficult to gather impact evidence retrospectively.
"You need to consider what the impact is of your research while you are doing it and write it down: it is fairly basic stuff," he said.
In earth systems and environmental science, Brunel University achieved the highest impact score, with 50 per cent of its submission rated 4*, although the remainder went unclassified.
The results for the other subject assessed, clinical medicine, cannot be directly compared with the RAE 2008 results.
Hefce's introduction to the impact profiles notes that a lack of evidence in some case studies had "significantly affected" scores.
For this reason, plus the deliberate experimentation in some of the submissions, the results "should NOT be read as a clear judgement about the impact of research from the submitting departments, or as a means of predicting the impact profiles departments may be expected to achieve in the real REF", the funding council says.
The government is pressing ahead with plans to assess the impact of academics' research, despite the efforts of campaigners.
In its grant letter to the Higher Education Funding Council for England last month, the coalition welcomes the progress the funding council has made in developing the impact element of the research excellence framework and directs it to work with the research councils to deliver the agenda "coherently".
Don Braben, honorary professor of earth sciences at University College London, who coordinates a group of 50 prominent impact critics, admitted that David Willetts, the universities and science minister, "seems to have been convinced by his officials of the policy's soundness".
He noted that Mr Willetts had failed to respond substantively to evidence his group had submitted of the damage it believed the impact agenda was doing to scientific creativity.
He also criticised Mr Willetts for suggesting, in a BBC Radio 4 documentary broadcast last month, that opposition to impact reflected an overly "prissy" concern for academic purity.
At a seminar at the University of Manchester last week, Helga Nowotny, president of the European Research Council, said that funders were misguided to focus on impact because it was impossible to predict most of the outcomes of research.
"If you focus only on what you can see, you eliminate a lot of possible benefits," she told Times Higher Education.