I have moved on from believing that all attempts to measure impact are part of a neoliberal conspiracy intended to crush academic freedom and suppress critical voices. I have been a trustee on enough grant-giving bodies and have sat on enough senior management committees to know why it is necessary to get a handle on what happens to money allocated for research.
Most impact case studies submitted to the 2014 research excellence framework scored highly, and this was widely interpreted as showing that UK higher education was producing “value for money”. But I know – from my own experience of assessing impact in the REF and from my research (with Nick Fahy) into the case studies submitted to the public health, health services and primary care subpanel – that we are failing to capture large swathes of impact.
Our research, “Research impact in the community-based health sciences: an analysis of 162 case studies from the 2014 UK Research Excellence Framework”, found that direct, short-term and surrogate impacts featured far more commonly in case studies than indirect, longer-term or definitive ones. A surrogate impact is one that does not benefit anyone directly, such as changing a clinical guideline to recommend drug A for condition C, as opposed to the definitive impact that occurs when a patient with condition C lives longer or has better quality of life as a result of taking drug A. More than three-quarters of case studies in our sample described a change in a clinical guideline, while just 15 per cent claimed an improvement in survival and only 27 per cent documented any improvement in any measure of sickness.
Some claim that short-term, surrogate impacts will eventually produce longer-term, definitive ones, but I do not buy that argument. The REF rules permit 20 years between research and its subsequent impact – more than long enough to allow many definitive impacts to be captured. So why did universities shy away from them? My view is that the case study format (compounded by the process by which studies were constructed, refined and selected) favoured the unambiguous but deeply unexciting narrative: “Study X led directly to measurable change Y.” The closer in time a change is to the research that drove it, the more solid the causal link will appear to be.
In a recent edition of Times Higher Education, a forensic chemistry lecturer from Teesside University described how she spends her summers working with students on crime scene-based workshops for children at music festivals (“Lazy, hazy, crazy days”, Features, 17 September). What non-linear chains of causation might her efforts have set in train? The 11-year-old inspired to take her first steps towards a career in forensic science; the seminal conversation that bubbles up among students while striking camp; the newspaper feature that leads to a tentative pilot collaboration with a third-sector organisation. Such small but telling moments will all help to build Teesside’s “impact”, but not in a way that can be nailed in a simple “research project X led to impact Y” narrative. Tellingly, only 11 of the 162 case studies in our sample included any reference to civic engagement.
But there is ample opportunity for the case study format to be tweaked to permit credit to be given for efforts to promote understanding of science, the arts and the humanities among the public and policymakers, even if those efforts precede the planning of a specific piece of research (indeed, such engagement may be a good way of setting research priorities).
Credit should also be given to attempts to achieve impact even when it cannot be traced through to actual, measured results. In particular, we should value for their own sake the development and maintenance of reciprocal links with policymakers and industry, and knowledge translation work designed to craft key messages for particular audiences. A mature research group is typically embedded in a complex network of relationships and interactions and puts ongoing work into developing synergies that set the stage for further positive interactions. Measuring these will not be straightforward, but it is unlikely to be impossible.
We should also credit cases in which research and impact emerge in parallel through co-creation with industry or community groups (such that the research paper might be published after the impact occurs). Metrics could give credit to partnership-strengthening activities such as staff exchanges and efforts to build shared governance.
Last, we need to develop ways of spotting tokenism. For example, patient involvement in medical research is widely promoted and well-intentioned, but it is often merely instrumental, aimed at increasing recruitment to trials designed by researchers, rather than truly democratic, giving patients a say in what trials are done.
I’m sure that others will have further ideas, and it is important that we have a vigorous debate on how to proceed. Otherwise, many of the greatest impacts that higher education has on health, wealth and society will continue to go unmeasured and unrewarded.
Trisha Greenhalgh is professor of primary care health sciences at the University of Oxford and was deputy chair of the 2014 REF Main Panel A. The opinions expressed here are her personal views.
Print headline: The story of impact is rich and complex; we must do more to tell it