The obvious question regarding the UK government’s intention, if it wins the 12 December election, to set up a new research agency is: why? What would an agency modelled on the US Defense Advanced Research Projects Agency (Darpa) do differently? In particular, why does the government intend to place the agency outside the framework of UK Research and Innovation, the still-new umbrella body established in 2018 to act as an overall manager of the UK’s research and innovation system?
Over their lifetimes, the seven thematic research councils that were folded into UKRI established a strong and clear set of values and earned a lot of respect both nationally and internationally. This is why their brands and roles (although not their legal status) were retained in the UKRI structure.
As chief executive of the Science and Technology Facilities Council (STFC) between 2011 and 2016, I was proud to be responsible for a funding agency that applied clear, rigorous and transparent criteria to decisions. We supported only research that was internationally excellent, while at the same time devoting resources to facilitate its broader impact through public engagement and translation into innovation and industry.
The UK’s is an impartial, unbiased funding system in which young scientists can succeed through the quality of their own proposals, without powerful patrons. The councils’ highly professional (and chronically under-recognised) programme managers work hard to build connections with universities, advocating for – and, where necessary, acting as critical friends of – the research community.
All this makes the UK highly attractive as a place to work and delivers high-quality outputs – as measured by (for example) bibliometrics. Nevertheless, my years within the system alerted me to some of its limitations. First, it can be rather closed and self-referential. Excellence is defined as that which we ourselves consider to be excellent. Often, I have a nagging feeling that the UK economy and people don’t benefit as much as they should. What if this measure of excellence is not what others want – or need? Are we sure they are wrong?
Second, the research councils can be extremely conservative. Instinctively, they only trust one mode of decision-making: soliciting or receiving proposals, independently reviewing them, and then funding those ranked highest. This process is rigorous but can be awfully slow – and the impact the funding treadmill has on academic careers and workplace stress is becoming increasingly apparent.
Moreover, left to itself, the system can easily lead to an ever-greater concentration of support in universities that are already well funded and, therefore, have the resources to attract better people. Could there be other legitimate measures of excellence besides peer review? What about trying to allow excellence to take root in places where it does not yet exist?
The research council system accumulates vast amounts of data on the funding it has allocated, but it is not able to use any of it to, for example, steer the next round of funding decisions towards different geographical outcomes, because that would be seen as interfering. Only the proposal counts.
The UK system can also be uncomfortable making the long-term commitments needed to sustain national research capabilities outside universities, be they research facilities such as the Diamond Light Source or UK membership of international projects such as Cern. This was a constant issue for us at the STFC, where we were very conscious of the risk of making capital investments that ended up without sufficient operational funding (the notorious “batteries not included” phenomenon). To some of my colleagues in other councils, long-term commitments were a trap best avoided. It sometimes seemed as if any money that didn’t end up in a university was money they considered to have been wasted.
For this reason, when UKRI was created, I argued quite strongly that the opportunity should be taken to establish a new entity within it to manage the research councils’ large-scale research infrastructures. For whatever reason, this was not taken forward, but I still think it is a good idea.
The research councils are well-optimised tools for doing a certain job, but, like the man with the hammer treating everything as a nail, they are less well optimised for other important tasks. A new Darpa-style agency would clearly require different modes of decision-making and strategy, based not on peer review but on being challenge-led and more agile. However, I see no reason that these various approaches couldn’t all coexist within UKRI, as long as the body’s role were clearly understood to be stewardship of the whole system: ownership, if you like, of a diverse box of tools. UKRI, to its credit, has taken to this role in some domains, embracing the industrially focused mission of Innovate UK, which is also under its umbrella, and establishing policies to deal with academic bullying and to promote open science. But I suspect that, at its core, it still sees itself as more of a research funder than a steward of this kind of diverse toolbox.
With an election looming, talk of big funding increases and the search for a new chief executive about to begin, the signal that UKRI is not seen by politicians as the best steward of a new flagship initiative should worry us all. Like all political changes, UKRI was born out of compromise between many different stakeholders. This led to some inevitable fudges. Now would be an excellent time to resolve them and lay out a clear vision for the organisation.
John Womersley is director-general of the European Spallation Source and was chief executive of the Science and Technology Facilities Council between 2011 and 2016.