
Prizes for enterprise: the shape of KEF to come

Introduced to help boost technology transfer amid renewed political focus on 'industrial strategy', the KEF aims to complement the REF and TEF. But how will it work? Is it even necessary? And is the UK really underperforming at commercialising its research? Rachael Pells reports
January 25, 2018
Pills in a factory
Source: Getty

"I'm waiting for an 'admin excellence framework' – that's what most academics spend their days doing," says one Twitter user.

"When does the EFEF (excellence frameworks excellence framework) come along?" another asks.

All this calls for an IEF – an "idiocy excellence framework for ministers", someone else suggests.

It is fair to say that the announcement of England's knowledge exchange framework last October was not universally welcomed. Only four months earlier, the first full results of the new teaching excellence framework had been released. Now, a sector used to fixating exclusively on the research excellence framework had a third major assessment exercise to grapple with.

Unveiling his latest regulatory brainchild at the Higher Education Funding Council for England's annual conference in London, Jo Johnson, then the UK's minister for universities and science, said that higher education institutions "must do more" to strengthen their links with business and local industry. And there will be a cash incentive: the KEF will be used to determine allocations of England's Higher Education Innovation Funding (HEIF), which supports knowledge exchange and will be worth £250 million a year by 2020-21.

There is a certain logic to the introduction of a formal assessment of universities' knowledge exchange activities: the third strand of universities' missions, alongside teaching and research. The KEF also has the political wind in its sails in an era in which "industrial strategy" has quickly been transformed from an unpopular, old-fashioned concept into the government's magic formula for spearheading the country's economic success post-Brexit. Hence it seems highly unlikely that Johnson's successor, Sam Gyimah, will reconsider the need for a KEF.

But there are also reasons for scepticism. With the REF, TEF and all the various national and international league tables, universities and academics could be forgiven for suffering from assessment fatigue. Moreover, with the impact element of the REF, introduced under political pressure in 2014, already incentivising the communication and commercialisation of research, it is not immediately clear that further prods are necessary. And with few sensible, comprehensive metrics of knowledge exchange in existence, many question both the KEF's rationale and its feasibility.

But the exercise has actually been in gestation for some time. The UK's supposed under-performance in commercialising its highly successful research base has long been a bugbear of politicians and officials and, with this in mind, in early 2015 Hefce asked Keele University vice-chancellor Trevor McMillan to set up a group specifically to look into these issues.

The group's report was published in September 2016, under the title University KE Framework: Good Practice in Technology Transfer. It concluded that there is room for improvement, and that UK universities "should be aspirational in our practice". It made various recommendations, including clearer technology transfer strategies from university leaders, differential approaches to commercialisation in different sectors and the appointment of academic and professional staff "who are entrepreneurial and who recognise the benefits of technology transfer".

But its overall view was that the UK is not nearly as bad as previously imagined when it comes to commercialisation. "All evidence suggests…that the UK university system is competitive in technology transfer. At the very least, the UK shares similar problems of technology transfer with other leading university systems round the globe," it concluded.

Moreover, "the UK should worry less about comparing itself with others, and do more to pursue its distinctive innovative approaches – particularly in the developments of entrepreneurial ecosystems".

Nevertheless, Johnson evidently felt that more needed to be done. Unveiling the industrial strategy white paper – Building a Britain fit for the future – in November, he acknowledged that there is "evidence" of successful existing university-industry collaboration, "but the system as a whole needs to find a new gear" – particularly because of the "outsize role" that universities play in the UK's "research and innovation system".

And, in his October speech announcing the KEF, he cited the University of Queensland as a benchmark for success. The Australian institution's long-established tech-transfer subsidiary, Uniquest, helps to generate more than A$30 million (£17 million) a year from intellectual property: more, according to Johnson, than any Russell Group university. Stanford University and the Massachusetts Institute of Technology are also widely seen as examples of successful research commercialisation, and Johnson noted that UK universities require about £5 million more in research spending for each spin-out company that they generate than US universities do. "And US higher education institutions earn almost 40 per cent more IP licence income as a percentage of research resources than [do] those in the UK," he added.

Scientist at a microscope
Source: iStock

Knowledge exchange is described in the Higher Education and Research Act 2017 as "a process or other activity" by which knowledge relating to "science, technology, humanities or new ideas" is exchanged and where "the exchange contributes, or is likely to contribute, (whether directly or indirectly) to an economic or social benefit in the United Kingdom or elsewhere".

Hefce – whose duties regarding research and knowledge exchange will be taken over by the Research England strand of UK Research and Innovation in April – employs "essentially the same" definition, according to Hamish McAlpine, its senior higher education policy adviser. "We would describe KE as shorthand for the myriad interactions between institutions and the wider world, for the benefit of the economy and society," he says. "In return, it seeks to bring the inspiration of that wider world back into universities and colleges."

While its developers are keen to stress that the KEF should not be directly compared to the other assessment frameworks, aspects of it are evidently borrowed from the TEF, whose development Johnson also oversaw. This includes its focus on metrics rather than peer review, and its use of benchmarking to avoid comparing institutional apples with pears. In the case of the KEF, universities will be banded into groups for the purposes of comparison, taking into account regional economies and individual institutional characteristics that influence knowledge exchange performance.

The specific KEF metrics are to be designed by a technical advisory group, following the conclusion of a consultation at the end of this month. In his November announcement launching the process, Johnson said that the framework should be implemented in the autumn, and would "create a constructive competitive dynamic, increase universities' responsiveness and accountability, and enable universities to benchmark and develop their own performance".

He said that the advisory group should "take into account" the work already done by the McMillan group, and "the considerable amount of data already gathered", including the annual Higher Education Business and Community Interaction survey, which has been collecting "financial and output data related to knowledge exchange" since 1999. As well as information on research commercialisation, the survey "also explores other activities intended to have direct societal benefits such as the provision of continuing professional development and continuing education courses, and the provision of, for example, lectures, exhibitions and other cultural activities". Whether such non-commercial activities are intended to be assessed by the KEF is unclear at this stage. But Hefce is also directed to "consider whether additional metrics can be devised and collected to provide a more comprehensive view of the effectiveness of universities' external engagement, whilst having regard to the burden and cost of collection".

For McMillan, whose group will "advise on the value of the KEF metrics exercise for good practice development within universities", breadth is important. "To consider just a small subset of activities, such as spin-out companies, will be misleading and provide potentially damaging incentives," he says. "It is also key that we give universities the forum to explain how they approach the [different] elements of knowledge exchange", to avoid misunderstandings and misguided frustrations among "external bodies".

Robot production line

So what exactly should the KEF metrics be? Jonathan Grant, vice-president and vice-principal (service) at King's College London, who was heavily involved in assessing the success of the 2014 REF in a previous role at RAND Europe, broadly welcomes the idea of the KEF because "it brings balance to the frameworks assessing universities". However, "it is essential that we don't fall into the trap of only using metrics, as they only measure the measurable", he warns. "This was a key finding of the early work on REF impact case studies and was borne out by the analysis of the case studies."

The decision to focus on metrics also comes despite a recommendation not to do so from the McMillan group. One of the group's key conclusions was: "Universities that do more research do more technology transfer. Beyond this, metrics are insufficiently sensitive to identify the right policies to achieve high performance."

In the same year, intellectual property consultancy firm IP Pragmatics carried out, on behalf of Hefce, a study into the use of knowledge exchange metrics to "benchmark" universities' performance against each other. Its 14 suggested metrics included universities' knowledge exchange income per academic; value of consultancy engagements per academic; ratio of non-commercial to commercial income; continuing professional development income as a proportion of total teaching income; collaborative and contract research income, and total IP income, as a proportion of total research income; and staff or graduate start-ups surviving for three years as a proportion of all companies formed surviving for three years – and the number of people they employ.
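
Most of these suggested measures are simple ratios that could, in principle, be computed directly from an institution's annual returns. Purely as an illustration – this is not the IP Pragmatics methodology or any official KEF calculation, and every field name and figure below is invented – a few of them might be derived along these lines:

```python
# Illustrative sketch only: not the IP Pragmatics methodology or any official
# KEF calculation. Field names and figures are invented to show how a few of
# the suggested ratio metrics could be computed from an institution's returns.

from dataclasses import dataclass


@dataclass
class InstitutionReturn:
    name: str
    academic_fte: float          # academic staff (full-time equivalent)
    ke_income: float             # total knowledge exchange income (£)
    consultancy_income: float    # consultancy engagement income (£)
    cpd_income: float            # continuing professional development income (£)
    teaching_income: float       # total teaching income (£)
    ip_income: float             # total IP/licensing income (£)
    research_income: float       # total research income (£)
    startups_surviving_3yr: int  # staff/graduate start-ups surviving three years
    startups_formed: int         # all start-ups formed in the same cohort


def benchmark_metrics(inst: InstitutionReturn) -> dict:
    """Compute a handful of per-institution ratio metrics of the kind
    suggested in the IP Pragmatics report (illustrative selection only)."""
    return {
        "KE income per academic": inst.ke_income / inst.academic_fte,
        "consultancy income per academic": inst.consultancy_income / inst.academic_fte,
        "CPD share of teaching income": inst.cpd_income / inst.teaching_income,
        "IP income share of research income": inst.ip_income / inst.research_income,
        "start-up three-year survival rate": (
            inst.startups_surviving_3yr / inst.startups_formed
            if inst.startups_formed else 0.0
        ),
    }


if __name__ == "__main__":
    example = InstitutionReturn(
        name="Example University", academic_fte=1_200, ke_income=18_000_000,
        consultancy_income=2_500_000, cpd_income=1_200_000,
        teaching_income=150_000_000, ip_income=900_000,
        research_income=60_000_000, startups_surviving_3yr=6, startups_formed=10,
    )
    for metric, value in benchmark_metrics(example).items():
        print(f"{metric}: {value:,.3f}")
```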

But the report notes that its suggestions are limited by "the availability and quality of the underlying data" and that not all knowledge exchange activities – especially those that are not revenue-generating – "are easily amenable to measurement", even though they "may be relevant". It adds that some of the metrics are "naturally volatile": for instance, "one large successful licence will have a very big influence on an HEI's relative performance in an IP income measure".

It also warns that "the success of KE activity is influenced much more by the underlying nature of the organisation than by the efficiency and effectiveness of the KE staff and processes within the organisation…The performance goals of a particular [institution] should be linked to their KE mission, and the indicators chosen [should] vary depending on these goals and the type of KE being examined. No single evaluation mechanism will be suitable for all contexts."

Indeed, for Martin Willis, professor of English at Cardiff University, the knowledge exchange success of more humanities-focused subjects and institutions would be better assessed using REF-style case studies. "Of course, that will be time-intensive and come at a great cost, but if we rely on metrics we are in danger of bias towards hard sciences and business faculties," he says.

Others, though, advocate a more sophisticated metrics-based assessment akin to the "Star Metrics" programme developed by the US National Science Foundation over the past decade ("Star" stands for "Science and Technology for America's Reinvestment: Measuring the Effect of Research on Innovation, Competitiveness and Science"). The system's architect, Julia Lane, an economist now based at New York University, sees metrics as the only method of "responsible, cost-effective" assessment.

"Case studies are a complete waste of time and money," she tells Times Higher Education, pointing to the estimated £250 million cost of developing the REF's 7,000 impact case studies. "Surely the same goals of getting universities to think about what they're doing, providing them with a voice and providing anecdotes could be achieved with much less effort," she says.

Lane is also involved in the UMETRICS programme (Universities: Measuring the Impacts of Research on Innovation, Competitiveness and Science), which aims to build on Star Metrics to construct a comprehensive data platform tracking the number of collaborations between federal agencies and research universities, as well as the flow of these collaborations into the economy. It does this by combining data on the labour, products and services purchased by the universities, the employment of researchers and the activities of the businesses involved. Participation in the programme is voluntary but, as of November, 62 institutions were involved, with that figure expected to grow to 150 by 2020, capturing 90 per cent of government spend on university research and development.

"You could use [this approach] to identify the interesting and striking outliers (at a cost, say, of £2 million) and have the scientific community build a replicable, scientific analysis of [university knowledge transfer activity]," suggests Lane. "Then you could pull out some anecdotes that are illustrative of a broader truth. That would seem to be an improvement over a deluge of unrepresentative and ad hoc stories that could, at worst, be seen as self-serving."

Case studies would also pose a risk of duplicating the REF, whose next iteration, in 2021, will generally require one study for every 15 staff submitted. But the chair of the KEF's technical advisory group, Richard Jones, professor of physics at the University of Sheffield, is keenly aware of the risk of designing "very bad metrics". As he sees it, his group's role is "to think very hard about what is appropriate and how to determine metrics that are actually related to what we are trying to do here." He is also very conscious of the risk of the metrics being gamed: "University professors are very creative individuals, who will always think up ways of making the most of any metrics, so it's about anticipating the response and making sure the outcome is what people want."

When the KEF was first announced, critics were quick to voice fears that the framework would be, at its heart, a patent-counting exercise. However, Jones sees patent-counting as "a classic example of a bad metric" because "the number of patents you have is a function of how much money you've got – and how much money you've spent on them," he says. "That's the sort of thing that is easily gamed [by] spending more money on it [but] that isn't helpful because it's not directly connected to what you're trying to [encourage] here – which is creating value for the wider community. You have to always stop and think: How does this connect to value for the country and communities and regions?"

Hand holding a petri dish

"The KEF brings good news and bad news," says Siraj Shaikh, professor of systems security at Coventry University. As co-founder of CyberOwl, a successful cyber security spin-out company, Shaikh has first-hand experience of the opportunities that successful commercialisation of IP can open up for a department such as his.

"The good news is it will help recognise knowledge transfer – which we need to bring higher up the national agenda," he says. "The downside is that it will encourage the kind of ranking mindset that tends to [incentivise] management and leadership that is sometimes superficial, becoming so obsessed with metrics that the essence of [what is being assessed] is left out."

Shaikh also worries about the administrative burden that the KEF could impose, especially if managers implement the kinds of preparatory mock exercises common for the REF.

There are also those who question the whole rationale for the KEF, arguing that if the UK is indeed worse at scientific commercialisation than it is at discovery, the fault lies not with universities but with industry and the financial sector. One such sceptic is Donald Braben, an honorary professor in the earth sciences department at UCL, and a former civil servant.

"To put the blame for UK businesses' poor performance on universities is, to say the least, grossly unfair," he says. "UK industry has, for many years, invested much less in R&D than its competitors. This shortfall affects its priorities and range of interests."

A much more effective industrial strategy, he believes, would encourage industry to "increase its investment in R&D so that it would better understand [what] a proper relationship with universities [should be]". That would see universities offering "guidance" in "deriving solutions" to industry's problems, but being otherwise left alone to do what they do best: blue-skies research.

"Had government adopted a similar policy to today's immediately after the war, they might have confined universities to looking for better thermionic valves, more efficient aircraft piston engines, or new ways of generating energy from coal. The results would have been disastrous. Instead, freedom was allowed to flourish, for a few decades at least, and the result included such benefits as MRI scanning, genetic fingerprinting and hosts of developments in molecular biology," he says.

But for all the wariness about the KEF, some university leaders are positively excited about it. As an academic whose roots lie in industry, Jane Turner, pro vice-chancellor for enterprise and business engagement at Teesside University, went out of her way to approach Hefce before the current consultation had even formally opened.

"I've been pushing this agenda in academia for 14 years so, for me, the announcement of the KEF was a breath of fresh air," she says. For her, the exercise is a very good fit with Teesside's existing institutional priorities: "It gives what we're already doing legitimacy and a profile. Often in organisations, you're pushing against a lot of cultural barriers, and knowledge transfer is not seen as a core activity. But, for me, it feeds into so many other elements of the student experience, of our responsibility as anchor institutions, and of our environmental impact."

Still, she is worried that the KEF could become too "internally facing" if it is designed primarily for and by academics: "Any consultation should incorporate the views of business and industry, and what they want from university engagement". And she doubts that the KEF will "work for every university".

In Shaikh's view, the KEF "needn't be something to worry about" for the vast majority of the sector. "Most – not all, but most – universities in the UK are already engaging with industry and will already have a story to tell," he says.

Moreover, he thinks that the rationale for the KEF is ultimately unanswerable. "I don't think it's unreasonable to ask universities to improve their game [on knowledge exchange] because institutions in this country benefit from public money," he says. "And it's clearer now – thinking beyond Brexit – that we need to acknowledge what their return is on that."


Library
Source: iStock
Kiwi fruit: how New Zealand assesses knowledge exchange

Universities in New Zealand are awarded extra government funding according to the amount of funding that they raise from firms and not-for-profit organisations – although this is "narrower than what is planned under the KEF", according to Roger Smyth, the recently retired head of tertiary education policy at the country's Ministry of Education.

Institutions' ability to attract external funding accounts for 20 per cent of research funding allocated via the country's core public funding programme, the Performance-Based Research Fund. Quality, assessed via REF-style expert panels, accounts for 55 per cent of funding, and contribution to postgraduate education and research training accounts for 25 per cent of the funding, calculated on the basis of the number of research degree completions.

In the funding formula, income from non-government sources within New Zealand, including industry and the not-for-profit sector, has double the weighting of income from government sources. According to Smyth, this formula was established in 2013-14 to incentivise the pursuit of such funding sources: "in effect, to perform research that is done at the request of firms and that meets the knowledge needs of the funder".

This came in recognition of the increasing proportional reliance of New Zealand¡¯s eight universities on government sources of research income, according to Smyth. It also reflected a wish to ensure that non-government organisations got more benefit from New Zealand¡¯s investment in research capability.

"Another benefit of this change is increased transparency," he adds; previously, "the universities would only provide this information on an aggregated basis, rather than by institution".
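
To make the double weighting described above concrete, here is a minimal sketch of how an allocation of that kind could work, assuming a funding pool is simply split in proportion to each university's weighted external income. It is not the official PBRF formula; the weights follow the description above, but the pool size, institution names and income figures are all invented for illustration.

```python
# Illustrative sketch only: NOT the official PBRF formula. Income from
# non-government New Zealand sources is double-weighted relative to
# government-sourced income, and each university receives a share of a
# notional component pool in proportion to its weighted income.

GOVT_WEIGHT = 1.0
NON_GOVT_WEIGHT = 2.0  # double weighting for industry/not-for-profit income


def weighted_income(govt_income: float, non_govt_income: float) -> float:
    """Apply the double weighting to non-government external income."""
    return GOVT_WEIGHT * govt_income + NON_GOVT_WEIGHT * non_govt_income


def allocate_component(pool: float, returns: dict) -> dict:
    """Split a funding pool between institutions in proportion to weighted income."""
    weighted = {name: weighted_income(g, ng) for name, (g, ng) in returns.items()}
    total = sum(weighted.values())
    return {name: pool * w / total for name, w in weighted.items()}


if __name__ == "__main__":
    # (government research income, non-government NZ research income) in hypothetical NZ$
    returns = {
        "University A": (40_000_000, 10_000_000),  # weighted income: 60m
        "University B": (20_000_000, 25_000_000),  # weighted income: 70m
    }
    for name, share in allocate_component(50_000_000, returns).items():
        print(f"{name}: NZ${share:,.0f}")
```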

All eight universities now operate research commercialisation arms. The largest of these, Auckland UniServices, a subsidiary of the University of Auckland, is New Zealand's largest IP firm, and its activities in commercialisation of IP and commercial research and consultancy contracts generated revenues of NZ$100 million (£53 million) in the 2016 financial year, according to a spokeswoman.

Rachael Pells


Working on a circuit board
Source: iStock
Precision engineering: how should the KEF be designed?

According to the Industrial Strategy White Paper, the KEF will "sit alongside" the TEF and REF to form a "holistic" assessment of universities' "threefold mission" – generating knowledge (research), transmitting knowledge (teaching), and translating knowledge (knowledge exchange).

Throughout history there has been much intellectual disagreement about the purpose or "idea" of a university, so we might wonder whether government really can simply stipulate universities' "mission" in this way. We could have fun identifying some of the oldest universities – Bologna or Oxford perhaps – which, for most of their history, have not fitted this model. Perhaps Oxford was not a university after all (a polytechnic?); maybe Bologna just failed in its "mission"?

While this kind of nit-picking demand for precision is a core part of what academics do, there are more pressing practical issues. Will the KEF ever "sit alongside" the REF and TEF? Will a university's KEF rating (presuming this is how the KEF works) ever affect its reputation as much as a TEF or REF rating?

As recent history has reminded us, predicting the future is a fool's game, so it is wiser to focus on a different question. Should any university value its knowledge exchange activities on a par with its research and teaching?

Distinguishing explicitly between an elite and a mass higher education system is helpful here. The UK has a mass higher education system but if its history were a 24-hour clock then, until about 23.50, the system was solidly elitist. This matters. First, because many criticisms of universities only make sense on the assumption that we have an elite system. While it might indeed be odd for an elite system to offer "golf studies", it is not remotely anomalous in a mass higher education system.

Second, society's expectations of an elite university system will be very different from its expectations of a mass – and much more expensive – system. It is worth labouring this point. Some in the university sector convey the impression that the government's choice is between funding universities and burning money. If that were true, not funding universities would be barbarism. But this is not the choice. There are many worthy competing calls on the public coffers. Hence, it is reasonable for the public to expect universities to value knowledge exchange on a par with teaching and research.

We should also bear in mind the disproportionate role that UK universities play in the country's R&D, accounting for 26 per cent of the total, compared with 14 per cent in the US, 17 per cent in Germany and 13 per cent in Japan. So it is a social necessity for UK universities to do a significant amount of heavy lifting on knowledge exchange.

None of this means that all universities should aim to succeed at all three excellence frameworks, but two out of three seems reasonable, at least for larger institutions. The question for senior management up and down the country over the next few years will be, which two?

But it is not the case that just any old KEF will do. If KEF metrics correlate with REF metrics, why bother with an extra exercise? Equally, if KEF metrics simply overlap or complement REF impact, it would be much better to beef up the REF rather than introduce a new framework.

KEF metrics must also be comprehensive. The exercise must measure more than universities' business interactions. It must capture the widest possible range of knowledge exchange activities – social and cultural, as well as economic.

Most importantly, the KEF should not be conceived as a UK-centric activity. The influence of British universities on knowledge exchange is global, so the KEF should be understood and measured globally. If global knowledge exchange is not measured, less of it will occur; in a post-Brexit outward-facing "global Britain", this would be disastrous.

Admittedly, it is not obvious how to achieve all this. We can hope that the consultation will address many issues, but I would also recommend a KEF pilot exercise – much like the subject-level TEF pilot, in which my institution is participating. We need to get the KEF right, so we should proceed cautiously.

But if the KEF is designed properly, the sector should embrace it. A world of the TEF, REF and KEF is an infinite improvement on the previous, unipolar world of the REF, and all the distortions in institutions' and individuals' priorities that it created.

Graham Galbraith is vice-chancellor of the University of Portsmouth.

POSTSCRIPT:

Print headline: The shape of KEF to come

Reader's comments (1)
In analysing knowledge transfer, surely it's more important to identify what works and why, then promote (and fund) the spread of such activity across ALL universities than just come out with yet another meaningless set of league tables. We need to learn from the idiocy that goes on in schools instead of joining in. Time to reject TEF, REF and the NSS and concentrate on spreading good practice.