Tuition fees are the “major source” of funding for research at the London School of Economics because the institution concentrates on subjects that receive relatively little government funding, the LSE’s former interim director has said.
Dame Judith Rees said the LSE was having to rely “more and more” on tuition income to fund research, which caused a “dilemma” because students were increasingly scrutinising what they got in return for higher fees.
Speaking last week at the annual conference of the Association of Research Managers and Administrators, Dame Judith, co-director of the Grantham Research Institute at the LSE, estimated that about 40 per cent of social science and arts research in universities was effectively “unfunded” and subsidised by teaching income.
“Certainly in my institution, student fee cross-subsidies are the major source of research funding,” she said, adding that this was not usually acknowledged.
She cited LSE research that indicated that more than 70 per cent of research council and quality-related funding goes into science, technology, engineering and mathematics subjects. “We are relying more and more on student fees [to fund research] and this might be a bit of a dilemma because students are now customers and so they are much more conscious about value for money,” she suggested.
Dame Judith, who was interim director of the LSE from May 2011 to September 2012, also claimed that some UK academics were being paid bonuses for publishing in certain journals. She said she had heard of one example in which an unnamed university department had paid £10,000 in bonuses.
She believed such payments were a symptom of the publish-or-perish culture currently rife in academia, although she put this down to the “academic mindset” rather than to government policy or university bureaucracy. Blaming assessments such as the research excellence framework for such problems was a “gross oversimplification”, she explained, as similar trends were occurring in other parts of the world where the REF does not exist.
“When colleagues moan about the REF and what is being imposed on them, I tell them that it is not the government or bureaucracy that is telling them what good research is…or [those] sitting on the appointment panels, [or] who referee journals and do the refereeing process for the research councils. It is academics,” she asserted.
Dame Judith did not deny that the REF has “reinforced these market trends” and argued that there has been “tremendous pressure to conform”.
Academics are under pressure to “become experts in a relatively narrow area”, she added. The trend stemmed from the idea that an academic could rise to the top by specialising, and that applied or multidisciplinary work was viewed less favourably, she explained.
“I don’t deny that the early research assessment exercise panels did exacerbate that problem, but to my knowledge they didn’t start the process,” she said.
Dame Judith also said that promotions and pay rises were “still heavily biased towards research” despite the pressure on research-intensive universities to “up their teaching game”.
“In my experience no one has ever been headhunted for their teaching grants,” she said.
A trend towards performance-related pay could exacerbate this problem further unless equal weight were given to administration and teaching ability, she said.
Time-consuming, inflexible and insensitive: administrators’ view of the REF
Future editions of the research excellence framework should have a better system to recognise cross-disciplinary work and broaden the definition of impact, according to a group of research administrators and managers.
The group of about 40 university staff came together at the annual conference of the Association of Research Managers and Administrators in Blackpool last week to discuss their experiences of the REF.
Top of the administrators’ gripes was dealing with impact – the newly introduced requirement to document the wider benefits of research. The group generally agreed that producing impact case studies took about a third of the total effort required for the entire assessment, with some saying it accounted for more.
The fact that there was just one template for impact case studies caused difficulties for others, who said it should be made more flexible in future to allow for differences in impact between disciplines. Not knowing the benchmark for a good impact case study also caused headaches, as did the fact that the funding councils’ definition of impact meant some research could not be submitted for assessment.
A major problem for some administrators was dealing with academics who had extenuating circumstances. REF 2014 required institutions to provide detailed assessments of academics whose special circumstances, such as long-term illness or maternity leave, might affect their submission.
Delegates said that the requirement to gather specific evidence in this area lacked sensitivity, and some institutions did not hold records of academic attendance that could be used. Those present suggested that the remit of the exercise was expanding, with the introduction of impact in REF 2014 and new open access requirements for future assessments. Others, meanwhile, questioned whether the REF was trying to influence behaviour rather than assess it.