Fresh questions have been raised over Australia’s research assessment exercise, with a former University of Melbourne leader claiming that a supposed escalation in local research quality may in fact reflect a decline in global standards.
In a new analysis, Frank Larkins, a former deputy vice-chancellor of the University of Melbourne, expresses scepticism about the extraordinary improvements tracked by successive Excellence in Research for Australia (ERA) assessments, with the proportion of “above world standard” research in some disciplines almost doubling over six years.
Professor Larkins criticises the Australian Research Council (ARC), which administers ERA, for not releasing more information about the benchmarks it uses to judge world standard. He says the boom in Australian research quality has occurred only in science-related fields, which are largely assessed using citations.
By contrast, humanities and social sciences disciplines – which are gauged using “relatively stable peer review assessment processes” – have shown only modest improvement.
“The different assessment methodologies provide a basis to question the integrity of the excellence findings,” Professor Larkins writes. “Are there fundamental flaws in the world standard benchmarks used in the different approaches?”
Professor Larkins said that increased publishing by academics in the developing world was likely to have exerted downward pressure on the world standard in some disciplines, as measured by average citation rates – a problem long recognised but “apparently not addressed” by the ARC.
The ARC said that it had developed its indicators through extensive consultation with the research community. “All ratings for ERA are the result of expert judgement of committees of distinguished researchers,” it said.
“They interrogate the data…in the context of [the] discipline and make judgements that are broadly comparable across disciplines.”
It stressed that ERA ratings were not based on benchmarks alone, saying a “citation profile” was created for each discipline at each institution. “[This] allows the expert committees to examine the data in different ways,” the council said.
Professor Larkins’ analysis found that in science-related disciplines, 43 per cent of the assessed “units of evaluation” – research outputs produced by each university in each broad discipline – had been rated above world standard in 2012. By 2018, this proportion had shot up to 80 per cent.
In humanities fields, over the same period, the proportion had risen by eight percentage points to 35 per cent.
Professor Larkins said discrepancies in standards could have “very serious” consequences. “The three ERA rounds have demonstrated very clearly that if a university aspires to increase its research standing, investment in science-related disciplines is more likely to provide a dividend than investment in the humanities,” he said.
The ARC said that it would review ERA, along with its engagement and impact assessment, “to ensure that they continue to be examples of world’s best practice and have the ongoing support of the university sector”.