The review of Australia’s research assessment exercise was a “missed” opportunity that generated an “insurmountable” workload ahead of the next round in 2023, according to an influential higher education analyst.
A health check of Excellence in Research for Australia (ERA) and its companion Engagement and Impact (EI) assessment made 22 recommendations to improve and streamline both exercises, while endorsing the “fundamentals” of the two frameworks.
But Frank Larkins said the reviewers spent too long identifying problems and too little time unpicking solutions, flick-passing the technical work to another expert group that had been left insufficient time to complete it.
The obstacles confronting this group “may be insurmountable”, said Professor Larkins, a former deputy vice-chancellor of the University of Melbourne. “The major challenge will be to establish revised methodologies acceptable to the community within a very short time frame.”
Universities will be left grappling with new ERA guidelines just as Covid-19 takes its full toll on their staffing levels, he added.
The Australian Research Council (ARC), which administers both exercises, dismissed the criticisms. “The advisory committee’s purpose was to take a high-level view of the strengths of ERA and EI and identify opportunities for improvement, not to specify all of the technical details of implementation,” a spokeswoman said.
She said the ARC was “working hard” to implement the committee’s recommendations, which had been informed by expert advice and “extensive” consultation. “The ARC has convened a working group…to address some of these details and is consulting with universities to work through others.”
The ARC says new ERA and EI rating scales will be published in the first half of 2022, in time for ERA in 2023 and EI in 2024. Professor Larkins called for a “longer lead time”, given that the assessment outcomes did not influence university funding.
“The ERA and EI exercises, if progressed, should be deferred until at least 2025 and 2026 respectively to give universities time to establish new ‘Covid normal’ operational frameworks,” he said.
But it would be preferable to jettison EI entirely, he added, with overseas attempts to measure research impact proving “fraught with many difficulties”. There were no metrics capable of doing so in a “one size fits all approach…that adequately captures the broad range of research impacts”.
“The expense and time commitments associated with the EI exercise cannot be justified given questionable benefits and the limitations associated with defining and assessing the metrics.”
The ARC stood by EI and said the continuation of both assessments had been a government decision, with the timetable approved by the education minister.
It said the pandemic’s impacts had been a “key consideration” of the review. “To ease the immediate burden for universities, many of the changes…will take place incrementally over future rounds.”
Professor Larkins said ERA should be separated into two distinct categories – one for STEM and another for the humanities, arts and social sciences – with different rating scales and benchmark metrics for each. And he said ERA performance benchmarks should be based on selective rather than global comparisons.
He said some ERA results had stretched credibility, citing the 2018 finding that mathematical research at 93 per cent of assessed Australian universities was above world standard. “It would be more realistic to establish publication and citation benchmarks for science disciplines against a small group of English-speaking developed countries.”