
Australian research assessment ‘too expensive and opaque’

Veteran administrator calls for Excellence in Research for Australia workload and frequency to be slashed
October 20, 2020

Australia’s research assessment exercise has cultivated excellence in styling rather than substance, according to analyst Frank Larkins, who says it should be stripped back to deliver credible results and bang for buck.

In a scathing submission to the Australian Research Council (ARC), Professor Larkins questions whether the “diminishing benefits” from the Excellence in Research for Australia (ERA) exercise warrant the administrative effort and money lavished on it.

Professor Larkins says the ERA has largely achieved its goal of providing an “evaluation framework” to help research funders judge where to invest. But such benefits are overshadowed by the system’s enormous cost, questions over its robustness, suspicions of gaming, fears of unintended consequences and results that appear too good to be true.

“Universities have progressively refined their expertise in submission preparation…but has performance really improved that much?” the submission asks. “More professional administrative reporting structures have masked whether research improvements are real or imaginary.

“It is not appropriate to be promulgating results which imply substantial improvement in the research performance of Australian universities when the results may simply be a by-product of a methodological ‘artefact’.”

Professor Larkins, an honorary fellow and former deputy vice-chancellor of the University of Melbourne, offered the submission in his personal capacity. It is among scores tendered to the ARC’s review of ERA and its companion exercise, the Engagement and Impact Assessment (EI).

The paper argues that the astonishing improvement in results for research in science-based disciplines – and the lukewarm results for the humanities and social sciences (HASS) – are likely to reflect methodological flaws rather than the innate quality of the research.

Around 81 per cent of universities on average were ranked above world standard for research across 11 science-related disciplines in ERA’s 2018 exercise, up from 46 per cent in 2012. The average percentage of universities receiving a similar ranking for their HASS research rose far more modestly, from 28 to 36 per cent.

The submission says science-based disciplines are assessed primarily using citations. HASS disciplines have a stronger emphasis on peer review, where world standard benchmarks appear “more demanding” and less prone to change.

The sole “outlier” among the science-based disciplines – information and computing sciences, in which only 41 per cent of universities were assessed above world standard in 2018 – relies mainly on peer-review assessment, the paper notes.

Professor Larkins said it was impossible to judge the rigour of the ERA’s metrics-based assessment because the ARC would not reveal the benchmarks used to ascertain world standard. “The ARC is saying: ‘Trust us,’ and yet anomalies have emerged,” he told Times Higher Education.

“There’s such a big disparity between the sciences and HASS and such a big change between 2012 and 2018. Are they real or spurious? We don’t know, and the ARC won’t provide us with the data. If universities are going to go to all this work [and] spend tens of millions of dollars to do the exercise, it should be subject to a higher degree of independent scrutiny.”

Professor Larkins insisted that some form of assessment was needed to generate accountability around the A$12 billion (£6.7 billion) spent annually on Australian university research and research training. But the exercise should rely on data already available – through sources like the Australian Bureau of Statistics, the Department of Education, university annual reports and the ARC itself – rather than forcing universities to collect still more data.

He said the two exercises should be rolled into one and conducted once every five years or more. And as in the UK, exercises should be postponed due to Covid-19.

Plans to conduct the exercises in 2023 and 2024 needed to be rethought, given the pandemic’s impacts on university staffing and finances and doubts about the “validity of the data” collected during the crisis. “In light of Covid, it should be pushed out at least a couple more years.”

The ARC said it had received more than 100 submissions about the assessment exercises and intended to publish them after the review was completed by June. But it said the pandemic had already delayed the process by six months and further hold-ups were possible.

john.ross@timeshighereducation.com
