With its methodology all but perfected, Australia's massive research assessment exercise now faces calls for its dismantling.
The elite Group of Eight university network is leading a push for the Excellence in Research for Australia initiative to be either wound back or hardwired into funding arrangements. Chief executive Vicki Thomson said that in its current form, the programme was little more than a cripplingly expensive quality assurance tool.
"ERA requires extraordinary administrative and academic effort, on the part of both universities and the Australian Research Council, for no monetary return," she said. "The rationale for doing ERA every three years is questionable. After four processes it is time to review [its] utility."
ERA's latest round took place last year, with the results – released on 27 March – suggesting substantial improvement in the volume and quality of Australian research. The outcomes of a companion exercise, the pilot Engagement and Impact Assessment, were revealed two days later.
While the Labor opposition is critical of the engagement and impact programme, political support for ERA appears bipartisan. Education minister Dan Tehan said the programme proved that government research spending delivered value to taxpayers.
Labor has promised a "root and branch" review of research if it wins the federal election, which is expected in May. Nevertheless, shadow research minister Kim Carr said that he remained committed to ERA.
"It shapes behaviour and gives you a proper assessment of universities' capacity," Mr Carr told Times Higher Education.
He said Australia's prosperity and social equality hinged on its capacity to develop new technologies. "We've got to invest more in R&D. In so doing, we've got to be able to demonstrate to the public that the money is well spent," he said.
But Mr Carr, who introduced ERA a decade ago when he was research minister, said that the exercise could be used to drive resource allocation if Labor won the election. "I linked it to funding, and if given a choice I would do it again," he told university administrators in Canberra.
The UK's research excellence framework is used to guide the allocation of about £2 billion a year of university research funding. ERA, by contrast, has had no influence on funding since 2016, when the Sustainable Research Excellence programme – which distributed money partly on the basis of ERA results – was scrapped.
Even under that arrangement, ERA determined the allocation of just A$366 million (£200 million) over five years, Ms Thomson said. Yet the assessment costs the ARC, which runs the exercise, between A$3 million and A$5 million a year to administer, while Go8 member institutions expend another A$8 million per round.
Kent Anderson, a strategic adviser at the University of Newcastle, said that ERA was a "colossal" waste of resources for both taxpayers and universities. He said that a new government should muster the courage to discard it "or put some real impact behind it".
Professor Anderson said that the exercise would "at least make sense" if it informed funding allocations from the Research Support Program – which helps cover indirect costs of research – or the Research Training Program, which provides scholarships for research students.
"In the RSP, it makes sense to incentivise quality research by rewarding excellence, as the name of ERA would suggest," Professor Anderson said. "As for the RTP, it is amazing that we let universities supervise PhD students when ERA results show that some are not doing enough research to be assessed, or are doing research well below world standard."
Other suggestions for ERA reform include conducting it only every five or six years, which would bring it closer to the seven-year frequency of the REF. Another idea is to move away from point-in-time stock-takes of research effort and instead strive for continuous assessment, using improved data collection capabilities and researcher identification systems such as ORCID.
Most ERA critics question its rationale rather than its rigour. Rooted in metrics, and measuring quality across 157 disciplines – compared with 34 in the REF's current outing – its methodology was largely bedded down in 2015.
The 2018 round required only minor changes such as discarding esteem measures and removing references to the "commercial sensitivity" of research publications – a logical step, given that ERA rules require submitted articles to be open access.
Frank Larkins, former deputy vice-chancellor of the University of Melbourne, said that, while ERA may be rigorous, it had "served its purpose". He said that the 2012 and 2015 rounds had stimulated change in universities' research strategies, but options for further reform were limited.
He said that some of the changes triggered by ERA had been counterproductive, with the heightened focus on research arguably occurring at the expense of teaching. The scheme had encouraged a tendency to employ academics in teaching-only roles, and skewed PhD enrolments towards foreign candidates.
Professor Larkins has recently released papers outlining the dangers of both trends. He said that their impacts were evident in the latest ERA report, which shows that the rising volume and quality of research has occurred against a backdrop of waning human resources.
In full-time equivalent terms, the number of researchers declined by 7 per cent between the 2015 and 2018 assessments – even though the headcount rose by 13 per cent. Professor Larkins said that this reflected an increased reliance on contract staff and a shift of research-active academics into teaching-only roles.
His recent analysis of staff numbers showed that the tally of teaching-only academics had roughly doubled since 2013, while the number of research-active staff had fallen.
Despite this decline, ERA charted a steady increase in research output. Universities submitted 17 per cent more journal articles and other research outputs than they had for the 2015 assessment.
The number of "units of evaluation" – bodies of research material in each specific discipline submitted by each university – increased by 6 per cent.
This boost in research productivity was particularly pronounced in science, technology, engineering and mathematics subjects, most of which recorded double-figure increases. In chemical sciences, units of evaluation increased by 18 per cent and research outputs by 20 per cent despite a 14.2 per cent decline in the research workforce.
In agricultural and veterinary sciences, where staffing numbers plunged by 15 per cent, units of evaluation and outputs both increased by a similar margin.
But the productivity increases tended to be more modest in the humanities and social sciences. Two fields – "commerce, management, tourism and services", and "language, communication and culture" – recorded falls in both units of evaluation and research outputs, even though their staffing declines were less pronounced than in many STEM fields.
Professor Larkins said that the shift to international PhD students – who accounted for almost four out of five additional doctoral students recruited over the past decade – was a key factor behind these figures. Foreigners overwhelmingly preferred STEM subjects, and – as full-time students – delivered more research papers per year than their domestic peers, who were largely part-time.
Professor Larkins said that international PhD students¡¯ appeal to universities was understandable. But the recruitment trend meant that Australia risked short-changing itself of doctoral graduates in key strategic disciplines.
"If numbers count, the best investment a university can make is to have an international student," he said. "But the best investment a university can make might not be in the best interests of the country."
john.ross@timeshighereducation.com
ERA 2018: research output up, staffing levels down
Broad field | Growth in breadth (%) | Growth in output (%) | Change in staffing (%)
Mathematical sciences | 8.3 | 10.9 | 0.4
Physical sciences | 13.6 | 15.0 | -9.5
Chemical sciences | 18.1 | 19.9 | -14.2
Earth sciences | 1.6 | 20.2 | -13.2
Environmental sciences | 11.3 | 28.0 | -5.7
Biological sciences | 7.2 | 19.9 | -7.9
Agricultural and veterinary sciences | 14.3 | 14.5 | -14.8
Information and computing sciences | 7.5 | 14.0 | -4.7
Engineering | 12.8 | 25.6 | 0.1
Technology | 26.3 | -12.2 | -23.1
Medical and health sciences | 14.3 | 28.1 | -4.1
Built environment and design | 4.5 | 19.7 | -3.6
Education | 0.0 | 7.9 | -13.6
Economics | 0.0 | 5.1 | -2.7
Commerce, management, tourism and services | -8.9 | -1.7 | -6.7
Studies in human society | 7.4 | 8.3 | -7.7
Psychology and cognitive sciences | 20.5 | 21.9 | 6.8
Law and legal studies | 10.0 | 14.2 | -5.7
Studies in creative arts and writing | -15.7 | -4.9 | -12.9
Language, communication and culture | 0.0 | 3.0 | -13.7
History and archaeology | -2.3 | 5.7 | -8.6
Philosophy and religious studies | -4.7 | 4.7 | -8.1
Source: ERA 2018, Australian Research Council. Figures reflect changes between ERA's 2015 and 2018 reporting periods. Breadth is the number of units of evaluation assessed at the specific discipline level. Output is the number of publications and other research outputs submitted, with weighted output used in fields in which peer review was conducted. Staffing is the number of researchers on a full-time equivalent basis.