
Researchers 'unaware' of extent of assessment influence

Danish-Australian paper finds researchers unconsciously adjust to accommodate assessment exercise
February 14, 2020

Academics are unaware that research assessment fundamentally shapes their work practices, even though they consciously endeavour to turn assessment exercises to their strategic advantage.

A study has uncovered ambiguous attitudes to research assessment exercises, with some academics – particularly those in precarious employment – intensely aware of them, while others are all but oblivious.

Many researchers treat research assessment as a game of "hunting for points" and make minimal efforts to "comply on the side" without allowing the assessment process to derail their research focus, the study found.

Despite this, the process has a dominating influence not only on where and what researchers publish but also on the nature of their collaborations, fieldwork, peer reviewing and journal editing, and even the types of research they consider possible, according to the paper, published in the journal Studies in Higher Education.


The study investigated Danish academics' attitudes to their country's national research assessment exercise, the Bibliometric Research Indicator (BFI). It found that the BFI guided considerations well beyond which journals researchers should target.

"They felt under pressure to produce publications quickly in high-quality international journals," said Julie Rowlands, an associate professor in education leadership at Deakin University, who co-authored the paper with Susan Wright from Aarhus University in Denmark. "That influenced the types of projects they chose, how they pursued them, how they framed them – fundamental decisions about the nature of the research, not just what they published out of it.


"And they weren't aware of that until we pointed it out, based on some of the responses they had made."

The research cites the example of a postdoctoral researcher who insisted that the BFI did not affect how she approached her research.

Later in the same interview she related how she had sent PhD students the "BFI lists" of journals that attracted assessment points, and encouraged them to target the high-scoring publications. When reminded that she had earlier claimed that the BFI did not affect her work, she conceded that it exerted more influence than she had realised. "It kind of infiltrates your brain," she said.

Dr Rowlands speculated that academics regarded themselves as inhabiting "parallel universes". One revolved around disciplines while the other involved employment obligations to universities.


"[They] recognised that they were in those two spaces, but they didn't see them as connected," she said. "We found that what happened in one universe did affect the other."

The research involved interviews with academics in the natural sciences and humanities departments of an unspecified Danish university. Dr Rowlands said the paper was based on "a small case study" and made no claims over whether other countries' assessment exercises would influence researchers in similar ways.

But prior research suggested that they probably did, particularly in nations with metrics-based assessment systems such as Australia and New Zealand.

Dr Rowlands added that the UK's research excellence framework was unlikely to have such a marked influence because it was not a bibliometric system. Assessments of UK researchers were based on judgements of their work by panels of disciplinary experts, rather than rankings of the journals in which they had published.


"But the academics are still being measured, monitored and reported," she noted, adding that "the very act of measuring and monitoring" influenced people's conceptions of merit. "There is still very much the potential for [this] process to affect the nature of research practice," she said.

john.ross@timeshighereducation.com

Reader's comments (1)
"Dr Rowlands added that the UK¡¯s research excellence framework was unlikely to have such a marked influence because it was not a bibliometric system. Assessments of UK researchers were based on judgements of their work by panels of disciplinary experts, rather than rankings of the journals in which they had published." This may be true for some disciplines or what it says in official REF guidelines. Yet, the ugly truth and practice on the ground in the UK paints a different picture. For example, journal lists are frequently used in business schools or economics departments (the infamous ABS list or FT list come to mind) in conjunction with and for internal REF assessments and promotion/permanency decisions. Others use bibliometric data such as impact factors or citation rates as proxy for (internal) REF assessments. First lesson about the UK, there are official guidelines and policy documents and then there are custom and practice. The latter usual trumps the former.