
Most European campuses 'use journal impact factor to judge staff'

Preliminary results of EUA survey suggest three-quarters of responding institutions draw on much-criticised metric
October 3, 2019

European universities continue to rely heavily on publication metrics – in particular, the much-criticised journal impact factor – when assessing academic performance, a study suggests.

However, the results of a survey of about 200 institutions by the European University Association indicate that one of the main obstacles to reforming research assessment is resistance to change from academics themselves.

The survey, to be published in full later this month, says that research publications and attracting external research funding were rated most highly when universities were asked which type of work mattered most for academic careers, selected as being important or very important by 90 per cent and 81 per cent of respondents respectively.

Asked how academic work was evaluated for career progression decisions, publication and citation metrics ranked top, selected as important or very important by 82 per cent of respondents.


And, despite criticism of the journal impact factor – a citation-based evaluation of the periodical in which an academic's work is published, not an assessment of the impact of the paper itself – three-quarters of institutions said that they used it to evaluate staff performance, more than any other metric. Academics argue that the journal impact factor is an unfair metric and can be open to manipulation.

Seventy per cent of respondents to the EUA survey said that they used academics¡¯ h-index, a measure of productivity and citation impact, as part of their assessments.


Bregt Saenen, the EUA's policy and project officer, said that the widespread use of journal impact factors was "one of the most disappointing results from the survey".

"The quality of a journal article should be assessed based on the merit of the research/article itself, not on the reputation of the journal in which the article is published," he said.

Dr Saenen said that universities needed to move towards "a less limited set of evaluation practices to assess a wider range of academic work".

With this in mind, Dr Saenen said that it was "encouraging" to see other areas being regarded as important or very important by respondents, such as research impact and knowledge transfer (68 per cent), supervision (63 per cent) and teaching (62 per cent).


Seventy-four per cent of respondents said that qualitative peer review assessment was an important or very important factor in career progression decisions.

However, asked what the main barriers to reforming research assessment were, 33 per cent of respondents cited resistance to reform from academics themselves. This was among the most common responses, alongside the complexity of reform (46 per cent), lack of institutional capacity (38 per cent) and concern over increased costs (33 per cent).

Dr Saenen said that key barriers for universities were likely to be "accountability to research funding organisations and governments in their approach to research assessment, as well as the influence of the competitive environment in research and innovation".

Reviewing research assessment procedures "is a shared responsibility and will require a concerted approach", he said.


anna.mckie@timeshighereducation.com

POSTSCRIPT:

Print headline: Citation metrics reign supreme

Reader's comments (4)
One needs to examine the publications more carefully, instead of accepting them at face value. At Curtin University in Western Australia, the Head of the School of Management published his own journal to ensure that selected students would rack up enough publications to be promoted. None of it was properly "peer reviewed", and fewer than 10 copies of each edition were printed. These publications were never sold, but given away to the Head of School's mates. There was a big issue with plagiarism during this Head of School's tenure, and bribery between students and staff was investigated. Be careful what you wish for!
Even for papers published in highly ranked journals, the modal number of citations per paper is 0 or 1. So if each paper is scrutinised for its own impact, measured impact will drop across the board. This indicates that judging papers by journal impact factor inflates their apparent impact, which partly explains why researchers are resistant to other, more holistic indices.
It is truly pathetic when only just over 60% of universities say that supervision and teaching are important measures of academic success. Have they completely forgotten what "the academy" is all about? If they only care about research, they should be research institutions, not universities.
It is true that publication rate, grant income and citation rate are very imperfect measures of academic productivity. So the solution is a simple one... suggest alternative and better (more objective and more quantitative) measures to use instead. Or could it be that quantification of performance in academia is a flawed idea in itself?