
University rankings are thriving: just look at the data

Reports of the demise of university league tables are greatly exaggerated. Our rankings are growing in size and influence, writes David Watkins
February 15, 2024

Times Higher Education is at its core a data company.

The rankings we publish are based on data collected from universities, research papers and academics. The products and services we provide to universities and governments to help them achieve their goals use data as a backbone.

For that reason, it’s only right that we judge the success of our rankings on data, just as any respectable discussion of university rankings should. As the American scholar W. Edwards Deming reportedly said: “In God we trust. All others must bring data.”

So, what does the data say?

Increasing participation

Data trends paint a compelling picture of our growing influence. Take, for example, our Impact Rankings, which evaluates universities’ progress towards the United Nations’ Sustainable Development Goals. Its debut in 2019 saw 556 universities ranked; this year, 2,152 institutions have submitted data for the 2024 ranking, scheduled for release in June. This staggering 287 per cent increase underscores not only the escalating commitment of universities worldwide to sustainability, but also their appetite for rigorous assessment of it.
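
For readers who want to check the arithmetic behind that figure, here is a minimal sketch (in Python; the function name is ours, used purely for illustration) of how the percentage increase follows from the two participation counts quoted above:

```python
def percentage_increase(initial: int, latest: int) -> float:
    """Percentage growth from an earlier count to a later one."""
    return (latest - initial) / initial * 100

# Figures quoted above: 556 universities ranked in the 2019 debut edition,
# 2,152 institutions submitting data for the 2024 Impact Rankings.
print(f"{percentage_increase(556, 2152):.0f} per cent")  # prints: 287 per cent
```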

We also have plenty of feedback from university leaders confirming this trend. And we know that we not only measure universities but help drive them to improve; our Impact Rankings have helped universities focus more on the sustainability agenda.

Participation in our World University Rankings, which uses a broad set of metrics for research-focused universities, has more than doubled over the past six editions, climbing from 1,258 universities in 2019 to 2,673 in 2024.

Academic endorsement

But what about academics themselves? Every year, we run an invitation-only global reputation survey asking cited academics who have published in the past five years for their views on the best universities in their area of specialism. The responses are used to create our World Reputation Rankings, and they also feed into our World University Rankings and all WUR subsidiary rankings.

It seems academics approve of our approach; the number of responses to the survey has increased more than fivefold since 2021.

Growth in quality data points

The surge in participation has a beneficial by-product: a dramatic increase in high-quality data about global higher education. Over the past four years, the volume of data points fuelling our deep analytics, offered to universities and governments, has expanded eightfold. (This doesn’t include the more than 100 million citations we have access to via bibliometric data provided by our partner Elsevier.)

From all the data available to us, it’s clear that our university rankings are becoming more popular and more important across the world, and that respect for them among academics is growing. Of course, we can’t and won’t rest on our laurels; we know that rankings aren’t perfect and, as responsible rankers, we are committed to continual improvement, as demanded by higher education leaders and our ranking advisory boards. For instance, this past year we strengthened our World University Rankings methodology, a change that has been met with positive feedback from across the sector.

To paraphrase Mark Twain, the reports of university rankings’ demise are greatly exaggerated.

David Watkins is managing director of data science at Times Higher Education.

Reader's comments (3)
One could have also gone for a more self-critical look at THE's own role in driving increasing expenditure away from higher education and the student experience, towards armies of consultants, bean counters and accountants to get all that 'high quality data' (barf) in order. All that money could have been invested in good education of our future generations, but now it clearly is not. Outstanding data scientists would not fall in love with the data that they have, but probe a bit further, and ask universities directly whether they think the current ranking system is fair, or whether they actually feel free to pull out of your wonderful rankings, without massive negative consequences to their bottom line?
I would not be proud of creating division in higher education and taking away resources from education and research to spend on administrators trying to spin data, trying to increase `student satisfaction` at all cost. We know that student satisfaction has nothing to do with learning, it usually means how easy it is to get a first. We all know that publication counts and citation numbers have nothing to do with quality of research. Grant income has nothing to do with quality of output either. Once a metric becomes a target, it loses its meaning and that is what ranking achieves.
"It seems academics approve of our approach; the number of responses to the survey has increased more than fivefold since 2021." I think that sums up everything we need to know about the quality of this data. We don't participate because we like what you're doing, we participate because if we don't we'll lose out to others who are better at gaming the system.