
Rankings are changing: WUR 3.0 will be more robust and insightful

Next year will see a significant update to our methodology to better reflect universities' work. Duncan Ross and Billy Wong explain
October 19, 2022

Editor's note: This article refers to methodological updates to the THE World University Rankings. Our 'WUR 3.0' methodology has now been confirmed, and you can read details about it here.


Times Higher Education's World University Rankings are the leading global league tables of university performance. Created in 2004, the rankings have grown from just 200 institutions to just under 1,800. They rank research-intensive institutions across a wide range of metrics, which fit under the four pillars of teaching, research, industry links and internationalisation.


Since the rankings were launched 18 years ago, the methodology has been tweaked several times but altered substantially just once, in 2011. Those changes ensured that it remained robust as the global higher education landscape expanded and became more international. We believe it is now time for another significant update to the methodology, so that it continues to reflect the outputs of the diverse range of research-intensive universities across the world, now and in the future. We're in a world where students and faculty increasingly look to travel to study and work, and where the focus isn't exclusively on the research output of universities.

The new iteration of the methodology will focus on three main areas:


  • A wider range of bibliometric measures that give more insight and stability from year to year
  • Improvements to the international metrics to better reflect country size and diversity
  • An expanded role for knowledge transfer, including a new metric.

Bibliometric measures

The World University Rankings focus heavily on research-intensive institutions across the world. As such, the quality of their research outputs is one of the most important measures. Currently we use average field-weighted citation impact (FWCI) to assess the overall research quality of an institution.

The FWCI of a publication measures the number of times it is cited by other academic publications, compared with the average number of citations of publications of the same type, the same subject and the same year of publication. The FWCI can range from 0, for publications that have never been cited, to infinity. The average FWCI of all publications is 1.
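
To make the definition concrete, here is a minimal Python sketch. In practice the expected citation count for comparable publications comes from a bibliometric database; the figures below are invented purely for illustration.

    # Illustrative sketch only: the FWCI of a single publication, from the
    # definition above. "expected_citations" (the average citation count of
    # publications of the same type, subject and year) would come from a
    # bibliometric database; the numbers here are made up.
    def fwci(citations: int, expected_citations: float) -> float:
        return citations / expected_citations

    print(fwci(12, 4.0))  # a paper cited 12 times against a field average of 4 -> FWCI 3.0
    print(fwci(0, 4.0))   # an uncited paper -> FWCI 0.0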

The average FWCI of an institution is the arithmetic mean of the FWCI values of all of its academic publications within a particular window. The World University Rankings use a five-year window.

The FWCI has served us well for many years as a measure of overall quality. Over time, however, more and more up-and-coming institutions have joined the rankings. These institutions tend to have fewer publications, so their average FWCI is more susceptible to undue influence from a small number of papers with an exceptionally high FWCI.

We are replacing the current citation metric based on the FWCI with three new metrics, each measuring a different aspect of research quality.

We want to continue to measure average research quality based on the FWCI, but without it being skewed by papers with an exceptionally high FWCI. Therefore, we will measure the 75th percentile FWCI instead. According to our internal modelling, this measure is robust against outliers and reflects overall research quality better than the arithmetic mean does.
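
The effect is easy to see with made-up numbers. In the Python sketch below, a single exceptionally cited paper moves the arithmetic mean dramatically but barely shifts the 75th percentile; all FWCI values are invented for illustration only.

    import numpy as np

    # Made-up FWCI values for a small institution over a five-year window.
    fwci_values = np.array([0.4, 0.7, 0.9, 1.1, 1.2, 1.3, 1.5, 1.8, 2.0, 2.2])
    with_outlier = np.append(fwci_values, 60.0)  # add one exceptionally cited paper

    print(np.mean(fwci_values), np.mean(with_outlier))
    # the arithmetic mean jumps from 1.31 to about 6.65 because of a single paper

    print(np.percentile(fwci_values, 75), np.percentile(with_outlier, 75))
    # the 75th percentile barely moves: about 1.73 versus 1.90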

In addition, we want to measure the amount of excellent research output from institutions. Excellence here is defined as the best 10 per cent of publications by FWCI. An institution's performance in this measure is calculated as its number of publications within the top 10 per cent, normalised by the size of the institution.
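
A sketch of how such a measure might be computed is shown below; it assumes, purely for illustration, that "size" is approximated by an institution's total publication count and that the global top 10 per cent threshold is already known. Neither assumption is a confirmed part of the methodology.

    import numpy as np

    # Hypothetical sketch: share of an institution's publications in the
    # global top 10 per cent by FWCI. The threshold (the global 90th
    # percentile of FWCI) and the publication-count normalisation are
    # illustrative assumptions.
    def excellence_share(inst_fwci, global_top10_threshold):
        top_count = np.sum(np.asarray(inst_fwci) >= global_top10_threshold)
        return top_count / len(inst_fwci)

    papers = [0.3, 0.8, 1.1, 2.7, 3.4, 0.9, 1.6, 5.2]   # made-up FWCI values
    print(excellence_share(papers, 2.5))                # 3 of 8 papers -> 0.375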

Finally, we want to measure the level of thought leadership, or influence, an institution has in each subject. This is different from traditional FWCI calculations in the sense that we consider not only the number of citations, but also how influential the citing papers are. This is similar in idea to Google's PageRank algorithm.
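
As a rough illustration of the idea (not the algorithm that will actually be used, which has not been published in detail), the Python sketch below runs a standard PageRank over a tiny, invented citation graph.

    import networkx as nx

    # Toy citation graph: an edge runs from the citing paper to the cited
    # paper. Papers and citation links are invented for illustration.
    citations = [("A", "D"), ("B", "D"), ("C", "D"),  # D is cited by three papers
                 ("D", "E"),                          # E is cited once, by the influential D
                 ("A", "F")]                          # F is cited once, by a low-impact paper
    G = nx.DiGraph(citations)

    scores = nx.pagerank(G, alpha=0.85)
    print(sorted(scores.items(), key=lambda kv: -kv[1]))
    # E outranks F even though each is cited exactly once, because E's
    # citation comes from a paper that is itself heavily cited.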


International outlook

We believe that internationalisation deeply strengthens universities. Knowledge is inherently global, and the collaboration of people from different backgrounds makes for a better learning and research environment.


To help us evaluate this, we currently have three metrics: proportion of international students; proportion of international staff; and proportion of publications with at least one co-author from an international institution.

For the next generation of our rankings, we will introduce a new metric and make some changes to the current ones.

For several years, we have been producing a teaching-focused ranking of Japanese universities: the Japan University Rankings. As well as including metrics on the proportion of international students and international staff, this ranking examines another aspect of internationalisation: the provision of international learning opportunities for domestic students. We measure this by collecting data on the number of students on international exchange programmes (in essence, a measure of outbound student exchange). This has proved a useful data point, and we will include this metric in the World University Rankings.

For international metrics, large countries, some of which may be home to many diverse cultures, are at a disadvantage when compared with smaller countries and regions. Students can travel hundreds of kilometres in the US and still be in the same country. That is a luxury that doesn't exist in Luxembourg or Qatar or many smaller nations. The result: it is "easier" for universities in Luxembourg and Qatar to do well in these measures.

After much experimenting, we think a better approach is to adjust the data to account for population size. This moderates the impact of size discrepancies while still providing valuable insight into the positive role of international activity.
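
The exact adjustment has not been finalised. One standard way to make this kind of correction, offered here purely as a hypothetical sketch and not as the confirmed method, is to regress the raw international share against the logarithm of the home country's population and score institutions on the residual.

    import numpy as np

    # Hypothetical sketch only: score international-student share relative
    # to what would be expected for a country of a given population size.
    # (international student share, home country population) -- invented figures
    data = [(0.45, 0.6e6), (0.30, 5.5e6), (0.20, 67e6), (0.10, 84e6), (0.12, 330e6)]

    shares = np.array([d[0] for d in data])
    log_pop = np.log([d[1] for d in data])

    slope, intercept = np.polyfit(log_pop, shares, 1)   # simple linear fit
    residuals = shares - (intercept + slope * log_pop)  # positive = more international than expected
    print(np.round(residuals, 3))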

Knowledge transfer

One measure of a higher education institution's success is its positive impact on wider society. In our current methodology, we use the amount of research funding provided by industry as a proxy for knowledge transfer between industry and academia. However, this measure alone is not sufficient: funding is an input, whereas knowledge transfer is an output.

We will introduce a new metric that measures how much of an institution's research is cited by patents. This provides a more direct measure of knowledge transfer, and is one that we already use within the THE Impact Rankings (in SDG 9: Industry, Innovation and Infrastructure).
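
A minimal sketch of the underlying idea, with invented counts and without the normalisation details that have yet to be confirmed:

    # Share of an institution's publications cited by at least one patent.
    # The patent-citation counts below are invented for illustration.
    patent_citations_per_paper = [0, 0, 3, 0, 1, 0, 0, 2, 0, 0]

    cited_by_patents = sum(1 for c in patent_citations_per_paper if c > 0)
    share = cited_by_patents / len(patent_citations_per_paper)
    print(f"{share:.0%} of publications are cited by at least one patent")  # 30%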

Moving forward

The first edition of the World University Rankings that implements the new methodology described above will be the 2024 edition, to be launched in early autumn next year.

These changes, together with additional work we are undertaking to improve the calculation approaches, will make the World University Rankings more insightful and robust, but with minimal additional data collection requirements for universities.

We also hope to be able to open the World University Rankings, and associated subject rankings, to more universities to reflect a broader view of research-based higher education.

Duncan Ross is chief data officer, and Billy Wong is principal data scientist, both at Times Higher Education.

