World University Rankings 2024: changes to our methodology

The 20th edition is our most robust, inclusive and global ranking, thanks to significant improvements to our methodology
September 20, 2023

Browse the full results of the World University Rankings 2024

The Times Higher Education World University Rankings have changed significantly since the first edition in 2004. But then so has the world of higher education.

Higher education has become more international and less focused on the wealthier nations (although they still account for most of the top universities). The US dominates slightly less, Asia slightly more.

The World University Rankings have grown from only 200 institutions to just over 1,900 this year.

This year's ranking – the 20th edition – marks the second time that the methodology has been significantly updated (the last substantial alteration was in the 2011 edition). We believe these changes are necessary so that the ranking continues to reflect the outputs of the diverse range of research-intensive universities across the world, now and in the future. The decisions were made after extensive open discussion and consultation, and they have been carefully considered and evaluated to maintain the robustness of our World University Rankings.

The goal of the World University Rankings, though, remains the same: to help explore which universities are the strongest in the world when it comes to the research mission. We take a broad look at research – we still believe that the best research informs (and is informed by) teaching, is international, and links back to the needs of commerce and industry.

Pillars

We have retained the five pillars that guide our methodology, although we have renamed three of them:

  • Research becomes Research environment
  • Industry income becomes Industry
  • Citations becomes Research quality

We believe that these new names better reflect the metrics that the pillars contain.

Metrics

The biggest change to our metrics occurs within the Research quality pillar. We used to have a single metric covering this area – field-weighted citation impact. We have retained that metric, but at a much reduced level (it is now worth 15 per cent of the overall score, down from 30 per cent), and we have supported it with three new metrics (a sketch of how the weights combine follows the list):

  • Research strength (5%) – a guide to how strong typical research is, based on the 75th percentile of field-weighted citation impact
  • Research excellence (5%) – a guide to the amount of world-leading research at an institution, based on the volume of research in the top 10% worldwide
  • Research influence (5%) – a broader look at excellence, based on the volume of research recognised by the most influential research in the world.
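To make the weighting concrete, here is a minimal sketch of how the four metric scores might combine into the pillar's 30 per cent share of the overall ranking. The function names and the assumption that each metric arrives as a 0–100 score are ours, not THE's, and the percentile calculation simply illustrates the published description of research strength rather than the production pipeline:

```python
import numpy as np

# Weights for the four Research quality metrics, as percentages of the
# overall ranking score (15 + 5 + 5 + 5 = the pillar's 30 per cent).
RESEARCH_QUALITY_WEIGHTS = {
    "citation_impact": 15.0,     # field-weighted citation impact (was 30%)
    "research_strength": 5.0,    # 75th percentile of citation impact
    "research_excellence": 5.0,  # volume of research in the top 10% worldwide
    "research_influence": 5.0,   # volume recognised by the most influential research
}

def research_strength_score(fwci_per_paper: np.ndarray) -> float:
    """Illustrative only: the 75th percentile of a university's
    distribution of field-weighted citation impact."""
    return float(np.percentile(fwci_per_paper, 75))

def research_quality_contribution(scores: dict) -> float:
    """Combine 0-100 metric scores into the pillar's contribution
    to the overall score (at most 30 points)."""
    return sum(scores[m] * w / 100.0 for m, w in RESEARCH_QUALITY_WEIGHTS.items())

# A university scoring 80, 70, 60 and 65 on the four metrics contributes
# 80*0.15 + 70*0.05 + 60*0.05 + 65*0.05 = 21.75 of the 30 available points.
print(research_quality_contribution({
    "citation_impact": 80,
    "research_strength": 70,
    "research_excellence": 60,
    "research_influence": 65,
}))
```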

Maintaining the existing measure increases year-on-year stability and, crucially, supports universities that are using it to track their own progress.

Joining the industry income measure in the Industry pillar is a new metric on patents. This measure explores how often a university's research is cited in patents. Each of these metrics is worth 2 per cent, bringing the overall weighting for this pillar up from 2.5 per cent to 4 per cent of our ranking.

As a result of this, we have slightly reduced the Teaching and Research environment pillars from 30 per cent to 29.5 per cent and 29 per cent of the ranking, respectively.
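The arithmetic balances: the 1.5-point increase to Industry is offset exactly by the 0.5-point and 1-point reductions to Teaching and Research environment, leaving the remaining pillar weights untouched. A quick check, using the long-standing 2011–2023 weights as the baseline (International Outlook's 7.5 per cent follows by subtraction, since the pillars must sum to 100):

```python
# Pillar weights as percentages of the overall score. The old values are
# the 2011-2023 methodology; the new values are as described above.
OLD = {"Teaching": 30.0, "Research": 30.0, "Citations": 30.0,
       "International Outlook": 7.5, "Industry Income": 2.5}
NEW = {"Teaching": 29.5, "Research environment": 29.0, "Research quality": 30.0,
       "International Outlook": 7.5, "Industry": 4.0}

assert sum(OLD.values()) == 100.0 and sum(NEW.values()) == 100.0
```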

Our two reputation-based metrics – teaching reputation and research reputation – benefit from our new reputation survey. This has grown from approximately 10,000 respondents annually to more than 35,000. As well as giving us greater visibility of academics' views, our new approach has enabled us to introduce a cap on the number of votes that can be given to an institution by academics at that same institution.

We still think that academics should be allowed to vote for their own university – they are, after all, committing their future to the institution, often because they believe it is among the best in their field – but such votes will now be limited to 10 per cent of the total number of votes cast for the university. This will have only a very limited effect in practice, but it guards against the possibility of undue influence in the future.
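THE has not published the exact mechanics of the cap, so the sketch below is a hypothetical illustration: it assumes the 10 per cent limit applies to the post-cap total of counted votes, and the function and variable names are ours.

```python
def cap_self_votes(self_votes: int, other_votes: int, cap: float = 0.10) -> int:
    """Hypothetical illustration: limit votes from a university's own
    academics to `cap` (10%) of the total votes counted for it.

    Assumes the cap applies to the post-cap total, so self-votes may be
    at most cap / (1 - cap) of the external votes; THE's rule may differ.
    """
    max_self = round(other_votes * cap / (1.0 - cap))
    return min(self_votes, max_self)

# A university with 900 external votes keeps at most 100 self-votes:
# 100 / (900 + 100) = 10% of the counted total.
print(cap_self_votes(self_votes=250, other_votes=900))  # -> 100
```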


Finally, we have altered the normalisation approach for the three measures in the International Outlook pillar to take account of the population of a country.
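The article does not spell out the formula, but the intent can be illustrated: a country with a very large population has a far bigger domestic recruitment pool, which depresses raw proportions of international staff, students and co-authorship. Everything in the sketch below – the logarithmic scaling, the reference population, the function name – is a hypothetical illustration of population-aware normalisation, not THE's actual method:

```python
import math

def population_adjusted(raw_proportion: float, population: int,
                        reference_population: int = 10_000_000) -> float:
    """Hypothetical illustration: scale a raw international-share metric
    by a population-based factor, so that universities in populous
    countries are not penalised for a large domestic pool."""
    factor = math.log(population) / math.log(reference_population)
    return raw_proportion * factor

# Two universities with 20% international staff, in countries of
# 1.4 billion and 5 million people respectively.
print(population_adjusted(0.20, 1_400_000_000))  # boosted (~0.26)
print(population_adjusted(0.20, 5_000_000))      # slightly reduced (~0.19)
```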


Behind the scenes

Less visible, but worth mentioning, are the changes we have made to the back end of the rankings process.

We have now fully transitioned to a new rankings engine, one made by and for our rankings team. This allows us to build, experiment and evaluate without the need to bring in our colleagues in software engineering – enabling us to explore the data more easily and to be more responsive to the changing needs of the higher education sector.

Over the next few years, we will be rolling out a new data collection system, which we hope will make the lives of data submitters easier.

We have also devoted more time and energy to data validation than ever before. Every year, we have thousands of discussions with the institutional research teams at universities to validate information that has been submitted. This effort will be expanded as we go forward.

What impact have the changes had?

As expected, changing the methodology has had an effect on the overall ranking. Having stress-tested the changes, we think they have made the results more reliable and have resolved some of the unusual edge cases that we have been (correctly) criticised for in the past, especially some odd citation results.

The changes to the international metrics are more of a nudge than an overhaul, designed to avoid penalising universities in countries with large populations.

As always when there are changes to the overall methodology, we urge readers to be careful when making direct comparisons to previous years.

I would like to thank the World University Rankings advisory board for their thoughtful input, which helped us to revisit some of our assumptions.


Duncan Ross is chief data officer at Times Higher Education.
