It’s become a cliché to start a discussion of university rankings with the line: “Love them or hate them…” But in today’s climate it is right to acknowledge that plenty of people in higher education now simply love to hate them.
University rankings have been under a great deal of scrutiny lately, with some even facing boycotts. But at Times Higher Education we are proud to have secured unprecedented participation in our forthcoming World University Rankings. The 20th annual edition of the THE World University Rankings, published on 27 September, will include almost 2,000 universities, after a record 2,673 universities from 127 countries volunteered for assessment.
So why are Times Higher Education’s world rankings avoiding the pushback faced by other systems? I’d say it is because while most university rankings are done to universities, THE’s are done with universities. Here are 10 ways THE has worked collaboratively over the past 20 years to ensure our global rankings meet the needs of the sector today and will continue to do so for the next 20 years and beyond.
1. Our rankings are voluntary
THE is proud that the universities included in our suite of international rankings voluntarily sign up to be included in the analyses. To be ranked, universities register for THE’s secure data collection portal and submit and sign off core institutional data. This builds full trust and transparency.
2. Our rankings offer deep data insights
THE’s rankings are not an exercise in media “clickbait” or a marketing-led beauty parade. They are a serious piece of research, trusted not just by prospective students but by university leaders to set strategy and by governments to inform policy. This year’s world rankings will be built on 17 separate performance indicators covering the full range of a global, research-intensive university’s missions: teaching, research, knowledge transfer and international outlook. Some 16.5 million research papers and 134 million citations of those papers have been examined to bring you the 2024 world rankings.
3. We ensure that arts, humanities and social sciences are given equal value to STEM subjects
THE’s world rankings were developed to represent the full diversity of global research universities, respecting their different sizes, histories and research profiles. While some global rankings’ analyses of research paper citations favour more highly cited STEM fields, THE ensures that citations data is fully normalised for more than 300 fields, as well as by publication type and publication year. This means that impactful work in fields that are innately less highly cited, such as arts and humanities, is treated fairly.
4. Our academic reputation data is statistically representative of global scholarship
THE uses an annual survey of academic reputation to inform the rankings, and September’s ranking will include data from 68,402 survey respondents from 166 countries, giving us almost 1 million votes for 6,600 universities. But a large sample is inadequate if it is not truly representative. THE takes steps to ensure only peer-reviewed, published scholars are surveyed, demonstrating expert subject knowledge, and that our survey samples are statistically representative of the true geography of research. No volunteers or nominations are allowed in our strictly invitation-only system, which THE manages directly in-house in 12 languages.
5. We have invested significantly in an award-winning data team
While many may know THE primarily for its journalism, the rankings are not a journalistic exercise. The rankings are built by a team of more than 20 data specialists, led by chief data officer Duncan Ross, one of DataIQ’s top 100 most influential people in data, and David Watkins, this year named data science managing director of the year by SME News. They take full, in-house responsibility for all our rankings.
6. We are clear that there is no single ‘correct’ ranking or a definitive ‘best’ university
While the THE World University Rankings are often seen as the international “gold standard”, we are clear that they are by no means definitive – they focus only on universities with a significant, global research mission and the indicators are set accordingly. Over the years we have significantly democratised rankings, with separate rankings focused on the student experience and student success, while others provide clear regional context (for example our Sub-Saharan Africa University Rankings in partnership with the Mastercard Foundation). Another pioneering exercise focuses on the social and economic impact of universities through the lens of the United Nations’ Sustainable Development Goals. There is no one-size-fits-all set of metrics and indicators.
7. We pride ourselves on transparency
In 2009, when we separated from an initial early rankings partner, THE not only carried out a global survey of hundreds of rankings users to understand their needs, we appointed a 50-strong, international advisory board to feed into a one-year consultation process to develop a fresh methodology. We have prided ourselves on our openness ever since: inviting professional services firm PricewaterhouseCoopers to audit the rankings calculations; launching rankings at masterclass sessions with detailed, public scrutiny of the data; and lately, setting up fresh advisory boards to oversee developments across our range of specialist and regional rankings.
8. We are committed to global inclusion
The first edition of the world rankings in 2004 contained just 200 universities, primarily from North America and Western Europe, with no African or Latin American universities included. September’s 20th edition will have more universities from Asia than from any other continent, and will include 119 from South America and 113 from Africa. Our metrics, while reflecting the geopolitical reality of an unequal world, have geographical fairness built in: with purchasing power parity applied to financial data, for example; with surveys going out in multiple languages to a truly representative global sample; and with the scaling of most metrics size-independent. Overall, across our range of rankings, THE has data on almost 7,000 institutions representing a vast and diverse cross section of the global sector.
9. We listen to our community and continue to innovate
Like the dynamic sector itself, we never stand still. The world rankings published in September will be the 20th annual edition, but also the third variant of our methodology. While we recognise the importance of continuity for longitudinal analysis, THE’s data team is always on top of the latest data sources and analytical techniques and aware of feedback from the international community. As well as keeping the world rankings formula up to date we have also innovated with new metrics covering a huge range of missions beyond the world rankings – from the student experience to social mobility to sustainability.
10. We know that universities are much, much more than their rank
The data that underpins our portfolio of rankings covers a vast array of vital university activities: teaching, research, knowledge transfer, financial health, reputation, inclusion, student outcomes, social and economic impact, and sustainability. It offers a sophisticated set of benchmarks and insights that can be tailored to suit multiple missions in many different contexts. But we also fully understand that data alone can never capture all the extraordinary ways that universities contribute to our world.
That’s why, alongside our rankings and data, we bring together our international community at multiple summits and events across the world each year, for face-to-face discussion and debate on the role, responsibilities and direction of the world’s universities. It is also why we have created the open, peer-to-peer THE Campus community for sharing resources and why we are committed to building on more than 50 years of outstanding journalism to provide the analysis, interpretation and insights that the sector needs alongside, and beyond, the data.
Phil Baty is chief global affairs officer at Times Higher Education. Participate in our rankings.