Leader: Only the best for the best

Our world rankings are hugely influential but also come under criticism every year, so we have decided to improve them.
November 5, 2009

When we published our sixth annual world rankings last month, we received more than 1 million visits to our website in just one day and made headlines around the world, from New York to New Zealand. We also received a fair amount of criticism, as we have done since their launch in 2004. One of the very first critics said: "All the talk about 'the best universities' sounds more like selling soap powder. The rankings gloss over the fact that if different measures were used, or with different weightings, different results would be obtained."

Then as now, there is no denying that accusation, and we are well aware that rankings can only ever be a crude measure of what universities do. But whatever you think of them, they are here and they are here to stay.

Global rankings have always been used by students to choose where to study, by staff to look at career opportunities and by research teams seeking new collaborative partners. They also help us to analyse and highlight trends and developments in a rapidly changing higher education landscape worldwide. But in recent years they have become extraordinarily influential, used by institutions to benchmark themselves against global competitors and even by governments to set their national higher education agendas.

The responsibility weighs heavy on our shoulders. We are very much aware that national policy and multimillion-pound decisions are influenced by these rankings. We are also acutely aware of the criticisms made of the methodology. Therefore, we feel we have a duty to improve how we compile them.

To this end, the Times Higher Education editorial board met recently to discuss the problem. Two main flaws in the current rankings were identified. First, the survey of academic opinion that makes up 40 per cent of the overall score was deemed too small: this year it was based on fewer than 4,000 responses from around the world, which, when aggregated with previous years' results, produces a total of 9,386. Second, there was concern about our use of paper citations to measure research quality.

The board felt that, in future, any survey of academic opinion needed to be more substantial, and that when using citations to measure quality, the very different volume of citations in different subject areas should be taken into account, so that, for example, world-leading institutions without medical schools are not disproportionately hit.

With these criticisms in mind, we will work with our new data partner, Thomson Reuters, to produce a more rigorous and transparent ranking for 2010 and beyond. We will seek further advice from our editorial board and solicit the views of international rankings experts and our readers to develop a suitable methodology for the sector. And this week, Phil Baty, deputy editor of Times Higher Education and editor of the rankings, met experts in Shanghai at the Third International Conference on World-Class Universities to discuss the way forward.

Higher education is global. Times Higher Education is determined to reflect that. Rankings are here to stay. But we believe universities deserve a rigorous, robust and transparent set of rankings - a serious tool for the sector, not just an annual curiosity.

ann.mroz@tsleducation.com
