It is now just over a year since we first formally launched the idea of our Times Higher Education University Impact Rankings. Although we had been discussing ideas related to it for a number of years, in particular around measuring innovation, it took us a long time to get to the stage where we felt that we had a framework that could, potentially, work.
And so, at the THE Young Universities Summit 2018, hosted by the University of South Florida, we presented our draft.
At the time, I expected that this might be a relatively niche ranking; perhaps we would gather enough data for a top 100, and even if only 50 institutions had submitted data, the exercise would still have been worthwhile.
In the end, 561 universities across 75 countries participated, of which 467 met the criteria for inclusion in the overall ranking (but all were included in at least one of the individual SDG tables).
Designing the ranking
A key challenge was to design metrics to showcase how universities were working towards the United Nations’ Sustainable Development Goals. While the 17 goals encompass 169 targets and 232 indicators, they contain very few direct mentions of universities.
Working from an underpinning theory of change – how could a university work towards the SDGs? – we created a framework with three elements: research, stewardship and outreach.
For research, we worked with Elsevier to identify a series of metrics and to devise a keyword query for each SDG.
Universities are stewards of significant resources, both physical and human, and in this area we looked for indicators that institutions were matching their assets to the goals.
Finally, metrics around outreach had to recognise the institution’s place in its local community, region or nation.
Making it fair – or as fair as we could
There was always a risk that we would build something that replicated existing norms – that large, usually Western, universities would be the only ones with the capacity to fare well in the rankings.
As a result, we decided on the following conditions:
- We would allow universities to participate even if they only submitted data on a small number of SDGs chosen by the institution
- We wouldn’t prioritise one SDG over another
- We would welcome entries from as wide a selection of universities as possible.
We also decided to reduce the emphasis on bibliometrics, from 38.5 per cent in the THE World University Rankings to 27 per cent.
The result
We saw significant variation in the SDGs for which institutions submitted data, most notably by geography. This vindicated our decision to allow universities to enter by submitting data on only a small number of SDGs.
We also saw universities that do not usually feature in rankings participating and performing well.
What happens next?
Although I feel positive about the initial iteration of the ranking, it is only a first version. We know that there is room for improvement. Some of the things that we will be working on for the next editions include:
- Designing metrics for the remaining six SDGs that were not covered in the first edition
- Making it easier for universities to provide data and evidence, including creating guides in multiple languages
- Examining the results from the first round to see which metrics worked well and which did not perform as we expected
- Consulting with universities and organisations around the world to identify opportunities for improvements
- Reaching out to universities that were unable to participate in the first year
- Providing insights from the data we have collected.
We will consult with institutions this summer, and we hope to produce an early draft of the new methodology soon after. This will enable us to formally present next year’s methodology at the THE World Academic Summit in September, before opening data collection in October.
The second THE University Impact Rankings will be launched at the THE Innovation and Impact Summit at Sweden’s KTH Royal Institute of Technology in April 2020.
Duncan Ross is chief data officer at Times Higher Education.