
THE Impact Rankings 2020: methodology

The Times Higher Education Impact Rankings measure global universities' success in delivering the United Nations' Sustainable Development Goals. Here, we explain how we arrived at the results
April 17, 2020

Browse the full Impact Rankings 2020 results


The Times Higher Education Impact Rankings are the only global performance tables that assess universities against the United Nations' Sustainable Development Goals (SDGs). We use carefully calibrated indicators to provide comprehensive and balanced comparisons across four broad areas: research, stewardship, outreach and teaching.

Definitions of areas

Research: the most obvious and traditional way that a university might help to deliver the SDGs is by producing research on relevant topics.

Stewardship: universities are custodians of significant resources: not just physical resources, but also their employees, faculty and students. How they act as stewards of these resources is one of the key factors in delivering the SDGs.

Outreach: place is critical in higher education, and the work that universities do with their local, regional, national and international communities is another key way that they can have an impact on sustainability.


Teaching: teaching plays a critical role, both in ensuring that there are enough skilled practitioners to deliver on the SDGs, and in making sure that all alumni take forward the key lessons of sustainability into their future careers.

Which SDGs are included?

There are 17 UN SDGs, and we are evaluating university performance on all of them in the second edition of the ranking. Each SDG has its own specific methodology page:

  • SDG 1: no poverty
  • SDG 2: zero hunger
  • SDG 3: good health and well-being
  • SDG 4: quality education
  • SDG 5: gender equality
  • SDG 6: clean water and sanitation
  • SDG 7: affordable and clean energy
  • SDG 8: decent work and economic growth
  • SDG 9: industry, innovation and infrastructure
  • SDG 10: reduced inequalities
  • SDG 11: sustainable cities and communities
  • SDG 12: responsible consumption and production
  • SDG 13: climate action
  • SDG 14: life below water
  • SDG 15: life on land
  • SDG 16: peace, justice and strong institutions
  • SDG 17: partnerships for the goals

Universities can submit data on as many of these SDGs as they are able. Each SDG has a series of metrics that are used to evaluate the performance of the university on that SDG.

Any university that provides data on SDG 17 and at least three other SDGs is included in the overall ranking.

As well as the overall ranking, we also publish the results of each individual SDG in 17 separate tables.

How is the ranking created?

A university's final score in the overall table is calculated by combining its score on SDG 17 with its top three scores across the remaining 16 SDGs. SDG 17 accounts for 22 per cent of the overall score, while each of the three other SDGs carries a weight of 26 per cent. This means that different universities are scored based on different sets of SDGs, depending on their focus.
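For illustration only, that weighting could be expressed along the following lines. This is a minimal sketch in Python with invented function names and example scores; it is not THE's actual implementation, and it assumes the per-SDG scores have already been scaled as described below.

```python
# Minimal sketch (not THE's code): combine SDG 17 with the top three other SDG scores.
def overall_score(sdg17_score, other_sdg_scores):
    """Weight SDG 17 at 22 per cent and each of the best three remaining SDGs at 26 per cent."""
    top_three = sorted(other_sdg_scores, reverse=True)[:3]
    return 0.22 * sdg17_score + 0.26 * sum(top_three)

# Example: a university scoring 80 on SDG 17 and 90, 85, 70 and 60 on four other SDGs.
print(round(overall_score(80, [90, 85, 70, 60]), 1))  # 0.22*80 + 0.26*(90+85+70) = 81.3
```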

The score from each SDG is scaled so that the highest score in each SDG in the overall calculation is 100. This is to adjust for minor differences in the scoring range in each SDG and to ensure that universities are treated equitably, whichever SDGs they have provided data for. It is these scaled scores that we use to determine which SDGs a university has performed most strongly in; they may not be the SDGs in which the university is ranked highest or has scored highest based on unscaled scores.
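Again purely as an illustration, the scaling step might look something like the sketch below. The data layout (a simple mapping of raw scores per university) is our own assumption, not THE's data model.

```python
# Minimal sketch (not THE's code): rescale one SDG's scores so the top score becomes 100.
def scale_sdg_scores(raw_scores):
    """raw_scores maps university name -> raw score for a single SDG."""
    top = max(raw_scores.values())
    return {uni: 100 * score / top for uni, score in raw_scores.items()}

raw = {"University A": 72.0, "University B": 90.0, "University C": 54.0}
print(scale_sdg_scores(raw))
# {'University A': 80.0, 'University B': 100.0, 'University C': 60.0}
```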

The metrics for the 17 SDGs are included on their individual methodology pages.


Scoring within an SDG

There are three categories of metrics within each SDG:

Research metrics are derived from data supplied by Elsevier. For each SDG, a specific query has been created to narrow the scope of the metric to papers relevant to that SDG. As with the World University Rankings, we use a five-year publication window, from 2014 to 2018. The only exception is the metric on patents citing research under SDG 9, which relates to the timeframe in which the patents were published rather than the timeframe of the research itself. The bibliometric metrics chosen differ by SDG, but at least two bibliometric measures are always used.


Continuous metrics measure contributions to impact that vary continuously across a range, such as the number of graduates with a health-related degree. These are usually normalised to the size of the institution.
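As a hedged illustration of size normalisation (the figures and field names here are invented, not taken from THE's data):

```python
# Minimal sketch (not THE's code): express a continuous metric relative to institution size.
def per_capita(metric_count, institution_size):
    """Return a count (e.g. health-related graduates) as a proportion of institution size."""
    return metric_count / institution_size if institution_size else 0.0

print(per_capita(1250, 20000))  # 0.0625, i.e. 6.25 per cent of the institution's graduates
```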

When we ask about policies and initiatives, such as the existence of mentoring programmes, our metrics require universities to provide evidence to support their claims. In these cases, we give credit both for the evidence itself and for the evidence being publicly available. These metrics are not usually size-normalised.

Evidence is evaluated against a set of criteria, and decisions are cross-validated where there is uncertainty. Evidence is not required to be exhaustive; we are looking for examples that demonstrate best practice at the institutions concerned.

Timeframe

Unless otherwise stated, the data used refer to the academic year that falls closest to the January to December 2018 calendar year.

Exclusions

Universities must teach undergraduates and be validated by a recognised accreditation body to be included in the ranking.

Data collection

Institutions provide and sign off their institutional data for use in the rankings. On the rare occasions when a particular data point is not provided, we enter a value of zero.
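As a minimal sketch of that missing-data rule (our own illustration, with invented field names):

```python
# Minimal sketch (not THE's pipeline): any data point an institution did not provide counts as zero.
submitted = {"health_graduates": 1250, "research_income": None}
cleaned = {field: (value if value is not None else 0) for field, value in submitted.items()}
print(cleaned)  # {'health_graduates': 1250, 'research_income': 0}
```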


The methodology was developed in conjunction with our partners and Elsevier, and after consultation and input from individual universities, academics and sector groups.

Reader's comments (8)
Given the existence of gender and ethnic inequalities in many universities across the world, it is necessary to make the following adjustments to the SDGs for future rankings: a) replace Reduced Inequalities with Reduced Ethnic Inequalities; b) change Gender Inequalities to Reduced Gender Inequalities; c) change Peace to Reduced Institutional Conflicts and put this as a separate SDG; d) put Justice and Institutional Strength as separate SDGs because the indicators of each are completely different. Given the negative impact of gender and ethnic inequalities on health and wellbeing, such universities should be placed at the lower end of the rankings for SDG 3 (Health and Wellbeing) and SDG 16 (Peace and Justice). Also, future university evaluations for ranking purposes should be conducted by different teams of researchers each year; the same team should not conduct consecutive evaluations.
In the Impact Rankings Masterclass on April 22nd 2020, it was mentioned that there were four aspects of the Theory of Change: Research, Teaching, Stewardship and Outreach. Here it is mentioned that there are three categories of metrics. Can you please explain?
Thanks for your feedback. We have now updated this page and the four areas are listed at the top.
I've heard at other meetings that teaching is hard to integrate and doing so requires more work.
Can you please provide an update on the inclusion of Teaching (and associated methodological changes if any) for the next year's rankings across all SDGs but also specifically for SDG-11?
Hi, this page doesn't actually define what each of research, outreach and stewardship entails. Also, as two other commentators above stated, it looks like these areas have been redefined as Research, Teaching, Stewardship and Outreach. Is this correct? If so, how is each defined? Thanks!
Thanks for your feedback. We have now updated this page and the four areas and definitions are listed at the top.
The UN SDGs are very important naturally, but they are not the sole preserve of universities. There are many other players and factors that are present. Are we in danger of making universities lose sight of their primary functions (which can include SDG 4 of course) over which it is they who have the most control, in their pursuit of ticking all the boxes to get higher in the university rankings?