Impact Rankings 2023: methodology

The Times Higher Education Impact Rankings measure global universities' success in delivering the United Nations' Sustainable Development Goals. Here, we explain how we arrived at the results.
May 25, 2023
Browse the full Impact Rankings 2023 results. To participate in next year's Impact Rankings, email us.

The Times Higher Education Impact Rankings are the only global performance tables that assess universities against the United Nations' Sustainable Development Goals. We use carefully calibrated indicators to provide comprehensive and balanced comparisons across four broad areas: research, stewardship, outreach and teaching.

Definitions of areas

Research: the most obvious and traditional way that a university might help to deliver the SDGs is by producing research on relevant topics.

Stewardship: universities are custodians of significant resources; not just physical resources, but also their employees, faculty and students. How they act as stewards is one of the key factors in delivering the SDGs.

Outreach: place is critical in higher education, and the work that universities do with their local, regional, national and international communities is another key way that they can have an impact on sustainability.

Teaching: teaching plays a critical role, both in ensuring that there are enough skilled practitioners to deliver on the SDGs, and in making sure that all alumni take forward the key lessons of sustainability into their future careers.

Which SDGs are included?

There are 17 UN SDGs, and we evaluate university performance on all of them.

Universities can submit data on as many of these SDGs as they are able. Each SDG has a series of metrics that are used to evaluate the performance of the university in that SDG.

Any university that provides data on SDG 17 and at least three other SDGs is included in the overall ranking.
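This inclusion rule can be expressed as a simple check. The function name and data shape below are illustrative only, not THE's actual implementation:

```python
def eligible_for_overall_ranking(submitted_sdgs: set[int]) -> bool:
    """A university enters the overall ranking if it submits data on
    SDG 17 plus at least three of the other 16 SDGs.
    Illustrative sketch, not THE's actual code."""
    return 17 in submitted_sdgs and len(submitted_sdgs - {17}) >= 3

# SDG 17 plus SDGs 3, 4 and 5 qualifies; a submission without SDG 17 does not
print(eligible_for_overall_ranking({17, 3, 4, 5}))  # True
print(eligible_for_overall_ranking({3, 4, 5, 6}))   # False
```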

As well as the overall ranking, we also publish the results of each individual SDG in 17 separate tables.

How is the ranking created?

A university's total score in a given year is calculated by combining its score in SDG 17 with its best three results on the remaining 16 SDGs. SDG 17 accounts for 22 per cent of the total score, while the other SDGs each carry a weighting of 26 per cent. This means that different universities are scored based on a different set of SDGs, depending on their focus. The score for the overall ranking is an average of the last two years' total scores.
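The weighting described above (22 per cent for SDG 17, 26 per cent for each of the best three other SDGs) can be sketched as follows. The function and the example scores are illustrative assumptions, and the inputs are assumed to be already-scaled SDG scores:

```python
def total_score(sdg17_score: float, other_scores: dict[int, float]) -> float:
    """Combine SDG 17 (22%) with the best three of the remaining SDGs
    (26% each). Scores are assumed already scaled to 0-100.
    The published overall score is then an average of the last two
    years' total scores. Illustrative sketch, not THE's actual code."""
    best_three = sorted(other_scores.values(), reverse=True)[:3]
    return 0.22 * sdg17_score + sum(0.26 * s for s in best_three)

# Worked example: SDGs 4 and 11 are dropped; only the best three count
print(round(total_score(80.0, {3: 90.0, 4: 70.0, 7: 85.0, 11: 60.0}), 2))
# 0.22*80 + 0.26*(90 + 85 + 70) = 17.6 + 63.7 = 81.3
```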

The score from each SDG is scaled so that the highest score in each SDG in the overall calculation is 100 and the lowest score is 0. This is to adjust for minor differences in the scoring range in each SDG and to ensure that universities are treated equitably whichever SDGs they have provided data for. It is these scaled scores that we use to determine which SDGs a university has performed in most strongly; they may not be the SDGs in which the university is ranked highest or has scored highest based on unscaled scores.
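This scaling is a standard min-max normalisation across all universities scored in a given SDG. A minimal sketch, with invented university labels and raw scores:

```python
def scale_scores(raw: dict[str, float]) -> dict[str, float]:
    """Min-max scale raw SDG scores across universities so the highest
    becomes 100 and the lowest 0, as described in the methodology.
    Illustrative sketch with invented data."""
    lo, hi = min(raw.values()), max(raw.values())
    span = hi - lo
    return {uni: 100.0 * (score - lo) / span for uni, score in raw.items()}

scaled = scale_scores({"A": 55.0, "B": 85.0, "C": 70.0})
print(scaled)  # {'A': 0.0, 'B': 100.0, 'C': 50.0}
```

Note how a raw score midway between the extremes (university C) lands at exactly 50 after scaling, regardless of the SDG's original scoring range.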

The metrics for the 17 SDGs are included on their individual methodology pages.

Scoring within an SDG

There are three categories of metrics within each SDG:

Research metrics are derived from data supplied by Elsevier. For each SDG, a specific query has been created that narrows the scope of the metric to publications relevant to that SDG. This is supplemented by additional publications identified by artificial intelligence. As with the World University Rankings, we are using a five-year window between 2017 and 2021. The only exception is the metric on patents that cite research under SDG 9, which relates to the time frame in which the patents were published rather than the time frame of the research itself. The metrics chosen for the bibliometrics differ by SDG and there are always at least two bibliometric measures used.

Continuous metrics measure contributions to impact that vary continuously across a range (for example, the number of graduates with a health-related degree). These are usually normalised to the size of the institution.
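Size normalisation typically means expressing a count relative to the institution's scale, for example per 1,000 students. The function and figures below are an illustrative assumption, not THE's actual normalisation:

```python
def per_capita(metric_value: float, num_students: float,
               per: float = 1000.0) -> float:
    """Illustrative size normalisation: express a continuous metric
    (e.g. health-related graduates) per 1,000 students, so large and
    small institutions can be compared fairly."""
    return metric_value * per / num_students

# 450 health-related graduates at a 30,000-student university
print(per_capita(450, 30000))  # 15.0 graduates per 1,000 students
```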

When we ask about policies and initiatives (for example, the existence of mentoring programmes), our metrics require universities to provide the evidence to support their claims. In these cases, we give credit for the evidence, and for the evidence being public. These metrics are not usually size normalised.

Evidence is evaluated against a set of criteria, and decisions are cross-validated where there is uncertainty. Evidence need not be exhaustive: we are looking for examples that demonstrate best practice at the institutions concerned.

Time frame

In general, the data used refer to the closest academic year to January to December 2021. The date range for each metric is specified in the full methodology document.

Exclusions

The ranking is open to any university that teaches at either undergraduate or postgraduate level. Although research activities form part of the methodology, there is no minimum research requirement for participation.

THE reserves the right to exclude universities that it believes have falsified data, or are no longer in good standing.

Data collection

Institutions provide and sign off their institutional data for use in the rankings. On the rare occasions when a particular data point is not provided, we enter a value of zero.

The methodology was developed in conjunction with our partners and Elsevier, and after consultation and input from individual universities, academics and sector groups.

View the full methodology for the THE Impact Rankings 2023.

Reader's comments (5)
Hello. If a university implements ESG metrics and receives an ESG rating, will this help it rank better in the Times in achieving the UN Sustainable Development Goals?
In the Impact Rankings, we ask questions relating directly to the UN's SDG targets, tailored specifically for higher education institutions. Some ESG metrics may not correlate with this. You can review our methodology for more information on what we request: https://the-ranking.s3.eu-west-1.amazonaws.com/IMPACT/IMPACT2024/THE.ImpactRankings.METHODOLOGY.2024.pdf
Hello. There is some difference/inconsistency between scores in the 'printed' version of the Impact Rankings (in tables for the top 1,000 universities) and the online data. [I can indicate which scores in particular differ for 3 universities from Latvia] Which data are more accurate and could be used for reference? I would prefer the 'printed' ones, since they give exact values instead of the interval available in the online data.
Hello. Both should be accurate. Please note this section of the methodology: In the overall table, the score from each SDG is scaled so that the highest score in each SDG in the overall calculation is 100 and the lowest score is 0. This is to adjust for minor differences in the scoring range in each SDG and to ensure that universities are treated equitably whichever SDGs they have provided data for. It is these scaled scores that we use to determine which SDGs a university has performed in most strongly; they may not be the SDGs in which the university is ranked highest or has scored highest based on unscaled scores. However, if there is a discrepancy then the online version should be used for reference as occasionally there are corrections. Please send over the differences to ellie.bothwell@timeshighereducation.com and I will get back to you.
I am writing to enquire about the rationale behind the range of scores provided, rather than a net score. This approach makes it difficult for us to analyse our own university's performance in the results. It would be more useful if you could provide the Institution ID for each institution. If only we could rate institutions based on IDs, like researcher and Scopus IDs, it would be much easier. -- Necmettin Erbakan University Big Data Office