
NSS results unrelated to teaching quality, study claims

Research by University of Oxford academics provides evidence that student satisfaction scores are unconnected to exam performance
September 17, 2015
Exam results of trainee doctors have raised questions over the usefulness of student satisfaction scores when trying to determine teaching quality (Source: iStock)

Student satisfaction scores should not be used to measure teaching quality because they have no discernible link with exam performance, a leading University of Oxford academic has claimed.

While results from the National Student Survey are likely to be used as a key indicator in the government's proposed teaching excellence framework, a study by Tim Lancaster, director of clinical studies at Oxford, concludes that they have "little or no value as a quality metric".

Dr Lancaster compared the NSS results of 28 UK medical schools with the average pass rates achieved by their students in exams sat by all trainee doctors two years after graduation.

There was no correlation between good results in the NSS and performance in the exams set by the General Medical Council, according to the study "Assessing the quality of UK medical schools: what is the validity of student satisfaction ratings as an outcome measure?", which is due to be published shortly by Dr Lancaster.


When medical schools were ranked by success according to both the NSS results and their pass rates, based on average scores achieved between 2008 and 2014, 13 schools did better on examinations than they did in the NSS and 14 did worse, the study says.

Only one medical school performed strongly on both measures, coming top in both the NSS and the pass-rate rankings, according to the study, undertaken with Tom Fanshawe, a statistician at Oxford's Nuffield Department of Primary Care Health Sciences.


"One medical school with one of the highest NSS scores had the worst exam results," Dr Lancaster, a GP and fellow at St Anne's College, told Times Higher Education.

"NSS scores can still be relevant because they provide information about whether your students are happy or not, but [they do] not appear to correlate with teaching quality," he added.

His study did suggest, though, that there is a strong correlation between institutions' average entry scores for students, in terms of exams such as A levels, and subsequent medical assessment pass rates.

The study had wider significance for academia because medicine was one of the few subjects where a standardised "common national exam" enabled comparisons between students at different universities, Dr Lancaster said.


With universities minister Jo Johnson currently drawing up plans for the TEF, which will allow some institutions to raise tuition fees from 2017-18, the study's findings are likely to stoke debate over which metrics should be included.

One of the NSS's architects, Paul Ramsden, former chief executive of the Higher Education Academy, has argued that the survey's scores are a "proxy for learning gain" because students who report better experiences gain better degrees, even controlling for entry scores.

But critics say that the NSS has fuelled grade inflation because students return positive results to lecturers who hand out higher marks.

Dr Lancaster concludes that if "the overriding measure of the success of a medical school is its ability to graduate competent doctors, the NSS appears to have little or no value as a quality metric for the teaching excellence framework".


jack.grove@tesglobal.com

<ÁñÁ«ÊÓƵ class="pane-title"> Related universities
<ÁñÁ«ÊÓƵ class="pane-title"> Reader's comments (3)
There is a huge problem with this article: it treats teaching quality and exam performance as if they were synonymous, when in reality the relationship between the two is much more complicated.
@ Marge Bolton. I think that the article's intent was to challenge the assumption that the "survey's scores are a proxy for learning gain", and I believe it does quite a good job in this respect. It seems obvious to me that if students do well in exams they are more likely to be "satisfied". Then, if the NSS is considered a proxy for teaching quality, most universities will tend to inflate grades in order to perform well in quality assessment exercises. I must say, as a lecturer I have often experienced grade inflation and all other kinds of adjustments/rescaling. Everyone who has taken part in exam board meetings and related moderation exercises knows this well (most academics will never admit it, but we all know that this is how the system works...)
Although any attempt to fully understand the meaning of NSS scores is useful to an extent, a more helpful correlation would be between the exam scores of individual students and their NSS scores. Aggregation at departmental level makes this study a very blunt instrument with which to critique the relationship between quality and satisfaction. I should also (pedantically but importantly) point out that students are not asked to state their level of agreement on a numeric scale of 1-5. The NSS uses an ordinal scale, not a linear scale. Therefore the study uses numerically calculated means inappropriately. This is a familiar trap when working with student survey datasets.
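A minimal sketch of the point in the comment above, using invented Likert-style responses purely for illustration (nothing here comes from the study itself): the arithmetic mean treats the 1-5 labels as equally spaced measurements, while the median respects only their ordering, so two very different response patterns can share the same mean.

```python
from statistics import mean, median

# Hypothetical responses on a 1-5 Likert scale
# (1 = "definitely disagree" ... 5 = "definitely agree").
# The labels are ordered categories, not true measurements.
cohort_a = [5, 5, 5, 1, 1]  # polarised cohort
cohort_b = [4, 3, 4, 3, 3]  # uniformly lukewarm cohort

for name, scores in [("A", cohort_a), ("B", cohort_b)]:
    print(f"Cohort {name}: mean={mean(scores):.2f}, median={median(scores)}")

# Output:
#   Cohort A: mean=3.40, median=5
#   Cohort B: mean=3.40, median=3
# Identical means hide opposite response patterns, which is the
# commenter's objection to averaging ordinal satisfaction scores.
```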
<ÁñÁ«ÊÓƵ class="pane-title"> Sponsored
<ÁñÁ«ÊÓƵ class="pane-title"> Featured jobs