
NSS overhaul shows ‘dangerous’ drift towards graduate outcomes

Some criticisms of the survey may be justified, but there are major risks in a wholesale replacement, warn experts
September 14, 2020
National Student Survey 2018

A major overhaul of the UK’s National Student Survey is further evidence of a “dangerous” drift towards using graduate employment outcomes as the main measure of course quality, it has been warned.

As part of an announcement on cutting red tape, the Westminster government said that it had ordered the Office for Students to carry out a “radical, root and branch” review of the NSS because of “valid concerns” that it drove down standards, could be gamed by institutions and had results that did not “correlate well with other, more robust, measures of quality” such as employment outcomes.

Paul Ashwin, professor of higher education at Lancaster University, said that while “at one level” he could understand the criticism of how some universities have tried to manage their NSS performance, he had been “shocked” by some of the assertions made in the document without firm evidence.

He said that there seemed to be a determination to see course quality only in terms of graduate employment and salaries, which often related to other factors such as students’ social background or where they worked.


“It just seems to be a move to make graduate outcomes the main measure of quality in the sector and that is so dangerous when we know that doesn’t relate to quality,” he said.

Camille Kandiko Howson, associate professor of education at Imperial College London, said that although in her view criticisms about gaming were “completely justified”, relying on graduate outcome measures was also problematic.


“The criticism about the correlation with other metrics is less substantiated,” she said, adding that the government’s own graduate employment data showed that “there is a danger in relying on outcome measures that are largely based on student intake to judge quality in the sector”.

However, she added that although a radical change to the NSS would have a “huge impact”, given that it had become so “enmeshed” in the sector, that did not mean reform “cannot be for the better”, noting that the survey had hindered “innovation in student feedback, quality enhancement and quality assurance”.

Rachel Hewitt, director of policy and advocacy at the Higher Education Policy Institute, said one of her concerns was how student views would be properly accounted for without a wide-ranging survey.

There was also a risk that an NSS scaled down in sample size and frequency – which the government seemed to want – would be unable to provide data at a granular enough level.


“If you want to look…at NSS data at course level [in an] institution…you need to have quite a high sample size – if not a census of all students – in order to break it down at those levels,” she said.

simon.baker@timeshighereducation.com

POSTSCRIPT:

Print headline: NSS overhaul going in ‘dangerous’ direction

Reader's comments (4)
An NSS overhaul is well overdue - it has resulted in lower academic standards and grade inflation. Of course students want greater feedback and outline answers to make life easier. The consumer agenda undermines academic integrity.
Perhaps the best thing would be to reduce the importance that is placed on the NSS. It's a mere indicator, at best, of how students view their university experience... and at that a flawed one as it doesn't ask the right questions. Nothing about "How well has my course prepared me for what I want to do next?" for example.
Getting rid of the NSS won't get rid of the consumerist agenda - that remains in place as long as we have student fees and Consumer Protection Legislation, and both of those are greater drivers of the consumerist agenda - and replacing it as a measure of 'quality' (it has only ever measured satisfaction) with Graduate Outcomes and Continuation is objectively worse. The former narrowly redefines the purpose of HE as helping one find a good job (and is a measure driven primarily by social background) and the latter will push universities to keep students on at all costs (thereby harming academic standards). The only people celebrating this change are the ones who want a reduced HE sector to be about jobs alone, and the ones who haven't engaged with the detail. There's plenty to dislike about the NSS; these changes don't make things better (any more than the introduction of the OfS has made things better).
"¡°If you want to look¡­at NSS data at course level [in an] institution¡­you need to have quite high sample size ¨C if not a census of all students ¨C in order to break it down at those levels,¡± she said." The lack of validity of the NSS is not about sample size - it is about the lack of its construct validity; there is *zero* evidence that it measures teaching quality nor efficacy. Repeat after me.... zero empirical evidence (see a recent meta analysis by Uttl et al., 2017). Sure, use it to measure student experience - I am sure Disneyland measures their visitors' experience in the same way as well. After all, learning in HE is all about keeping our customers happy and entertained like a visit to Disneyland and the movies, right? It is amazing that after decades of research in assessments and testing, some people still advocate the use of a measure that has no documented evidence of its validity. Makes me wonder if there is some hidden personal agenda or advantage for these people (e.g., holds a leadership or administrative post that relies on the continued use of NSS?). Uttl, B., White, C. A., & Gonzalez, D. W. (2017). Meta-analysis of faculty's teaching effectiveness: Student evaluation of teaching ratings and student learning are not related. Studies in Educational Evaluation, 54, 22-42.