
National Student Survey changes: satisfaction out, free speech in

Overall satisfaction question to go in shake-up of influential poll, regulator confirms
28 July 2022

Students in England will no longer be asked how satisfied they were with their course, the regulator has confirmed as part of its proposed changes to the nation’s biggest higher education survey.

The Office for Students has outlined how it thinks the influential National Student Survey should operate from 2023 to ensure that it “remains fit for purpose”.

Ideas put out for consultation include adding questions on freedom of expression and mental well-being, as well as moves to “ask more direct questions generally” and a shortening of the window in which the annual census of final-year undergraduate students is run.

But the removal of Question 27, which looks at overall satisfaction, could prove to be the most significant change. The plan from the OfS – which also coordinates the NSS on behalf of regulators in the devolved nations – would see this summative question removed entirely in England and amended to lose the word “satisfaction” in Scotland, Wales and Northern Ireland, where regulators had expressed a desire to keep it.

The change is needed because of concerns that the question detracts from the wider findings of the survey and is too consumerist in nature, according to a consultation report that said there had been “significant concerns” about the wording and “no consensus on what should replace it”.

Question 27 could therefore be dropped just as the most recent iterations of the survey – conducted while the major review was being carried out – recorded that overall satisfaction has slipped to historically low levels, mostly because of the fallout from the pandemic.

The NSS has been run annually since 2005 and attracts about 300,000 responses a year. Its findings are used by regulators when assessing the quality of degree courses, while those institutions that have done well feature the results prominently on marketing materials.

The OfS said including questions on key topics such as free speech would allow the survey to reflect the growing importance of these issues on campuses and to track their impact on students.

The consultation includes an example of what a free speech question could look like, asking “During your studies, how free did you feel to express your ideas, opinions and beliefs?”, with options in response of “very free”, “fairly free”, “not very free”, “not at all free” and “this does not apply to me”.

The sector has been under pressure from the government to protect universities from “cancel culture”, with legislation currently working its way through Parliament.

Vice-chancellors have hit back at suggestions that free speech is under threat, although a recent survey conducted by the Higher Education Policy Institute found that students today appear to be less tolerant than former generations.

The changes to how the questions are worded would move away from a standard “agree/disagree” Likert scale. For example, rather than being asked to what extent they agreed with the sentence “the criteria used in marking have been clear in advance”, students would be asked “how clear were the marking criteria used to assess your work?” with a scale ranging from “very clear” to “not at all clear”.

Conor Ryan, director of external relations at the OfS, said the NSS needed to keep up with the changing nature of learning and teaching, adding that the proposed new survey “will help identify trends and provide a consistent measure of students’ academic experience”.

A consultation on the proposals runs for five weeks from 27 July, and Mr Ryan urged students and providers to get involved in the process “to ensure the questions we ask remain meaningful and reflect the most important aspects of higher education”.

“The NSS is a vital tool that informs regulation and providers’ decision-making. This review will ensure it continues to stand the test of time,” he added.

tom.williams@timeshighereducation.com

Reader's comments (5)
How relevant is the NSS anyway? Students are regularly surveyed at a module level to get feedback that can be assessed and actually utilised in improving each module, but this very 'high level' survey is well-nigh useless to the average academic as they work on next year's teaching materials... apart from being asked to sit through dull meetings analysing the NSS to very little effect.
The issue is that students are not encouraged within the survey to qualify their answers. They should be asked to explain their responses so that academics can use the feedback more meaningfully to shape and improve delivery. Too many of the questions are subjective. For example, when asked about feedback and assessment, students often base their answers on how well they did or didn't do. I worry that a question about free speech will similarly be misinterpreted.
I didn't realise Toby Young got the job in the end.
Analysing NSS results is pointless not only because their veracity cannot be verified (students lie, misinterpret, forget) but also because the differences in NSS points are often within measurement/sampling error. The NSS is a useless exercise that only serves to create unwarranted and often harmful changes to educational policies in HE.
Any survey such as the NSS gives respondents a voice, and that’s never a bad thing. But in general, the NSS serves little purpose in its current form. The free speech item is simply about fomenting culture wars through a different means. The survey period distracts students at a critical time, when their attention lies elsewhere. Furthermore, it takes students a year or two out of university to accurately reflect upon the value, or otherwise, of their university experiences, once they are using the knowledge and skills they gained in the ‘real’ world. It’s time for a rethink to truly capture the depth and breadth of university life. We often use mid-semester evaluations; they usually generate high response rates, tend to show less polarised views and produce meaningful answers we can use to improve the student experience more rapidly. A continuous performance improvement approach is a better alternative for improving the quality of students’ university experience. I have in the past found myself changing course mid-module, with more success, as a result of regularly capturing students’ views.