A project to measure the learning gain of students at English universities using standardised tests has been scrapped after it failed to persuade enough students to participate.
The early closure of the National Mixed Methodology Learning Gain project by the Office for Students makes it the latest in a series of global attempts to use a cross-disciplinary exam to measure undergraduate progress that have failed to get off the ground.
First announced in 2015, the NMMLG formed a key part of the now-defunct Higher Education Funding Council for England’s response to concerns that degree classifications were an insufficient indicator of learners’ progress.
It was hoped that 27,000 undergraduates at nine universities would participate in the pilot, answering questions that tested their critical thinking and problem-solving skills in an online multiple-choice exam at four stages during their university career. The results would have been used alongside survey results to measure students’ development during their time in higher education.
But the project soon ran into problems with student recruitment, and the OfS scrapped it after taking it on from Hefce. The low response rate meant that there would not have been enough data for “meaningful and robust” analysis, the regulator said.
The closure of the NMMLG follows the collapse in 2015 of the Assessment of Higher Education Learning Outcomes (Ahelo) project, the Organisation for Economic Cooperation and Development’s bid to compare student achievement globally using a standardised test. Key sectors including the UK refused to take part.
A similar European Union project, known as Measuring and Comparing Achievements of Learning Outcomes in Higher Education in Europe, or Calohee, is yet to have a significant impact.
And Brazil’s attempts to use standardised tests to measure learning gain have been plagued by low student participation. One study revealed that 118,000 students had skipped a single exam and that, of the 469,000 who did take it, one in 10 did not answer any multiple-choice questions and nearly one in three did not attempt the written questions.
The OfS is now expected to focus its efforts on the other learning gain pilot projects, involving 70 higher education institutions, which are using tests, surveys, analysis of grades and qualitative methods to try to measure learning gain.
Camille Kandiko Howson, senior lecturer in higher education at King’s College London, who will carry out an evaluation of all the learning gain pilots, told Times Higher Education that national tests were unlikely to provide an accurate picture of learning gain.
“There are such differences across subjects and institutional types that I don’t think we’d ever get something that would be useful to compare in that way,” she said. “There is a future for learning gain, but at a more nuanced level.”
It was hoped that the results of a nationally administered standardised test could have been used by students to demonstrate their ability to employers, and by institutions to identify which teaching methods work best.
But the NMMLG might have been hampered by concerns that test results could have been used as a metric in the teaching excellence framework, and hence as a means of increasing university tuition fees. Similar concerns triggered a boycott of the National Student Survey at many English campuses in 2017.
Dr Kandiko Howson said that learning gain assessments needed “sufficient buy-in from front-line staff to encourage students and explain to them the importance of taking part in learning gain studies”.
“The NMMLG project showed that it’s not really feasible to do it as a centralised method, sending students some emails and thinking that can get the job done,” she said.
Sonia Ilie, a senior research fellow at the University of Cambridge’s Faculty of Education, which is running one of the other learning gain pilots, agreed that it was critical to make the rationale for the study clear to students. “Learning gain assessments work best when they are embedded in the curriculum, when there is a clear sense of why and how it benefits students’ learning,” she said.
Dr Ilie’s project, which also uses a mix of survey and test questions but can be completed online in just over 20 minutes, has so far indicated high levels of reliability, but “scalability is always an issue”, she added.
Hamish Coates, a professor in Tsinghua University’s Institute of Education who led the OECD’s feasibility study on Ahelo, said that the failure to get the NMMLG off the ground was not a surprise.
Researchers were using outdated methodological approaches such as surveys and questionnaires, when “every human being is now walking around with a phone in their pocket collecting billions of data points”, he argued.
Measuring learning gain needs “a coordinated political and leadership approach”, according to Professor Coates. “It needs universities to line up with government and students to say we’re not happy with the fact that people are graduating and we don’t know what they’ve learned,” he said.
Print headline: Standardised tests for learning gain fail to cut it