Assessors must blow their cover

November 3, 1995

As HEFCE's first assessment is published, Geoffrey Alderman argues that stage two is shrouded in secrecy and Ian Howarth suggests an alternative.

Whatever else may be said about the audit and assessment industries, there can be no denying that their growth has stimulated an era of glasnost so far as the measurement of student performance is concerned. Not so long ago the proceedings of the degree classification board were a closely guarded secret. Examination candidates knew little if anything about the criteria against which they were assessed or even the process of assessment.

Today the situation is very different. Students will typically have a right to know the marks they obtained for each component of their degree course, the criteria which the examiners have used in arriving at those grades, and the way in which each grade has contributed to the final award.

In relation to the Government inspection of higher education institutions, through the quality assessment procedures authorised by the 1992 Further and Higher Education Act, it might be thought that universities and colleges, too, have the right to know how they are assessed, the criteria against which they are assessed, and the manner in which the final assessment outcome is arrived at. But this right, though conceded in principle, is in practice heavily curtailed.


When the Higher Education Funding Council for England launched its original assessment methodology, involving an in-house evaluation of departmental self-assessments and (if they were submitted) claims for excellence, institutions were denied access to a scoring template used internally by the HEFCE to decide whether a claim for excellence had substance, or whether a self-assessment indicated provision "at risk". The final page of that template, on which the in-house assessor listed areas which ought to be scrutinised should an assessment visit take place, constituted an agenda, so to speak, for the visit itself. The final page was made available to the reporting assessor in charge of the visit, but not, apparently, to the visiting assessment team, and certainly not to the institution.

Earlier this year claims for excellence were abandoned, as the HEFCE ditched the old methodology in favour of "universal visiting", and replaced the qualitative judgements - unsatisfactory, satisfactory and excellent - by a system of "profiles" - each graded from one to four points - relating to six core areas of provision: curriculum; teaching; student progression; student support; learning resources; and quality assurance and enhancement. At the same time, much more detailed guidance was given on the drafting of self-assessments.


Institutions were told that self-assessments would be evaluated in-house by the HEFCE, and that this written analysis would be used "for the purpose of setting the priorities for and commencing the planning of the visit". But the instructions to reporting assessors paint a rather different picture. The old template has been replaced by a new one, which states that it is intended "to provide the reporting assessor with a framework for the preliminary visit and, for the assessment visit proper" (my emphasis).

Indeed, to judge from the replies given to me by HEFCE-trained subject assessors, not even they - who make up the assessment teams - seem to be aware of the contents or purpose of the revised template. Why not? And why has the new template not been made available to institutions?

The Assessors' Handbook, published by the HEFCE last February, tells us that in reading a self-assessment, the members of an assessment team will judge its contents according to 14 criteria, such as "Are the teaching methods described?" But if, and only if, you are a HEFCE-trained assessor, you will know that in fact you are asked to use a 15th criterion, namely: "Is it likely that the broad aims stated by the institution are achievable, given the stated objectives, the description of the resources and the quality of the student intake?" This additional criterion appears in the training literature, but not in the handbook. Why not?

Under the old HEFCE assessment methodology, assessors were enjoined to give confidential feedback to teachers on classes they had observed. Additionally, they were advised that they should not volunteer to disclose to the teacher the grade awarded in respect of any class observation, though if the teacher insisted, the grade had to be revealed. Under the new methodology, assessors are being instructed to give the grade only "if requested" by the teacher. If a university told its students that their grades would only be revealed if this was specifically requested, the institution would be held up to ridicule - and rightly so. So why is this approach being adopted by the HEFCE?


While institutions ponder these questions, they will no doubt subject to critical scrutiny the two editions of the handbook which the HEFCE has so far issued for use with the new methodology. In June a revised version of the February handbook was published. A covering letter from the HEFCE merely advised that the hypothetical assessment report included as an appendix "does not precisely conform to the guidelines set out earlier nor to the guidelines set out in HEFCE circular 39/94".

"Does not precisely conform"? In fact, there are substantial differences between the old and new versions, the most important relating to the aims and objectives of the sample report. In February, the HEFCE seemed clear that departmental aims and objectives comprised a general statement of goals. But by June - by which time institutions had already submitted their first self-assessments under the new methodology - this view had been radically altered. It now seems that "aims" means the aims of the programme, while "objectives" pertain to intended learning outcomes. If a university altered the aims and objectives of a degree programme in this manner, with the programme already under way, it would be convicted by its academic auditors and condemned by the first assessment team unfortunate enough to cross its threshold.

In September the HEFCE issued its forward programme of assessment visits, designed to ensure that every subject taught by the taxpayer-funded higher education sector will have been assessed by the end of December 2001. The Government has made it clear that it wants to see funding linked to assessment outcomes. It is important, therefore, not merely that the methodology is as right as we can get it, but that it is as transparent as possible. This means, I would suggest, not simply that the mechanics of the system are made very public, but that institutions are kept fully informed about the progress of the assessment cycle. This information is made available to members of the HEFCE's quality assessment committee. Why is it not more widely publicised?

From the outcomes of the first dozen assessment visits under the new methodology, I am, I have to confess, heartened by the evident fact that assessors are by no means reluctant to give maximum points for particular areas of provision - except for quality assurance and enhancement, which it would appear can never be perfect. I can just about understand why the HEFCE is unwilling to identify individual institutions, but see no harm in information about assessment outcomes being periodically issued to the sector in an anonymised form. Why is the HEFCE so coy?


It may come as a surprise to some members of the HEFCE's quality assessment division to know that I am a fan of assessment. The taxpayer has a right to know what he or she is getting for his or her money. Besides, there can be no denying that assessment has concentrated the minds of university managers on the learning process in a dramatic fashion - and a good thing too.

But I do ask the HEFCE to apply to us the same utter transparency of process which they would expect us to offer our students. Is that really too much to ask?


Geoffrey Alderman is head of the academic and quality assurance unit at Middlesex University.
