Faults in State exam marking need to be corrected

Convoluted process of altering marking schemes may disadvantage some students

Sorting and despatching Leaving Cert exam scripts. An internal research report shows it is possible some exam candidates are disadvantaged during the marking process. Photograph: Alan Betson

The State Examinations Commission (SEC) deserves great credit for the efficient way in which the high-stakes Leaving Cert examination process is managed each year.

Mistakes of any kind are rare and the procedures put in place to guarantee the anonymity of those taking and marking a set of standardised exams constitute a level of fairness that is difficult to replicate with most other forms of assessment.

A level of transparency is also achieved by publishing the final marking schemes and allowing candidates access to their marked scripts. These are key reasons why public support and confidence in the Leaving Cert process remains high.

That said, the procedures put in place to ensure consistency of standards from one year to the next are less well known and understood (the issue of consistency of standards across different subjects in any given Leaving Cert year is another story).

In the past, Freedom of Information requests about these procedures were denied by the Office of the Information Commissioner and, as a result, the extent to which the final published marking schemes in any given year differ from the original versions is unknown.

Consistent

What is clear is that, for most subjects, the percentage of candidates achieving the different grade levels remains consistent from year to year although exam papers differ.

Given the reasonable assumption that cohorts of Leaving Cert students do not differ greatly in ability from year to year, maintaining this consistency in standards across exam papers (or standard setting) is achieved in two ways.

First, great care is taken during the exam construction process to ensure individual papers do not differ too much in terms of difficulty and style from those administered previously. (This “requirement” has led to some observations that the Leaving Cert is very predictable, and it discourages the inclusion of innovative questions that tap higher-order thinking.)

Second, on occasions when particular subject exam papers are found to be too difficult or too easy following the initial marking of a sample of papers (for example, an unexpectedly low or high percentage achieving a H1), the original marking scheme is amended to ensure a “better” distribution of percentages across the grade categories.

Here is what it says on all final marking schemes published by the SEC:

“Marking schemes are working documents. While a draft marking scheme is prepared in advance of the examination, the scheme is not finalised until examiners have applied it to candidates’ work and the feedback from all examiners has been collated and considered in light of the full range of responses of candidates, the overall level of difficulty of the examination and the need to maintain consistency in standards from year to year. This published document contains the finalised scheme, as it was applied to all candidates’ work.”

Unusual

I am not sure how or why this practice began, but using a marking scheme in this way to ensure consistency of standards is unusual and Ireland is one of the few countries in the world where this happens.

Without access to SEC data, it is impossible to know how many marking schemes are altered each year and how many scripts are affected.

However, the internal SEC report published in The Irish Times this week shows it is possible that at least some Leaving Cert candidates are disadvantaged during the process.

For example, consider the following scenario. Following the marking of sample papers using the original marking scheme, student A gets 91 per cent, or a H1, and student B gets 89 per cent, or a H2.

In an effort to mitigate the effect of a relatively hard paper, the marking scheme is altered and some of the more difficult questions are awarded fewer marks.

This new marking scheme results in student A getting 89 per cent (H2) and student B getting 90 per cent (H1).

In other words, the rank ordering has changed and the grades awarded to the students have changed. Interestingly, neither student will know this has happened even if they view their scripts, as only the final marking scheme counts.
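The scenario above can be sketched in code. This is a hypothetical illustration with invented question weights and per-question scores, not the SEC's actual procedure: it simply shows how down-weighting a hard question can flip the rank order of a candidate who did well on it and a candidate who did well on the easier one.

```python
# Hypothetical illustration (not the SEC's actual method): re-weighting
# questions in a marking scheme can flip two candidates' rank order.

def percentage(fractions, weights):
    """Weighted percentage, given each candidate's fraction correct per question."""
    total = sum(weights.values())
    return 100 * sum(fractions[q] * weights[q] for q in weights) / total

def grade(pct):
    # Simplified single boundary: H1 at 90 per cent and above, else H2.
    return "H1" if pct >= 90 else "H2"

original = {"Q1": 50, "Q2": 50}       # draft scheme: equal weight
revised = {"Q1": 60, "Q2": 40}        # hard Q2 now carries fewer marks

student_a = {"Q1": 0.84, "Q2": 0.98}  # strong on the hard question
student_b = {"Q1": 0.96, "Q2": 0.82}  # strong on the easy question

for name, scores in [("A", student_a), ("B", student_b)]:
    before = percentage(scores, original)
    after = percentage(scores, revised)
    print(name, round(before, 1), grade(before), "->", round(after, 1), grade(after))
    # A: 91.0 H1 -> 89.6 H2
    # B: 89.0 H2 -> 90.4 H1
```

The marks are made up, but the mechanism is general: any re-weighting that is not uniform across questions can reorder candidates, and neither would see it by inspecting their script against the final scheme alone.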

The standard-setting process used for the Leaving Cert is convoluted, laborious and expensive.

Enormous pressure

It not only puts enormous pressure on the SEC to get the initial marking of sample papers done quickly (and even more so now in light of the faster turnaround following the Rebecca Carter case), but also intrudes on the valuable time available to ensure markers are trained to work accurately.

The suspicion that it may be undermining the validity of what a mark on an exam means and disadvantaging some students is worrying.

In a recent publication, an SEC official noted that there seemed to be little appetite among some key stakeholders for the use of more sophisticated standard-setting procedures.

This official is not alone in believing that while the current Leaving Cert procedures have served the Irish system reasonably well in the past, planned reform of the exam means that a more robust and transparent system needs to be put in place as a matter of urgency.

One solution would be to apply a mathematical transformation to the marks derived from the original marking scheme, converting them to scaled scores once all marking is complete. This would be a simpler and more practical way of ensuring consistency. More importantly, it may also be fairer to students.
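As an illustration only (the article does not specify which transformation is intended, and the SEC has published no such method), one simple candidate is a linear rescaling of raw marks so the cohort's mean and spread match fixed targets. The target values and marks below are invented; the point is that a monotone transformation preserves every candidate's rank order, unlike a mid-marking change to the scheme itself.

```python
# Hypothetical sketch of post-hoc linear scaling: adjust for a hard or
# easy paper AFTER all marking is complete, without reordering candidates.

from statistics import mean, pstdev

def linear_scale(raw_marks, target_mean=60.0, target_sd=15.0):
    """Map raw marks to scaled scores with the given mean and standard deviation."""
    m, s = mean(raw_marks), pstdev(raw_marks)
    return [target_mean + (x - m) * target_sd / s for x in raw_marks]

raw = [45, 52, 58, 63, 70, 77, 84, 91]  # invented raw marks for a small cohort
scaled = linear_scale(raw)

# The transform is monotone increasing, so rank order is preserved.
assert scaled == sorted(scaled)
```

Grade boundaries would then be set on the scaled scores. A real standard-setting system would need more than a linear transform (equating against anchor items, for instance), but even this sketch shows how consistency across years can be achieved without rewriting the marking scheme after a sample of scripts has been marked.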

Prof Michael O’Leary is director of the Centre for Assessment Research Policy and Practice at DCU’s Institute of Education.