Marking system is a learned curve

Since November, this year's exam students have been praised for displaying great stoicism in the face of the bitter ASTI dispute.

Robbed of valuable classroom hours and the calm atmosphere conducive to study, these 58,388 students have been variously described as "voiceless victims", "vulnerable teenagers" and "innocent bystanders" in a war between teachers and the Government.

These descriptions have been matched by heartfelt and almost unanimous sympathy among parents and the wider community. But sympathy is almost a valueless commodity in the mechanistic world of State exams.

When an examiner (technically known as an "assistant examiner") sits down to mark your paper in a few weeks, he or she may be filled with compassion towards students like you who have suffered, but that compassion will count for little.

This is because greater forces are at work than the efforts of one examiner.

The Leaving and Junior Cert exams have some basic assumptions behind them.

One is that they are standardised forms of assessment with relatively rigid marking schemes. This means there are rarely wild fluctuations in the level of marks awarded.

So you may feel yourself and your friends are special cases, but the numbers are against you. The marking of most subjects follows a predictable pattern each year, which incidentally may be no bad thing.

If the grades are allocated this year in roughly the same fashion as others, it means this year's cohort will not be disadvantaged.

However, it is important to remember that within the class of 2001 certain students - like those in TUI-controlled schools or those in privately run grind schools - may still have the edge over those who lost days because of the strike.

But leaving such intangibles aside, if you examine the grades awarded in most Leaving Cert subjects every year a familiar curved pattern emerges.

The line starts out low and flattened for A and B grades, rises in the middle for C and D grades and then flattens out again for fail grades. While the figures involved do change each year, the curved pattern rarely becomes radically contorted.

Even in subjects with a different pattern to the norm - like higher-level Irish, in which most students get an honour - the year-on-year distribution pattern alters only marginally.

It is astonishing how little the pattern changes in some subjects. Leaving Cert results for 2000 and 1999 illustrate this.

Looking through popular subjects, one can see little change in the number of honours grades awarded. For example, 66.7 per cent gained an honour in higher-level English in 1999, compared with 65.1 per cent last year - a tiny difference of 1.6 percentage points.

Last year 65.9 per cent gained an honour in ordinary-level maths, compared with 67.6 per cent the year before - a difference of 1.7 percentage points. In higher-level Irish last year, 80.1 per cent picked up an honour, compared with 82.3 per cent the year before - a difference of 2.2 percentage points.

It is clear from these figures that grade distribution changes little, at least in the short term, which is what this year's students care about.

In fact, in some subjects the patterns almost match perfectly. For example, in higher-level French 68.6 per cent gained an honours grade in both 1999 and 2000. Essentially the only difference in the distribution pattern was that slightly more students failed in 1999.
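For readers who want to check the sums, here is a minimal sketch in Python using only the figures quoted above (the subject labels and the helper are this writer's own arrangement for illustration, not Department output):

# Illustrative only: honours rates (per cent) quoted in this article,
# arranged by hand; not an official Department dataset.
honours_rate = {
    "Higher English": {1999: 66.7, 2000: 65.1},
    "Ordinary Maths": {1999: 67.6, 2000: 65.9},
    "Higher Irish": {1999: 82.3, 2000: 80.1},
    "Higher French": {1999: 68.6, 2000: 68.6},
}

def year_on_year_shift(rates):
    # Absolute change in the honours rate between 1999 and 2000.
    return abs(rates[2000] - rates[1999])

for subject, rates in honours_rate.items():
    print(f"{subject}: {year_on_year_shift(rates):.1f} point shift")

Run as written, it prints shifts of 1.6, 1.7, 2.2 and 0.0 points respectively - the same small movements described above.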

A Department of Education source said the patterns do not alter much over five- or 10-year periods, although greater differences can be observed over longer durations. But that is of little use to this year's strike victims.

These curved distribution patterns are the key to consistency in the marking of State exams. They are the last line of defence against potential irregularities in the marking process.

Chief examiners, who are responsible for marking in each subject, get anxious when they see a sudden shift in the pattern in a particular year. Or at least they should do.

This apparently rigid system is based on the belief that the academic ability of students remains relatively constant, at least over short periods, so there should be no sudden shifts in the grades.

More subtle changes over longer periods are a different matter and there is evidence to suggest that academic ability does - almost imperceptibly - change from decade to decade.

Before chief examiners come to study their charts, there is a raft of devices and arrangements which ensure that no group of students can buck the system, even if they have been the victims of a damaging strike. Each examiner going methodically through a batch of papers knows about the infamous curve in the subject. They know the grades they award should roughly match this pattern.

They also have to submit samples of their work to an "advising examiner" who checks to see they are marking fairly and according to the accepted standards.

The chief examiner, normally a Department inspector, is the final layer of security. He or she can also sample work, but is mainly there to ensure there is consistency in grade allocation.
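To illustrate the principle at work - and this is only a rough sketch, not the Department's actual procedure - a consistency check of this kind can be imagined as comparing this year's grade shares against last year's and flagging any band that moves sharply. The figures, grade bands and five-point threshold below are invented for illustration:

# Hypothetical illustration of a distribution consistency check; the data,
# grade bands and threshold are invented, not official procedure.
SHIFT_THRESHOLD = 5.0  # percentage points treated as a "sudden shift"

last_year = {"A": 10.2, "B": 22.5, "C": 33.1, "D": 25.4, "Fail": 8.8}
this_year = {"A": 11.0, "B": 21.9, "C": 32.6, "D": 26.1, "Fail": 8.4}

def flag_shifts(previous, current, threshold):
    # Return the grade bands whose share of candidates moved by more than the threshold.
    return [grade for grade in previous
            if abs(current[grade] - previous[grade]) > threshold]

flagged = flag_shifts(last_year, this_year, SHIFT_THRESHOLD)
if flagged:
    print("Review marking in bands:", ", ".join(flagged))
else:
    print("Distribution broadly matches last year's curve.")

In this invented example nothing moves by more than five points, so the curve is judged to be holding its shape.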

The system, however, does contain some injustices. While the pattern of grades is normally similar within subjects (though not always), it can be very different between subjects.

Students are often criticised for their perception that some subjects are marked more harshly than others.

However, as recent reports by the National Council for Curriculum and Assessment (NCCA) on student grade distributions have shown, there is some substance to this view.

The council undertook a longitudinal study of Junior and Leaving Cert students and found wide discrepancies between the grades allocated in different subjects.

For example, in 1997 you were almost five times more likely to get an A1 in higher-level accounting than in higher-level Irish.

The Points Commission report of 1999 asked the Department to study differences between subjects and a study is understood to be in the pipeline.