The Secret Teacher: ‘Our State exams are far too open to tweaking and manipulation’

Students deserve total transparency: what they achieve should be the result they get

Shane burst into class late, with no sign of an apology: he was too full of the joys of having passed his driving test.

“Next stop, the Leaving Cert,” he declared, rubbing his hands together. I smiled inwardly at his naivety.

When Shane sat the driver theory test he knew precisely how many questions he needed to answer correctly in order to succeed.

They were randomly selected from a bank of questions, and all equally weighted. For the practical exam, weightings are accorded to the different aspects of safe driving, and within each aspect there are different grades of fault, corresponding to how serious and how dangerous the infringement is. There is, therefore, total transparency about how many faults of each type are permitted before a candidate is deemed “unpassable”.


A bonus of having done well in Leaving Cert French is that I can help my niece with hers; she sat the final “old” Junior Cert French exam last June. Section I of the listening test involves listening to three short and relatively simple conversations. Five options are provided, and candidates must select from them to identify the context of each short piece. This task is one of five in a test worth a total of 140 marks.

In 2016, that three-part question was awarded just 10 marks in total; the breakdown was given as 4, 3 and 3.

In 2010, the three parts were worth 32 marks in total – more than three times as many.

In 2008, the approach was to award 18 marks in total, but not in equal parts: 12 marks were awarded for getting one element correct, 15 for getting two and the full complement of 18 for getting all three.

The tasks vary little, yet the value of the same knowledge fluctuates: an element worth three marks in 2016 was worth 12 – four times as much – in 2008.

Recruitment struggle for correctors

There is a recruitment struggle for correctors, and we know that high turnover of staff indicates dissatisfaction. Those recruited know the remuneration package and the July timeframe in advance, so it’s clearly what emerges ‘on the job’ that turns them away.

To turn teachers away from an opportunity to see the fruits of their efforts is quite an achievement. Any examiner who has marked a paper diligently and conscientiously will always be extremely reluctant to downgrade the mark on instruction from “above”.

Setting assessment tasks is a serious business, and best practice involves an exam candidate having an insight into the success criteria – ie, what success looks like.

Currently, exams set by the State Examinations Commission (SEC) are far too open to tweaking and manipulation: the candidate sits the exam without knowing the marking scheme, and the examiner launches into the corrections knowing that revisions to that scheme are a near certainty.

The SEC’s failure to have teachers vying for the role of corrector is critical here. Imagine how effective our school-leaving assessment would be if there were a greater overlap between those doing the groundwork in the classrooms and those receiving the exam scripts for correction in the summer.

Appropriate training

It really is as simple as investing in appropriate training in the relevant skillsets – what is known as upskilling – which in many countries and lines of work is used to better the product and the subsequent yield. The Irish education system urgently requires this for both those setting and those correcting high-stakes exams.

Perhaps the crisis isn’t acute enough: there will always be enough people who need the money, so the job will always get done. Still, if beggars can’t be choosers, what guarantee have we of the actual standard of work of those recruited once the SEC becomes desperate for correctors?

Logically, when there aren’t enough people available for the work, those who are appointed must do more. Imagine knowing that the individual charged with grading your performance in an exam you have spent two years preparing for is an overworked, last-resort recruit. None of this is intended to offend those entirely capable of carrying out the task to a high standard – the process itself is flawed, and while it remains so it is doing a disservice to all correctors.

Daring to dream big, I wonder whether we could also train people to set exams and their marking schemes together, so that both correspond to the appropriate standard.

I genuinely dream of the day when candidates sit before papers which reveal precisely how low or high the stakes are for every single question and sub-question. Imagine what a valuable resource a set of exams of a consistent standard could be.

Make a mockery

Right now, SEC past papers make a mockery of the content learned; what is a student to gain from discovering that the knowledge and skills they acquire vary in value from year to year? What currency fluctuates in that way? A stable currency is entirely dependable – and the Road Safety Authority (RSA) trades in one.

Shane didn’t have to wait to find out if his poor positioning on the road was a minor or major fault. He knew going in that an inability to perform a hill start would count heavily against him. His tester didn’t need to have a conference call with the other RSA test centres to work out quotas for success and failure which each centre would then strive to meet.

Put simply, Shane’s performance in his driving test wasn’t subject to the demands of any bell curve. With the RSA, it truly is a case of total transparency: what you achieve on the day is precisely the result you get. And so, just as he had with the theory test, Shane walked away immediately after the exam knowing whether he had passed or failed, and precisely why. I wish him the best of luck with his Leaving Cert.