Leaving Cert: Why the Government deserves an F for algorithms

Net Results: Invisible code has a significant – and often negative – impact on all our lives

The problem is much deeper than one-off grading scandals. Photograph: Bryan O’Brien


In August, following the grading algorithm debacle in the UK, I wrote a column wondering whether that unfortunate event might prove a critical tipping point for public trust in the almighty algorithm.

This new generation, about to step into adulthood – and into an active role as the UK's future voters, employees, thinkers, campaigners and policymakers – was unlikely, I thought, to forget what it was like to be judged by biased computer code with invisible, baked-in assumptions.

Well. Here we are a few months later, and any sense of grade assessment superiority we might have had on this side of the Irish Sea has shrivelled into a husk.

We too have algorithmed our way into a grading conflagration, in which at least two coding errors meant more than 6,000 students received grades lower than they should have and, in a bleakly unfunny inversion, nearly 8,000 more got grades higher than a correctly functioning algorithm would have assigned.
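
To see how easily such a slip occurs, here is a deliberately simplified, hypothetical Python sketch – not the Department's actual code; the "best two prior subjects" rule and the numbers are invented for illustration – of one step a standardisation pipeline might contain:

# Hypothetical illustration only: one step of an imagined grading pipeline.

def best_two_average(prior_subject_scores):
    # Intended rule (invented for this example): average the student's
    # two BEST prior scores.
    top_two = sorted(prior_subject_scores)[-2:]
    return sum(top_two) / 2

def buggy_best_two_average(prior_subject_scores):
    # One wrong slice takes the two WEAKEST scores instead.
    bottom_two = sorted(prior_subject_scores)[:2]
    return sum(bottom_two) / 2

student = [55, 72, 90, 61]
print(best_two_average(student))        # 81.0
print(buggy_best_two_average(student))  # 58.0 – a 23-point swing

Run over tens of thousands of students, a one-line slip like this produces exactly the pattern we saw: plausible-looking grades that are quietly, systematically wrong.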

The Government has punted this problem over to the already stretched and cash-strapped third-level institutions, which are somehow supposed to make extra places available, as needed, for the students whose grades were wrongly lowered.

It isn't yet clear what happens to the students who may have lost places they rightfully earned to some portion of the 7,943 students who gained places on incorrectly inflated grades. Fairly analysing and resolving this mess is a challenge for the Department of Education, the institutions involved and the students affected.

In August I quoted experienced programmer and technology consultant Dermot Casey, who had expressed concern about what assessment factors might go into any proposed Leaving Cert algorithm here.

In last Saturday's Irish Times, he revisited the issue in an informative explanatory piece on algorithms, offering a detailed list of questions that now need to be asked about how the Irish algorithm was coded and stress-tested.

Larger concerns

As public money went into producing the algorithm, and so many people have been affected by its shortcomings, the Government must answer those questions.

But this imbroglio ultimately points towards even larger concerns than one year of Leaving Cert grading chaos.

The algorithm issue is huge, and it touches daily life. In microseconds, algorithms make determinations and decisions about nearly every aspect of our existence – social, political, health-related and ethical – from our (supposed) suitability for mortgages and jobs, to the medical care we receive, the ads and posts we see on social media and the results returned when we search on Google.

In nearly every case, we have absolutely no idea what determinations go into these algorithms. We do not know who coded them. We do not know how they work or how they judge. By their very nature – hidden lines of complex code, obscured by laws protecting business assets – they function invisibly. They are shielded as corporate proprietary information and "intellectual" property, even though it is our intellects they lay claim to, judging us by the data they gather, typically without our knowledge. That data then, ludicrously, becomes their property, not ours: whole chunks of us, some of it extremely revealing and sensitive, owned not by us but by them.

Algorithms have an impact on every one of us. But we see only the end-result decisions made by that code, not the human decisions, assumptions or biases that underlie it – in algorithms produced primarily by one small segment of society: young, mostly white men, often from elite universities.

Not neutral

We know from many studies that algorithms are not neutral. We know that until recently, if you searched Google Images for "Black women" or "Latino women", the majority of results were exploitative and pornographic (the same happened with ethnic searches for "boys"). That bias has since been adjusted, demonstrating how easily an algorithm can be tweaked – in the few cases where obvious bias is eventually spotted.

Unfortunately, many biases are so deeply embedded that it takes experts to reveal them, if they are revealed at all – as in the case of, say, medical software that prioritised white patients over black patients with worse symptoms, or the Amazon recruiting algorithm that down-ranked female candidates.
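
The Amazon case shows why such bias is so hard to see: even when no one codes in gender explicitly, a system trained on historical decisions learns it from proxies. Here is a toy, entirely hypothetical sketch – the CVs and scoring rule are invented, and this is not Amazon's system:

from collections import Counter

# Invented training data: past CVs and whether the candidate was hired.
past_cvs = [
    ("chess club captain", True),
    ("robotics team lead", True),
    ("women's chess club captain", False),
    ("women's coding society founder", False),
]

hired_words, rejected_words = Counter(), Counter()
for text, hired in past_cvs:
    (hired_words if hired else rejected_words).update(text.split())

def score(cv):
    # +1 for every past appearance of a word in hired CVs, -1 in rejected.
    return sum(hired_words[w] - rejected_words[w] for w in cv.split())

print(score("chess club captain"))          # 0
print(score("women's chess club captain"))  # -2: same CV, one extra word

Nothing in that code mentions gender, yet identical qualifications score lower the moment the word "women's" appears, because the historical data the scorer learned from was itself biased.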

We must fully understand the fallibility of algorithms and demand better from those who produce them.

The coding and bias errors in the Irish and UK grading algorithms this year will have made these issues clearer – in a frustrating, painful, life-affecting way – to many.

We can’t leave it at that, though. Our next step must be to push for laws that will require corporate algorithm transparency – as is beginning to happen now in the EU – because this isn’t just about one-off grading scandals.

This is about all of us, in thrall to obfuscating algorithms that judge us secretly and silently, with potentially life-changing impact, every single day.
