The A-level results crisis has once again shown the limits of AI

Let’s be clear: artificial intelligence (AI) and its offspring are wonderful things. They can recognise patterns faster than humans can, meaning things like security threats can be detected more quickly and accurately. They can be used to recognise hand gestures, which could have a positive impact on surgery, for example. In self-driving cars they can – theoretically at least – detect hazards and react to them more quickly than humans.

But when it comes to making decisions about people’s lives, AI is often the worst possible tool to throw at the problem, as this year’s A-level results fiasco (and, undoubtedly, the GCSE results fiasco, when they’re released on 20 August) shows.

This isn’t news to anyone, either – or at least it shouldn’t be. Algorithms have repeatedly determined that black people, especially black men, are criminals. And that’s if they can even differentiate between one black person and another. Asian people, meanwhile, have been branded terrorists. If you’re white, however, then please come this way, sir or madam, as you’re clearly virtuous thanks to your skin colour – or so a now-shelved Home Office AI system determined.

While the algorithm being used to determine this year’s exam results isn’t racially biased, it’s defective in multiple other ways and is ruining people’s lives.

As revealed in tweets from BBC Newsnight’s policy editor, Lewis Goodall, the AI system being used to decide students’ grades has penalised outstanding students based on the performance of past students, sometimes to an extraordinary extent. These stories include a student predicted a B or C who was instead awarded a U by the algorithm, thanks to an apparently faulty decision process. Another student, predicted AAA by his school and A*A*A* by UCAS, was given BBB. And there are many, many more.
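To see how results like these can happen, consider a deliberately simplified sketch of the kind of standardisation approach that was reported: rank-order a school’s students, then map them onto the school’s historical grade distribution. The names, numbers and code below are invented for illustration – this is not Ofqual’s actual algorithm – but the structural flaw is the same: an outstanding candidate at a school that has never produced an A* simply cannot be awarded one.

```python
# Illustrative sketch only: a simplified stand-in for the kind of
# standardisation model reported, NOT Ofqual's actual algorithm.
# It ranks a school's students and maps them onto the school's
# historical grade distribution, ignoring individual predictions.

GRADES = ["A*", "A", "B", "C", "D", "E", "U"]  # best to worst

def standardise(students, historical_share):
    """students: list of (name, teacher_rank), rank 1 = strongest.
    historical_share: fraction of past students per grade, in the
    same order as GRADES. Returns {name: awarded_grade}."""
    ranked = sorted(students, key=lambda s: s[1])
    n = len(ranked)
    awarded = {}
    i = 0
    for grade, share in zip(GRADES, historical_share):
        quota = round(share * n)  # seats available at this grade
        for _ in range(quota):
            if i < n:
                awarded[ranked[i][0]] = grade
                i += 1
    while i < n:  # rounding leftovers fall to the bottom grade
        awarded[ranked[i][0]] = GRADES[-1]
        i += 1
    return awarded

# A school whose past cohorts earned no A*s: even its strongest
# candidate this year cannot get one, whatever they were predicted.
students = [("Asha", 1), ("Ben", 2), ("Cerys", 3), ("Dan", 4)]
history = [0.0, 0.25, 0.25, 0.25, 0.25, 0.0, 0.0]
print(standardise(students, history))
# {'Asha': 'A', 'Ben': 'B', 'Cerys': 'C', 'Dan': 'D'}
```

However the real model weighted its inputs, the structural point stands: the ceiling on a student’s grade comes from their school’s past, not from their own work.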

Of course, every year there are some students who are disappointed by their final grades, but this is not the same. This year’s students are effectively being penalised for not having taken the exam at all, which they were unable to do through no fault of their own.

As much as some people may declare that A-level, GCSE and Scottish Higher results don’t matter, the reality is that they do. They matter emotionally in the moment to the children who have taken them, and they matter because they are a stepping stone to the next stage of their education or to entering the workforce.

I can’t shake the feeling that the UK government, and governments elsewhere, have fallen in love with a technology they don’t fully understand. But as those blinded by love so often do, they’re sticking by the object of their affection, despite clear evidence that it’s fundamentally broken.

As I’ve said before, technology isn’t magic. In fact, it can plunge us into a dystopian world where our fate is determined by a faulty AI that pays no regard to our own character or achievements. Instead, a black box algorithm pulls in data about where we live, what school we went to and the colour of our skin.

There’s no sci-fi hero coming to save us from the misuse of technology, though – it’s up to us as a society to push back against it, before it’s too late.

Jane McCallion
Deputy Editor

Jane McCallion is ITPro's deputy editor, specialising in cloud computing, cyber security, data centres and enterprise IT infrastructure. Before becoming Deputy Editor, she held the role of Features Editor, managing a pool of freelance and internal writers, while continuing to specialise in enterprise IT infrastructure and business strategy.

Prior to joining ITPro, Jane was a freelance business journalist writing as both Jane McCallion and Jane Bordenave for titles such as European CEO, World Finance, and Business Excellence Magazine.