Facial recognition technology is "dangerously inaccurate"
Police stored innocent people's biometric data after mistaking them for criminals, says privacy group
The Metropolitan Police's use of facial recognition is misidentifying innocent people as wanted criminals more than nine times out of 10, according to a privacy campaign group.
Civil liberties organisation Big Brother Watch published its findings into the Met's use of facial recognition technology in a report that it is set to present to Parliament later today.
The Met uses the technology to match people's faces, captured via CCTV and other cameras, against computer databases of criminals, and has deployed it at numerous events with very little success, according to the report, titled 'Face Off: The lawless growth of facial recognition in UK policing'.
It claims the Met's system has a failure rate of 98%, and that during last year's Notting Hill Carnival police misidentified 95 people as criminals. The Met admitted that, as a result of using facial recognition, it has stored 102 innocent people's biometric data for 30 days. Despite this, the force is planning seven more deployments this year.
"Real-time facial recognition is a dangerously authoritarian surveillance tool that could fundamentally change policing in the UK. Members of the public could be tracked, located and identified, or misidentified, everywhere they go," said Silkie Carlo, director of Big Brother Watch.
"It is deeply disturbing and undemocratic that police are using technology that is almost entirely inaccurate, that they have no legal power for, and that poses a major risk to our freedoms."
Facial recognition is also used by South Wales Police, where 91% of the system's matches were inaccurate, despite the Home Office providing £2.6 million in funding for the technology.
South Wales Police not only misidentified 2,400 innocent people with facial recognition, but also stored these people's biometric data, without their knowledge, for a year.
"This has wasted millions in public money and the cost to our civil liberties is to high. It must be dropped," added Carlo.
The campaign has been backed by a number of rights and race equality groups, such as Article 19, Index on Censorship, Liberty, Netpol and the Race Equality Foundation. Shadow home secretary Diane Abbott and shadow policing minister Louise Haigh will speak about the campaign in Parliament this afternoon.
IT Pro has approached both the Met and South Wales Police for comment.
Picture credit: Shutterstock
Bobby Hellard is ITPro's Reviews Editor and has worked on CloudPro and ChannelPro since 2018. In his time at ITPro, Bobby has covered stories for all the major technology companies, such as Apple, Microsoft, Amazon and Facebook, and regularly attends industry-leading events such as AWS Re:Invent and Google Cloud Next.
Bobby mainly covers hardware reviews, but you will also recognize him as the face of many of our video reviews of laptops and smartphones.