UK police fail ethical tests with "unlawful" facial recognition deployments

A University of Cambridge team audited UK police use of the tech and found frequent ethical and legal shortcomings

A team of researchers from the University of Cambridge has called for UK police to be banned from using facial recognition in public spaces, claiming that police deployments of the tech to date have not met “minimum ethical and legal standards”.

Researchers analysed three deployments of facial recognition technology (FRT) by UK police - two by South Wales Police and one by the Metropolitan Police Service (MPS) - and found that every deployment risked infringing human rights and fell short of ethical and legal standards.

A group from the Minderoo Centre for Technology and Democracy created an audit tool to assess the deployments against a number of legal standards, as well as measures of technical reliability, human decision-making, and expertise.

Legal standards included the Data Protection Act (DPA) 2018, Human Rights Act 1998, and Equality Act 2010.

The study's authors have recommended that others apply their audit tool to further instances of police FRT. This could build a body of evidence to inform potential litigation and support efforts to prevent police use of FRT in public spaces.

“There is a lack of robust redress mechanisms for individuals and communities harmed by police deployments of the technology,” said the lead author Evani Radiya-Dixit, who is a visiting fellow at Cambridge’s Minderoo Centre.

“To protect human rights and improve accountability in how technology is used, we must ask what values we want to embed in technology.”

The report also concluded that UK police have consistently failed to consult the public - and marginalised communities in particular - about the use of FRT, and have not published transparent data on its use to allow for independent scrutiny.

Use of live facial recognition (LFR) by the MPS between August 2016 and February 2019, one of the three case studies assessed by the authors, was highlighted as a particular example of this lack of transparency.

"While MPS published some demographic data in their results, they did not record the demographic breakdown for engagements, stop and searches, arrests, and other outcomes resulting from the use of LFR,” read the report.

“This makes it hard to evaluate whether LFR perpetuates racial profiling. There was also no published evaluation of racial or gender bias in the technology. MPS conducted an internal evaluation but did not disclose the results. This lack of transparency makes it difficult for outside stakeholders to assess the comprehensiveness of the evaluation."

The report’s authors cited research that has shown FRT to perform noticeably worse on marginalised groups. They stated that under the Public Sector Equality Duty of the Equality Act 2010, police are required to acknowledge that live facial recognition carries bias against people of colour, women, and people with disabilities.

"We would have welcomed the opportunity to have spoken to the researchers of this report and to share with them our latest data, legal framework and the arrests made as a result of live facial recognition operations," an MPS spokesperson told IT Pro.

"Between August 2016 and February 2019, we trialled LFR in different situations, to allow us to learn and progress / develop its use before launching it in January 2020, with publicly available documents covering legal, data and equality issues on our website.

"We continue to review and assess our use of this important tool, including its demographic performance."

Proportionality was also raised as an issue, with South Wales Police found to have retained custody images on a facial recognition watchlist without clear limits on the seriousness of the offences committed by those included.

The authors argued that the watchlist was used disproportionately in combination with operator-initiated facial recognition (typically carried out on a mobile phone), and that it also included unlawfully retained images of innocent individuals who were arrested but never convicted.

South Wales Police told IT Pro that a Court of Appeal judgement in 2020 highlighted some areas that needed improvement, and the force has begun trialling experimental deployments to test the proposed mitigations. The results of these trials will be known later this year, a spokesperson said.

“The legal challenge to our use of this ground-breaking technology through the courts has enabled us to ensure that its use is responsible, proportionate and fair,” said assistant chief constable Mark Travis.

“The whole aim of using facial recognition technology is to keep the public safe and assist us in identifying serious offenders in order to protect our communities from individuals who pose significant risks.

“I believe the public will continue to support our use of all the available methods and technology to keep them safe, providing what we do is legitimate and proportionate.”

Police use of facial recognition has become a frequent target of rights campaigners, with live facial recognition proving particularly controversial.

The Ada Lovelace Institute published a report in June calling for new regulation of biometric data processing in the public and private sectors, and recommended the creation of an independent Biometrics Ethics Board to which public bodies would have to justify their use of biometric technology.

Private companies have also drawn complaints from within government over their use of facial recognition tech. In July, a cross-party group of MPs signed a letter calling for a ban on two prominent Chinese CCTV firms' use of facial recognition, stating that the widespread use of their equipment in public buildings constituted a national security risk.

This article was updated to include comments from the Metropolitan Police Service and South Wales Police.
