ICO neuters UK police use of facial recognition technology


Live facial recognition (LFR) deployments by police forces across the UK have been dealt a blow after the UK's data regulator confirmed that the technology's use falls under the umbrella of data protection law.

Amid ongoing rows between law enforcement agencies and digital rights campaigners, the Information Commissioner's Office (ICO) has clarified that LFR "is a potential threat to privacy" and a "high priority area".

The data regulator has advised police forces using facial recognition to carry out a full data protection impact assessment (DPIA), which must then be updated for each deployment. This is due to the sensitive nature of the processing, the volume of people affected, and the intrusion that can arise.

Forces must then submit these assessments to the ICO for consideration before the two parties discuss how the privacy risks can be mitigated. Any violations will be adjudicated under the General Data Protection Regulation (GDPR) and the Data Protection Act (DPA) 2018.

"We understand the purpose is to catch criminals," the Information Commissioner Elizabeth Denham said in a blog post. "But these trials also represent the widespread processing of biometric data of thousands of people as they go about their daily lives.

"I believe that there needs to be demonstrable evidence that the technology is necessary, proportionate and effective considering the invasiveness of LFR."

A small number of police forces, including the Met Police and South Wales Police, have been trialling facial recognition in public spaces for more than a year. But the effectiveness of the software has come under heavy scrutiny, particularly after research found the failure rate can be as high as 98%.

Several reports pointing to high inaccuracy rates, as well as concerns about legality, led the ICO to launch a probe into ongoing trials towards the end of last year. This investigation is ongoing and is also pending the outcome of a legal challenge against South Wales Police made in May.

"Legitimate aims have been identified for the use of LFR," Denham continued, adding the ICO has learned a lot from its deep dive into how LFR works in practice.

"But there remain significant privacy and data protection issues that must be addressed, and I remain deeply concerned about the rollout of this technology."

The ICO argues that LFR differs significantly from CCTV, and that facial recognition systems have yet to resolve the potential for racial bias, whereby more false positive matches are generated for people from certain ethnic groups.

In addition to full DPIAs, police forces must produce a bespoke 'appropriate policy document' setting out why, where, when and how LFR is being used. They must also ensure the algorithms within the software do not treat the race or sex of individuals unfairly.

This aspect of deployments, however, may be beyond the reach of individual police forces given the software powering the technology is developed by a third-party vendor.

Use of such technology in the criminal justice system, particularly artificial intelligence (AI) and facial recognition technology, has come under fire for bias and discrimination in both the UK and the US.

For example, according to one report, the Met Police's counterparts in New York were abusing facial recognition technology to arrest people when CCTV images were too unclear to identify suspects.

In extreme cases, New York Police Department (NYPD) officers used high-resolution pictures of a suspect's celebrity doppelgänger to generate a match against a licence database.

Keumars Afifi-Sabet
Features Editor

Keumars Afifi-Sabet is a writer and editor who specialises in the public sector, cyber security, and cloud computing. He first joined ITPro as a staff writer in April 2018 and eventually became its Features Editor. Although a regular contributor to other tech sites in the past, these days you will find Keumars on LiveScience, where he runs its Technology section.