
Apple shifts stance on CSAM scanning following widespread criticism

The tech giant will now only flag images that have been supplied by clearinghouses in multiple countries

Apple has provided further details concerning its child sexual abuse material (CSAM) scanning technology in its fourth follow-up briefing since its initial announcement ten days ago.

The tech giant will now only flag images that have been supplied by clearinghouses in multiple countries, rather than solely by the US National Center for Missing and Exploited Children (NCMEC), as originally announced.

In a change of stance, Apple has also publicly defined the number of identified CSAM images required before law enforcement can be alerted. The tech giant announced that 30 matches will be needed to trigger a human review which, if it confirms the matches are genuine, will lead to authorities being notified about the presence of CSAM in a person's iCloud library.

“We expect to choose an initial match threshold of 30 images,” Apple said in a Security Threat Model Review published late last week.

“Since this initial threshold contains a drastic safety margin reflecting a worst-case assumption about real-world performance, we may change the threshold after continued empirical evaluation of NeuralHash false positive rates – but the match threshold will never be lower than what is required to produce a one-in-one trillion false positive rate for any given account.”
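To see why a 30-match threshold produces such a large safety margin, consider a rough back-of-the-envelope model, which is not Apple's actual analysis: if each scanned photo has a small, independent chance of falsely matching the hash list, the number of false matches across a library is approximately Poisson-distributed, and the odds of an innocent account ever accumulating 30 matches collapse. The per-image false match rate and library size in the Python sketch below are illustrative assumptions, not figures published by Apple.

from math import exp, factorial

def poisson_tail(k, lam, extra_terms=50):
    # P(X >= k) for X ~ Poisson(lam), summing the upper tail directly
    # so the tiny probability is not lost to floating-point cancellation.
    term = exp(-lam) * lam ** k / factorial(k)   # first tail term, i = k
    total = 0.0
    for i in range(k, k + extra_terms):
        total += term
        term *= lam / (i + 1)                    # Poisson recurrence for the next term
    return total

threshold = 30        # Apple's stated initial match threshold
p_false = 1e-6        # ASSUMED per-image false match rate, for illustration only
n_photos = 100_000    # ASSUMED size of a large iCloud photo library

lam = n_photos * p_false   # expected number of false matches in the library
print(f"P(innocent account reaches {threshold} matches) ~ {poisson_tail(threshold, lam):.2e}")

Under these deliberately generous assumptions, the account-level false positive rate comes out dozens of orders of magnitude below the one-in-one-trillion target, which illustrates the "drastic safety margin" Apple describes.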

Since Apple first announced the technology on 6 August, it has garnered substantial criticism from customers, privacy advocates, and even Apple's own employees.

Last week, it was reported that the tech giant's internal Slack channel had been flooded with more than 800 complaints about the technology, with many arguing that the move would undermine Apple's reputation for respecting privacy. Others defended the technology, which ultimately aims to protect minors and help bring child sexual abuse offenders to justice.

Privacy advocates have criticised the tech giant for deciding to roll out technology that could potentially be abused by authoritarian states to silence political opponents, journalists, and human rights campaigners. Apple responded by maintaining that the technology would not scan users' iCloud uploads for anything other than CSAM, adding that it would reject governmental requests to "add non-CSAM images to the hash list".
