Apple has confirmed it will hold off on its plans to scan iPhones for Child Sexual Abuse Material (CSAM). The decision comes after backlash from security researchers, privacy advocates and many Apple users.
Last month, the company announced plans to expand protections for children, giving itself the ability to scan user photos for evidence of known CSAM.
The technology, developed in collaboration with child safety experts, would let the company detect known CSAM stored in users’ iCloud Photos libraries. The company planned to perform on-device matching against a database of known CSAM image hashes provided by the National Center for Missing and Exploited Children (NCMEC) and other child safety organizations.
Having already met backlash upon announcing the technology, Apple published two detailed technical papers explaining how CSAM detection would work, and the company stressed that the feature provides significant privacy benefits over existing techniques. Its reasoning: a human reviewer would only look at a user’s actual photos if the user’s iCloud Photos account contained a collection of known CSAM and the number of hash matches crossed a set threshold.
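The threshold idea can be sketched in a few lines of code. This is purely illustrative and is not Apple’s implementation: the real system uses a perceptual hash (NeuralHash) and cryptographic threshold secret sharing so that no one can inspect anything until the threshold is crossed. Every name and the threshold value below are hypothetical.

```python
import hashlib

# Hypothetical threshold; Apple's actual system keeps matches
# cryptographically hidden until its own threshold is reached.
MATCH_THRESHOLD = 3


def image_hash(data: bytes) -> str:
    """Stand-in for a perceptual hash; plain SHA-256 for illustration."""
    return hashlib.sha256(data).hexdigest()


def count_matches(photos: list[bytes], known_hashes: set[str]) -> int:
    """Count how many photos match the known-hash database."""
    return sum(1 for photo in photos if image_hash(photo) in known_hashes)


def flag_for_review(photos: list[bytes], known_hashes: set[str]) -> bool:
    """Escalate to human review only once matches reach the threshold."""
    return count_matches(photos, known_hashes) >= MATCH_THRESHOLD
```

A single stray match never triggers review in this sketch; only an accumulation of matches past the threshold does, which is the privacy property Apple emphasized.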
Advocacy groups and cybersecurity analysts attacked the decision, fearing it would give governments greater leeway to monitor users.
Apple recently gave a statement to several media outlets confirming it is delaying the feature’s launch.
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” the company said.
“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Apple originally planned to deploy the feature as part of the iOS 15 launch this year.