In 2021, Apple announced a suite of features meant to protect children from being sexually exploited online. Some run on-device, like parental controls that prevent child accounts from seeing or sending sexually explicit photos in Messages, but the most controversial measure was a system to scan photos as they were uploaded to iCloud.
The system was designed to protect privacy by comparing only hashes of images against the hashes of known CSAM (Child Sexual Abuse Material), rather than examining the photos themselves. Still, privacy advocates roundly criticized it as a system that could be exploited; state actors, for example, could force Apple to search for images of dissidents. Some child safety experts also argued the system wasn't robust enough, since it could only match images from a known database and could not catch newly created CSAM.
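To make the mechanism concrete, here is a minimal sketch of that kind of hash matching. It is not Apple's actual implementation: the real system used a perceptual hash (NeuralHash) plus cryptographic safeguards, while this sketch uses an ordinary SHA-256 digest as a stand-in, and loadKnownHashDatabase() is a hypothetical helper.

```swift
import Foundation
import CryptoKit

// Hypothetical loader for a database of hashes of known CSAM images,
// as supplied by child-safety organizations. Returned empty here; a
// real system would ship it in an encrypted, non-enumerable form.
func loadKnownHashDatabase() -> Set<String> {
    return []
}

let knownHashes = loadKnownHashDatabase()

// Returns true if the photo's hash appears in the known database.
// Only a fixed-length digest is compared; the image content itself
// is never inspected directly.
func matchesKnownDatabase(_ photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}
```

A limitation of this exact-hash approach is also visible here: a cryptographic digest only matches byte-identical files, which is why the critics cited above noted that database matching can never flag newly created material.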
Apple delayed that part of its child safety features and then, last December, confirmed that it had quietly killed the project. Instead, Apple said, the company would focus on safety features that run on-device and protect children from predators, rather than developing a system that scans iCloud images.
Now Apple finds itself defending that decision, reiterating its previous rationale.
A child safety group called Heat Initiative says that it is organizing a campaign to pressure Apple to “detect, report, and remove” child sexual abuse imagery from iCloud. Apple responded to this development in a statement to Wired. The company essentially made the same argument it did last December: CSAM is awful and must be combated, but scanning online photos creates systems that can be abused to violate the privacy of all users.
“Scanning every user’s privately stored iCloud data would create new threat vectors for data thieves to find and exploit… It would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types.”
Erik Neuenschwander, Apple’s director of user privacy and child safety
In short, Apple is admitting (again) what privacy advocates said when the iCloud CSAM-scanning feature was first announced: there's simply no way to make it work without also creating systems that can imperil everyone's safety and privacy.
This is just the latest wrinkle in the age-old encryption debate. The only way to fully protect users' privacy is to encrypt data so that nobody other than the user or their intended recipient can "look into" it. This protects the innocent and criminals alike, so it is naturally opposed by law enforcement groups, intelligence agencies, and other organizations that each have their own reasons for wanting to search through user data.
Apple believes that preventing CSAM and other forms of child abuse is critically important, but that it must be done in a way that does not give Apple (or other parties) any means of viewing user data. On-device detection and blurring of nude imagery is one such feature, and Apple has been expanding it with OS updates over the last couple of years.