In August, Apple announced plans to build a scanning system into all its devices aimed at detecting child sexual abuse imagery, limiting its distribution, and identifying sexual predators. The company intended to “match” the images stored on those devices against databases of known child abuse material. Although Apple claimed the system was designed “with user privacy in mind,” virtually every organization working for digital privacy, along with numerous data protection experts, came out against it. That pressure has paid off.
In an update to its child protection policies published on Friday, Apple announced its intention to “delay” the rollout of the scanning system indefinitely. “Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time in the coming months to collect feedback and make improvements before launching these critically important child safety features,” the company explained.
Apple’s initial plan was to roll out this image review in the US and later extend it to the rest of the world. The system was not going to inspect images one by one; instead, an algorithm would analyze the “fingerprint” of images uploaded to iCloud, the company’s cloud service, and compare it against a database of known child abuse imagery.
If the algorithm detected signs of suspicion in a given user’s account, Apple’s plan was for the system to raise an alert for a human operator, who would confirm whether the stored images actually depicted child abuse and, if necessary, notify law enforcement.
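The flow described above — fingerprint each upload, compare against a database, alert a human reviewer past some threshold — can be sketched roughly as follows. This is a heavily simplified illustration, not Apple’s implementation: all names and the threshold value are hypothetical, and a plain cryptographic hash stands in for Apple’s perceptual “NeuralHash,” which (unlike an exact hash) is designed to tolerate small changes to an image.

```python
import hashlib

# Hypothetical database of fingerprints of known abuse imagery.
# In the real system this would be supplied by child-safety organizations.
KNOWN_FINGERPRINTS: set[str] = set()

# Hypothetical number of matches before a human operator is alerted.
ALERT_THRESHOLD = 3

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash: here, an exact SHA-256 digest."""
    return hashlib.sha256(image_bytes).hexdigest()

def account_flagged(uploaded_images: list[bytes]) -> bool:
    """Return True when enough uploads match the database that a
    human reviewer would be alerted to inspect the account."""
    matches = sum(
        1 for img in uploaded_images
        if fingerprint(img) in KNOWN_FINGERPRINTS
    )
    return matches >= ALERT_THRESHOLD
```

Note the design choice the article alludes to: because only fingerprints of *known* images are compared, the system never “sees” new photos, but it also cannot detect abuse material that is not already in the database.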
The company claimed it had trained the algorithm to have a “very low” false positive rate. Experts nonetheless raised two fundamental criticisms: on the one hand, the system opened a door into the content of users’ phones that could later be used for less well-intentioned purposes; on the other, offenders could bypass it simply by disabling image syncing with iCloud.
For now, Apple has decided to take the criticism seriously and postpone the rollout of these measures indefinitely.