Apple Withdraws Controversial Photo Scanning Plans


Last August, Apple detailed several new features intended to stop the spread of child sexual abuse material (CSAM). The backlash, from cryptographers to privacy advocates to Edward Snowden himself, was almost immediate, much of it tied to Apple’s decision not only to scan photos uploaded to iCloud for CSAM, but also to check for matches on your iPhone or iPad. After weeks of sustained criticism, Apple is standing down. At least for now.

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” the company said in a statement on Friday. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Apple has not provided further guidance on what form those improvements might take, or how the input process will work. But privacy advocates and security researchers are cautiously optimistic about the pause.

“I think this is a smart move by Apple,” said Alex Stamos, former chief security officer at Facebook and cofounder of the cybersecurity consulting firm Krebs Stamos Group. “There’s an incredibly complex set of trade-offs involved in this problem, and it was highly unlikely that Apple would arrive at an optimal solution without listening to a wide variety of equities.”

CSAM scanners work by generating cryptographic “hashes” of known abusive images, a sort of digital signature, and then combing through large amounts of data for matches. Many companies already do some form of this, including Apple for iCloud Mail. But in its plans to extend that scanning to iCloud Photos, the company proposed taking the additional step of checking those hashes on your device as well, if you have an iCloud account.
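To make the matching step concrete, here is a minimal, purely illustrative sketch in Python that checks files against a set of known hashes. It uses an ordinary cryptographic hash (SHA-256) and a hypothetical hash list; Apple’s actual design relies on a perceptual hash (NeuralHash) and an on-device private set intersection protocol, which this does not reproduce.

```python
import hashlib
from pathlib import Path

def sha256_of_file(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's bytes."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical set of hashes of known images. In the real system the list
# comes from NCMEC, and the hashes are perceptual (NeuralHash), not cryptographic.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def scan_directory(directory: Path) -> list:
    """Return the files under `directory` whose hashes appear in KNOWN_HASHES."""
    return [
        path
        for path in directory.rglob("*")
        if path.is_file() and sha256_of_file(path) in KNOWN_HASHES
    ]

if __name__ == "__main__":
    for match in scan_directory(Path("photos")):  # "photos" is a placeholder path
        print("match:", match)
```

The essential property is the same as in the deployed systems: the scanner never needs the original abusive images, only a list of their hashes.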

Introducing the ability to compare images on your phone against a set of known CSAM hashes, provided by the National Center for Missing and Exploited Children, immediately raised concerns that the tool could someday be put to other uses. “Apple would have deployed to everyone’s phone a CSAM-scanning feature that governments could, and would, subvert into a surveillance tool to make Apple search people’s phones for other material as well,” said Riana Pfefferkorn, a research scholar at the Stanford Internet Observatory.

Apple has fought multiple requests from the US government in the past to build a tool that would allow law enforcement to unlock and decrypt iOS devices. But the company has also made concessions in countries like China, where customer data resides on state-owned servers. At a time when lawmakers around the world are ramping up efforts to weaken encryption more broadly, the introduction of the CSAM tool felt especially fraught.

“They clearly feel this is politically challenging, which I think shows how untenable their ‘Apple will always refuse government pressure’ position is,” said Johns Hopkins University cryptographer Matthew Green. “If they feel they must scan, they should scan unencrypted files on their servers,” which is the standard practice for other companies, such as Facebook, which regularly scan for not only CSAM but also terrorist and other disallowed types of content. Green also suggested that Apple should make iCloud storage end-to-end encrypted, so that it cannot view those images even if it wanted to.

The controversy over Apple’s plans was also technical. Hashing algorithms can generate false positives, mistakenly identifying two images as matches even when they are not. Called “collisions,” those errors are especially worrying in the context of CSAM. Shortly after Apple’s announcement, researchers began finding collisions in the iOS “NeuralHash” algorithm Apple intended to use. Apple said at the time that the version of NeuralHash available to study was not exactly the same as the one that would be used in the scheme, and that the system was accurate. Collisions may also have no material impact in practice, said Paul Walsh, founder and CEO of the security firm MetaCert, given that Apple’s system requires 30 matching hashes before any alarms are sounded, after which human reviewers can tell what is CSAM and what is a false positive.
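To see why a 30-match threshold blunts the practical impact of individual collisions, here is a rough, illustrative calculation under a simplified assumption of independent per-image collisions; the library size and one-in-a-million collision rate below are hypothetical, not Apple’s published figures.

```python
from math import comb

def prob_at_least_k_matches(n_images: int, p_collision: float, k: int = 30) -> float:
    """Probability that at least k of n_images independently produce a false
    hash match, given a per-image collision probability p_collision.
    Simplified binomial model (computed via the complement); real photo
    libraries and perceptual hashes are not independent in this clean way."""
    p_fewer_than_k = sum(
        comb(n_images, m) * p_collision**m * (1 - p_collision) ** (n_images - m)
        for m in range(k)
    )
    return 1.0 - p_fewer_than_k

# Hypothetical numbers: a library of 10,000 photos and a one-in-a-million
# per-image collision rate. A single collision is plausible; 30 collisions
# in the same library is so unlikely it underflows to 0.0 in double precision.
print(prob_at_least_k_matches(10_000, 1e-6, k=1))   # ~0.01
print(prob_at_least_k_matches(10_000, 1e-6, k=30))  # 0.0
```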


