Apple backs off on controversial photo scan plans


In August, Apple detailed several new features intended to stop the spread of child sexual abuse material (CSAM). The backlash, from cryptographers to privacy advocates to Edward Snowden himself, was almost instantaneous, tied largely to Apple’s decision not only to scan iCloud photos for CSAM but also to check for matches on your iPhone or iPad. After weeks of sustained outcry, Apple is standing down. At least for now.

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and to limit the spread of child sexual abuse material,” the company said in a statement Friday. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take more time over the coming months to gather feedback and make improvements before releasing these critically important child safety features.”

Apple has not offered further guidance on what form those improvements might take, or how the input-gathering process might work. But privacy advocates and security researchers are cautiously optimistic about the pause.

“I think this is a smart move by Apple,” said Alex Stamos, former chief security officer at Facebook and co-founder of the cybersecurity consulting firm Krebs Stamos Group. “There is an incredibly complicated set of trade-offs involved in this problem, and it was highly unlikely that Apple was going to figure out an optimal solution without listening to a wide variety of equities.”

CSAM scanners work by generating cryptographic “hashes” of known abusive images, a kind of digital signature, and then combing through huge amounts of data for matches. Many companies already do this in one form or another, including Apple for iCloud Mail. But in its plan to extend that scanning to iCloud Photos, the company proposed taking the additional step of checking those hashes on your device as well, if you have an iCloud account.
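
To make the mechanism concrete, here is a minimal, purely illustrative Python sketch of hash-set matching. It uses an exact-match cryptographic hash (SHA-256) as a stand-in for Apple’s perceptual NeuralHash, and the known-hash set, file paths, and function names are hypothetical, not Apple’s implementation.

```python
# Illustrative sketch only: exact-match SHA-256 stands in for a perceptual
# hash like NeuralHash, and KNOWN_HASHES is a hypothetical placeholder for
# a curated database of digests (e.g., one supplied by NCMEC).
import hashlib
from pathlib import Path

KNOWN_HASHES = {
    # Placeholder digest (SHA-256 of an empty file), for illustration only.
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def file_digest(path: Path) -> str:
    """Return the hex SHA-256 digest of a file's contents."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def find_matches(paths: list[Path]) -> list[Path]:
    """Return the files whose digests appear in the known-hash set."""
    return [p for p in paths if file_digest(p) in KNOWN_HASHES]
```

The key difference in Apple’s proposal was where a check like this runs: on the company’s servers, as most providers do today, or on the user’s own device.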

The introduction of this ability to compare the images on your phone against a set of known CSAM hashes, provided by the National Center for Missing and Exploited Children, immediately raised concerns that the tool could one day be put to other uses. “Apple would have deployed a CSAM-scanning feature on everyone’s phone that governments could, and would, turn into a surveillance tool to make Apple search people’s phones for other content as well,” says Riana Pfefferkorn, a researcher at the Stanford Internet Observatory.

Apple has in the past resisted multiple requests from the U.S. government to build a tool that would let law enforcement unlock and decrypt iOS devices. But the company has also made concessions to countries like China, where customer data resides on state-owned servers. At a time when lawmakers around the world have stepped up efforts to undermine encryption more broadly, the introduction of the CSAM tool proved especially fraught.

“They clearly feel this is politically challenging, which I think shows how untenable their ‘Apple will always refuse government pressure’ position is,” said Johns Hopkins University cryptographer Matthew Green. “If they feel they need to scan, they should scan the unencrypted files on their servers,” which is standard practice at other companies, like Facebook, that regularly scan not only for CSAM but also for terrorist and other disallowed types of content. Green also suggests that Apple should make iCloud storage end-to-end encrypted, so that it could not view those images even if it wanted to.

The controversy over Apple’s plans was also technical. Hash algorithms can generate false positives, mistakenly identifying two images as matches even when they are not. Called “collisions,” these errors are especially worrying in the context of CSAM. Shortly after Apple’s announcement, researchers began finding collisions in the iOS “NeuralHash” algorithm Apple intended to use. Apple said at the time that the version of NeuralHash available for study was not exactly the same one that would be used in the program, and that the system was accurate. Collisions may also have little material impact in practice, says Paul Walsh, founder and CEO of the security firm MetaCert, given that Apple’s system requires 30 matching hashes before any alarms are triggered, at which point human reviewers would be able to tell what is CSAM and what is a false positive.
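
A minimal sketch of that threshold safeguard, assuming the 30-match figure cited above; the account identifiers and function names are hypothetical, not part of Apple’s system.

```python
# Sketch of the threshold safeguard described above: individual hash matches
# are not acted on; an account is only queued for human review once its match
# count reaches the cited threshold of 30. All names here are hypothetical.
MATCH_THRESHOLD = 30

def should_escalate(match_count: int, threshold: int = MATCH_THRESHOLD) -> bool:
    """True once an account has accumulated at least `threshold` matches."""
    return match_count >= threshold

def review_queue(match_counts: dict[str, int]) -> list[str]:
    """Given per-account match counts, return accounts flagged for human review."""
    return [acct for acct, n in match_counts.items() if should_escalate(n)]
```

The point of such a threshold is that a handful of random collisions never reaches a human reviewer, while a large cluster of matches does.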


