What Apple can do next to tackle child sexual abuse


In May 2019, Melissa Polinsky, director of Apple’s Global Investigation and Child Safety Team, faced investigators working on the UK’s inquiry into child sexual abuse. During two hours of questioning, Polinsky admitted Apple employed just six people on its global team responsible for investigating child abuse images. Polinsky also said the technology Apple used to search online for existing child abuse images was “effective.”

Fast forward two years, and Apple’s work to combat child sexual abuse material has come off the rails. On September 3, the company made a rare public U-turn, putting on hold plans to introduce a system that scans for known child sexual abuse material, or CSAM, on the iPhones and iPads of people in the United States. “We have decided to take more time over the coming months to gather feedback and make improvements before releasing these critically important child safety features,” Apple said in a statement, citing the “feedback” it had received.

So what does Apple do next? The company is unlikely to please everyone with whatever it does next, and the fallout from its plans has created an almighty mess. The technical complexity of Apple’s proposals has reduced some public discussion to blunt for-or-against statements, and inflammatory language has, in some cases, polarized the debate. The fallout comes as the European Commission prepares child protection legislation that could force tech companies to scan for CSAM.

“The move [for Apple] to do some sort of content review was long overdue,” says Victoria Baines, a cybersecurity expert who has worked at both Facebook and Europol on child safety investigations. Tech companies are required by US law to report any CSAM they find online to the National Center for Missing and Exploited Children (NCMEC), a US child safety nonprofit, but Apple has historically lagged behind its competitors.

In 2020, NCMEC received 21.7 million CSAM reports, up from 16.9 million in 2019. Facebook topped the 2020 list with 20.3 million reports; Google made 546,704; Dropbox 20,928; Twitter 65,062; Microsoft 96,776; and Snapchat 144,095. Apple made just 265 CSAM reports to NCMEC in 2020.

There are several “logical” reasons for the discrepancies, Baines says. Not all tech companies are built the same. Facebook, for example, is all about sharing and connecting with new people, whereas Apple’s main focus is its hardware, and most people use the company’s services to communicate with people they already know. Or, to put it more bluntly, nobody can search iMessage for children to send sexually explicit messages to. Another issue is detection: the number of reports a company sends to NCMEC partly reflects the effort it puts into finding CSAM. Better detection tools can also mean more abusive material is found, and some tech companies have done more than others to root out CSAM.

Detecting existing child sexual abuse content mostly involves scanning what people send or upload once that content reaches a company’s servers. Codes, called hashes, are generated for photos and videos and compared against existing hashes for previously identified child sexual abuse material. The hash lists are maintained by child protection organizations, such as NCMEC and the UK’s Internet Watch Foundation. When a positive match is found, tech companies can take action and report the finding to NCMEC. Most often the process is done using PhotoDNA, which was developed by Microsoft.
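In rough terms, the server-side workflow looks like the sketch below. This is a minimal illustration only: real systems use a perceptual hash such as PhotoDNA, which tolerates resizing and recompression, whereas the cryptographic hash here is just a stand-in, and the hash list, function names, and placeholder entries are all hypothetical.

```python
# Minimal sketch of server-side hash-list matching. SHA-256 stands in for a
# perceptual hash (e.g. PhotoDNA); the hash list below is a placeholder.
import hashlib

# Hypothetical list of hashes supplied by a child-safety organization.
KNOWN_CSAM_HASHES = {
    "placeholder-hash-1",
    "placeholder-hash-2",
}


def compute_hash(image_bytes: bytes) -> str:
    """Return a hex digest for an uploaded image (stand-in for a perceptual hash)."""
    return hashlib.sha256(image_bytes).hexdigest()


def check_upload(image_bytes: bytes) -> bool:
    """Compare an upload against the known-hash list.

    A True result is the kind of positive match that a provider would
    escalate for review and report to NCMEC.
    """
    return compute_hash(image_bytes) in KNOWN_CSAM_HASHES
```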

Apple’s plan to search for CSAM uploaded to iCloud flipped that approach and, using clever cryptography, moved part of the detection onto people’s phones. (Apple has scanned iCloud Mail for CSAM since 2019, but does not scan iCloud Photos or iCloud backups.) The proposal proved controversial for several reasons.
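The sketch below shows only the basic shape of that on-device approach, under heavy simplifying assumptions: it ignores the cryptography Apple described (NeuralHash, private set intersection, and threshold secret sharing), and the class name, hash function, and threshold value are all illustrative rather than Apple’s actual parameters.

```python
# Much-simplified sketch of on-device matching at upload time. SHA-256 again
# stands in for a perceptual hash; in Apple's proposal the hash list is
# blinded and matches are only revealed past a threshold. Values are illustrative.
import hashlib


class OnDeviceMatcher:
    def __init__(self, known_hashes: set, threshold: int = 30):
        self.known_hashes = known_hashes  # blinded in the real design
        self.threshold = threshold        # matches required before human review
        self.match_count = 0

    def process_upload(self, image_bytes: bytes) -> None:
        """Hash an image on-device before it is uploaded to cloud photo storage."""
        digest = hashlib.sha256(image_bytes).hexdigest()
        if digest in self.known_hashes:
            self.match_count += 1

    def review_needed(self) -> bool:
        """Only once the threshold is crossed would matches be surfaced for review."""
        return self.match_count >= self.threshold
```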


