Apple’s privacy mythology doesn’t match reality


In 2021, Apple has cast itself as the world’s privacy superhero. Its leadership insists that privacy “has been at the heart of our work … from the very beginning” and that it is a “fundamental human right.” Its new ads even boast that privacy and the iPhone are the same thing. This past spring, the rollout of a software update (iOS 14.5) that lets users say no to apps tracking their activity across the internet demonstrated something important: people choose privacy when they don’t have to struggle for control of their information. Now only 25 percent of users consent, whereas previously nearly 75 percent had consented by default to their information being used in targeted advertising. As Apple prepares to add more privacy protections in iOS 15, due out next month, it continues to present itself as a force that could slow the growth of Facebook, a paragon of surveillance capitalism. Unfortunately, Apple’s privacy promises don’t paint the full picture.

Perhaps the company’s most alarming privacy failing is also one of its most profitable: iCloud. For years, the cloud-based storage service has anchored hundreds of millions of Apple customers in its ecosystem, an online extension of their hard drives designed to effortlessly offload photos, movies, and other files to an invisible backup drive. Unfortunately, iCloud gives police nearly as easy access to all of those files.

In the past, Apple has insisted that it won’t weaken the security of its own devices to build in a backdoor. But on older devices, the door is already built. According to Apple’s law enforcement manual, anyone running iOS 7 or earlier is out of luck if they fall in the crosshairs of the police or ICE: with a simple warrant, Apple will unlock the phone. That might sound normal in Silicon Valley, but most tech giants’ CEOs haven’t claimed that warrants for their devices endanger “the data security of hundreds of millions of law-abiding people … [setting] a dangerous precedent that threatens everyone’s civil liberties.” As the manual itself notes, this unlocking service exists only because of security vulnerabilities that were addressed in later operating systems.

Since 2015, Apple has drawn the ire of the FBI and the Department of Justice with each new round of security improvements that make its devices too secure for even Apple to crack. But the dirty little secret of nearly all of Apple’s privacy promises is that there has always been a backdoor. Whether it’s iPhone data from Apple’s latest devices or iMessage data the company has long championed as “end-to-end encrypted,” all of that data is vulnerable when you use iCloud.

Apple’s simple design choice to retain the keys to iCloud’s encryption has complex consequences. It doesn’t do this for your iPhone (despite government pressure). It doesn’t do this for iMessage. Some of the benefits of making an exception for iCloud are clear: if Apple didn’t hold the keys, account holders who forgot their passwords would be out of luck, since truly secure cloud storage would mean the company itself was no better able than a random attacker to reset them. And yet retaining that power gives Apple the chilling ability to hand over your entire iCloud backup when ordered to.

iCloud data goes beyond photos and files to include location data, such as that from “Find My” or AirTags, Apple’s controversial new trackers. With just one court order, all of your Apple devices could be turned against you, transformed into a surveillance system. Apple could fix this, of course. Plenty of businesses offer secure file-sharing platforms. The Swiss firm Tresorit offers true end-to-end encryption for its cloud service: users see their files uploaded in real time and synced across multiple devices, but it is the users, not Tresorit, who hold the encryption keys. This means that if users forget their password, they also lose their files. But as long as providers have the power to recover or change passwords, they have the power to hand that information to the police.
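To make that trade-off concrete, here is a minimal sketch of the user-held-key model described above, written in Python with the widely used cryptography package. It illustrates the general pattern only, not Tresorit’s actual implementation; the function names and parameters are invented for this example.

```python
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def derive_key(password: str, salt: bytes) -> bytes:
    """Derive a symmetric key from the user's password, on the device itself."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=480_000)
    return base64.urlsafe_b64encode(kdf.derive(password.encode()))


def encrypt_for_upload(path: str, password: str) -> tuple[bytes, bytes]:
    """Encrypt a file client-side; the provider only ever receives ciphertext."""
    salt = os.urandom(16)  # random salt, stored alongside the ciphertext
    key = derive_key(password, salt)
    with open(path, "rb") as f:
        ciphertext = Fernet(key).encrypt(f.read())
    return salt, ciphertext  # upload both; the key never leaves the device
```

Because the key is derived from the password on the user’s device and never uploaded, the provider stores only ciphertext: it cannot reset a forgotten password without the data becoming unrecoverable, and by the same token it has nothing useful to hand over in response to a court order.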

The threat is only growing. As part of a new suite of content-moderation tools, Apple will scan iCloud uploads and iMessage communications for suspected child sexual abuse material (CSAM). Where the company once searched only photos uploaded to iCloud for suspected CSAM, the new tools can turn any photo or text you’ve sent or received against you. Thwarting CSAM is a noble goal, but the consequences could be dire for those wrongly accused when the AI fails. And even when the software works as intended, it could be deadly. As Harvard Law School instructor Kendra Albert noted on Twitter, these features will “get queer kids kicked out of their homes, beaten, or worse.” Software launched in the name of “child safety” could be a deadly threat to LGBTQ+ children with homophobic and transphobic parents. Equally frightening, the tools used to track CSAM today could easily be trained to flag political and religious content tomorrow.
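The content-agnostic nature of this kind of scanning is easier to see in code. Apple’s actual system uses a proprietary NeuralHash combined with private set intersection; the sketch below substitutes a simple difference hash (using the Pillow imaging library) to show the general client-side matching pattern. Everything here, from the function names to the distance threshold, is illustrative.

```python
from PIL import Image  # pip install Pillow


def dhash(path: str, size: int = 8) -> int:
    """A simple difference hash: visually similar images yield similar bits."""
    img = Image.open(path).convert("L").resize((size + 1, size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = pixels[row * (size + 1) + col]
            right = pixels[row * (size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits


def is_flagged(path: str, banned_hashes: set[int], max_distance: int = 4) -> bool:
    """Flag a photo if its hash is within max_distance bits of a database entry.
    The client has no way to inspect what the opaque hash database encodes."""
    h = dhash(path)
    return any(bin(h ^ b).count("1") <= max_distance for b in banned_hashes)
```

Nothing in the matching logic knows what the banned hashes represent: swap in a database of political or religious imagery and the same code flags that instead. And because perceptual hashes deliberately tolerate small differences, innocent images can collide with database entries, which is the false-positive risk described above.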




