Apple's Plan to "Think Different" About Encryption Opens a Backdoor to Your Private Life

oldschool

Thread author
Apple has announced impending changes to its operating systems that include new “protections for children” features in iCloud and iMessage. If you’ve spent any time following the Crypto Wars, you know what this means: Apple is planning to build a backdoor into its data storage system and its messaging system. Child exploitation is a serious problem, and Apple isn't the first tech company to bend its privacy-protective stance in an attempt to combat it. But that choice will come at a high price for overall user privacy. Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.

To say that we are disappointed by Apple’s plans is an understatement. Apple has historically been a champion of end-to-end encryption, for all of the same reasons that EFF has articulated time and time again. Apple’s compromise on end-to-end encryption may appease government agencies in the U.S. and abroad, but it is a shocking about-face for users who have relied on the company’s leadership in privacy and security.

There are two main features that the company is planning to install in every Apple device. One is a scanning feature that will scan all photos as they get uploaded into iCloud Photos to see if they match a photo in the database of known child sexual abuse material (CSAM) maintained by the National Center for Missing & Exploited Children (NCMEC). The other feature scans all iMessage images sent or received by child accounts—that is, accounts designated as owned by a minor—for sexually explicit material and, if the child is young enough, notifies the parent when these images are sent or received. This feature can be turned on or off by parents.

When Apple releases these “client-side scanning” functionalities, users of iCloud Photos, child users of iMessage, and anyone who talks to a minor through iMessage will have to carefully consider their privacy and security priorities in light of the changes, and possibly be unable to safely use what was, until this development, one of the preeminent encrypted messengers.

Apple Is Opening the Door to Broader Abuses

We’ve said it before, and we’ll say it again now: it’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses. All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.

Take the example of India, where recently passed rules include dangerous requirements for platforms to identify the origins of messages and pre-screen content. New laws in Ethiopia requiring content takedowns of “misinformation” within 24 hours may apply to messaging services. And many other countries—often those with authoritarian governments—have passed similar laws. Apple’s changes would enable such screening, takedown, and reporting in its end-to-end messaging. The abuse cases are easy to imagine: governments that outlaw homosexuality might require the classifier to be trained to restrict apparent LGBTQ+ content, or an authoritarian regime might demand the classifier be able to spot popular satirical images or protest flyers.

We’ve already seen this mission creep in action. One of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed to create a database of “terrorist” content that companies can contribute to and access for the purpose of banning such content. The database, managed by the Global Internet Forum to Counter Terrorism (GIFCT), is troublingly without external oversight, despite calls from civil society. While it’s therefore impossible to know whether the database has overreached, we do know that platforms regularly flag critical content as “terrorism,” including documentation of violence and repression, counterspeech, art, and satire.

Image Scanning on iCloud Photos: A Decrease in Privacy

Apple’s plan for scanning photos that get uploaded into iCloud Photos is similar in some ways to Microsoft’s PhotoDNA. The main product difference is that Apple’s scanning will happen on-device. The (unauditable) database of processed CSAM images will be distributed in the operating system (OS), the processed images transformed so that users cannot see what the image is, and matching done on those transformed images using private set intersection, where the device will not know whether a match has been found. This means that when the features are rolled out, a version of the NCMEC CSAM database will be uploaded onto every single iPhone. The result of the matching will be sent up to Apple, but Apple can only tell that matches were found once a sufficient number of photos have crossed a preset threshold. Once that threshold is reached, the photos in question will be sent to human reviewers within Apple, who determine whether the photos are in fact part of the CSAM database. If confirmed by the human reviewer, those photos will be sent to NCMEC, and the user’s account disabled.

Again, the bottom line here is that whatever privacy and security aspects are in the technical details, all photos uploaded to iCloud will be scanned. Make no mistake: this is a decrease in privacy for all iCloud Photos users, not an improvement. Currently, although Apple holds the keys to view photos stored in iCloud Photos, it does not scan these images. Civil liberties organizations have asked the company to remove its ability to do so. But Apple is choosing the opposite approach and giving itself more knowledge of users’ content.
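
To make the flow above concrete, here is a much-simplified Python sketch of on-device hash matching with a reporting threshold. It is only an illustration of the idea, not Apple's implementation: the real design relies on NeuralHash, private set intersection, and threshold secret sharing, none of which appear here, and every name in the sketch (image_hash, MATCH_THRESHOLD, scan_before_upload) is a hypothetical stand-in.

```python
# Much-simplified illustration of threshold-based, on-device hash matching.
# This is NOT Apple's design: the real system uses NeuralHash, private set
# intersection, and threshold secret sharing so that neither side learns about
# matches below the threshold. All names here are hypothetical stand-ins.

import hashlib

MATCH_THRESHOLD = 30  # hypothetical number of matches before any review is possible


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; SHA-256 is used only to keep the sketch runnable."""
    return hashlib.sha256(image_bytes).hexdigest()


def scan_before_upload(photos: list[bytes], known_hashes: set[str]) -> bool:
    """Count matches against the distributed hash database; flag only past the threshold."""
    matches = sum(1 for photo in photos if image_hash(photo) in known_hashes)
    return matches >= MATCH_THRESHOLD


if __name__ == "__main__":
    database = {image_hash(b"known-image-bytes")}
    upload_queue = [b"holiday-photo", b"known-image-bytes"]
    print(scan_before_upload(upload_queue, database))  # False: one match, below threshold
```

Even in this toy form, the article's point is visible: the matching logic and the hash database ship to the device, so what gets flagged is a policy decision, not a technical limit.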

Whatever Apple Calls It, It’s No Longer Secure Messaging

As a reminder, a secure messaging system is a system where no one but the user and their intended recipients can read the messages or otherwise analyze their contents to infer what they are talking about. Even though messages pass through a server, an end-to-end encrypted message will not allow the server to know the contents of a message. When that same server has a channel for revealing information about the contents of a significant portion of messages, that’s not end-to-end encryption. In this case, while Apple will never see the images sent or received by the user, it has still created the classifier that scans the images and provides the notifications to the parent. Therefore, it would now be possible for Apple to add new training data to the classifier sent to users’ devices or send notifications to a wider audience, easily censoring and chilling speech.
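
As a rough sketch of the argument above, the snippet below shows why a client-side classifier amounts to a side channel sitting next to the encryption: the plaintext is inspected on the device before it is encrypted, so a report can leave the supposedly end-to-end channel. The classifier, the notification hook, and the use of a simple Fernet key as the "session" are all hypothetical placeholders, not Apple's or any messenger's actual API.

```python
# Illustrative only: a client-side check runs on the plaintext *before*
# end-to-end encryption, creating a reporting path outside the encrypted
# channel. classifier_flags_explicit() and notify_parent() are hypothetical
# stand-ins; encryption uses the third-party 'cryptography' package.

from cryptography.fernet import Fernet


def classifier_flags_explicit(image: bytes) -> bool:
    """Placeholder for an on-device ML classifier (always returns False here)."""
    return False


def notify_parent(image: bytes) -> None:
    """Placeholder for the out-of-band notification path."""
    print(f"notification sent about a {len(image)}-byte image")


def send_image(image: bytes, session: Fernet) -> bytes:
    # The device sees the plaintext and may report on it...
    if classifier_flags_explicit(image):
        notify_parent(image)
    # ...before the ciphertext ever leaves for the recipient.
    return session.encrypt(image)


if __name__ == "__main__":
    session = Fernet(Fernet.generate_key())
    ciphertext = send_image(b"example image bytes", session)
    print(session.decrypt(ciphertext))
```

The recipient still gets a normally encrypted message; the privacy loss happens entirely on the sender's device, before encryption, which is exactly why the server never needs to break the cryptography.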

But even without such expansions, this system will give parents who do not have the best interests of their children in mind one more way to monitor and control them, limiting the internet’s potential for expanding the world of those whose lives would otherwise be restricted. And because family sharing plans may be organized by abusive partners, it's not a stretch to imagine using this feature as a form of stalkerware. People have the right to communicate privately without backdoors or censorship, including when those people are minors. Apple should make the right decision: keep these backdoors off of users’ devices.
 

The_King

Trillion-dollar companies can do what they want; not much is going to stop them at this point.

Let's all forget about the child labor incidents at the suppliers that make iPhones and praise them for protecting children now.
 

SpiderWeb

Honestly, you should NEVER ever trust any company or any cloud storage service. Double encrypt. If it's on an American cloud server, or if it travels over American fiber cables, the government can access it.

Btw Apple admits they have ALREADY been scanning your photos anyway lol
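
For anyone taking the "double encrypt" advice literally, below is a minimal sketch of encrypting a file locally before any cloud client ever sees it, using the third-party cryptography package. The file names and the separate key file are purely illustrative, and real key management (where you keep backup.key, how you back it up and rotate it) is the hard part this sketch leaves out.

```python
# Minimal sketch: encrypt locally, then upload only the ciphertext.
# Uses the third-party 'cryptography' package (pip install cryptography).
# Keep the key file OFF the cloud provider you are protecting against.

from pathlib import Path
from cryptography.fernet import Fernet

KEY_FILE = Path("backup.key")        # illustrative path
PLAINTEXT = Path("photos.zip")       # illustrative path; assumed to exist locally
CIPHERTEXT = Path("photos.zip.enc")  # this is what goes to the cloud


def load_or_create_key() -> bytes:
    """Reuse an existing key file, or generate and store a new one."""
    if KEY_FILE.exists():
        return KEY_FILE.read_bytes()
    key = Fernet.generate_key()
    KEY_FILE.write_bytes(key)
    return key


def encrypt_for_upload() -> None:
    """Write an encrypted copy of the plaintext file for uploading."""
    fernet = Fernet(load_or_create_key())
    CIPHERTEXT.write_bytes(fernet.encrypt(PLAINTEXT.read_bytes()))


if __name__ == "__main__":
    encrypt_for_upload()
    print(f"Upload {CIPHERTEXT} to the cloud; keep {KEY_FILE} local.")
```

The design point is simply that the cloud provider only ever receives ciphertext; whoever holds backup.key, and only they, can recover the photos.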
 

Gandalf_The_Grey

Also discussed here:
 

CyberTech


Apple scans iCloud Mail

However, Apple confirmed to me that it has been scanning outgoing and incoming iCloud Mail for CSAM attachments since 2019. Email is not encrypted, so scanning attachments as mail passes through Apple servers would be a trivial task.

Apple also indicated that it was doing some limited scanning of other data, but would not tell me what that was, except to suggest that it was on a tiny scale. It did tell me that the “other data” does not include iCloud backups.

Although Friedman’s statement sounds definitive – like it’s based on hard data – it’s now looking likely that it wasn’t. It’s our understanding that the total number of CSAM reports Apple makes each year is measured in the hundreds, meaning that email scanning would not provide any kind of evidence of a large-scale problem on Apple servers.
 

Gandalf_The_Grey

AdGuard Blog: Is there a price too high for children's safety? Our comment on Apple's new controversial feature
Why should people be worried about CSAM detection?
Let's sum up; the list won't be short.
  1. Possible mistakes of algorithms, with devastating consequences for lives and careers.
  2. Software bugs. Don't confuse this with point one: it is considered normal for robots to make mistakes at this stage of technological advancement. Actually, bugs are normal too; there is no software without them. But the price of a mistake varies. Bugs that lead to personal data leaks are usually among the most "expensive" ones.
  3. No transparency of the system (Apple is notorious for its reluctance to disclose how its stuff works). Your only option is to trust that Apple's intentions are good and that it values your privacy highly enough to protect it.
  4. Lack of faith. Why should we trust Apple after all of its (and others') privacy flaws and crimes?
  5. Possible extrapolation of the technology to analyzing and detecting other types of data. Under the umbrella of child protection, a lot of opportunities for companies to dive into your information could be introduced.
  6. Abuse possibilities. Could an enemy or a hacker plant a picture on your iPhone that matches one from a certain set (so convenient that there is a ready-made collection)?
This explains why we are pondering ways to give users control over the way Apple analyzes their photos. We ran a couple of polls on our social network accounts; the absolute majority of subscribers (about 85%) would like to be able to block CSAM scanning. It's hard to believe that all these people plan to abuse children; they just see the risks.

We are considering preventing the upload of the safety voucher to iCloud and blocking CSAM detection within AdGuard DNS. How can it be done? It depends on the way CSAM detection is implemented, and before we understand it in detail, we can promise nothing in particular.

Who knows what this database could turn into if Apple starts cooperating with some third parties? The database goes in, the voucher goes out. Each of these processes can be obstructed, but right now we are not ready to claim which solution is better or whether it can be easily incorporated into AdGuard DNS. Research and testing are required.

Otherwise, the only way would be to block iCloud access entirely. It's quite radical to do so for all AdGuard DNS users, but we can make it optional. The question arises: why not just disable iCloud on your device? And you know what, with how things are going, we actually recommend considering this.
 

Gandalf_The_Grey

Apple backs down on CSAM features, postpones launch
Following widespread criticism, Apple has announced that it will not be launching its child protection features as planned, and will instead "take additional time" to consult.

"Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material," said Apple in a statement.

"Based on feedback from customers, advocacy groups, researchers and others," it continues, "we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."
 

Gandalf_The_Grey

A juicier title:
After calling users dumb, Apple decides to delay roll-out of phone spying feature
Apple has decided to reconsider its plan to scan the smartphones of all users on behalf of the police for material linked to child sexual abuse.

Critics including the EFF had expressed concern that such a system could easily be abused for other crimes, with the definition of a crime being very fungible depending on location.

In turn, Apple accused users of not understanding the genius of its implementation and asked them to trust the company, with Apple’s Craig Federighi telling the Wall Street Journal:

“It’s really clear a lot of messages got jumbled pretty badly in terms of how things were understood. We wish that this would’ve come out a little more clearly for everyone because we feel very positive and strongly about what we’re doing.”

It seems however the concerted resistance has had some effect, as Apple has now decided to delay the roll-out of the feature, with Apple telling 9to5Mac:

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

It remains to be seen what changes Apple will come up with, but it seems likely that if it includes routinely scanning users’ private property without their consent, critics will remain unsatisfied.
 

Digmor Crusher

They will wait until the iPhone 13 comes out, sell a few million at ridiculous prices, and then slap on the scanning implementation.
 
