DDE_Server

Level 12
Verified
Apple's senior director of global privacy has confirmed that the company scans photos uploaded to iCloud for evidence of illegal activities such as child sexual abuse.
Jane Horvath made the admission while speaking at the Consumer Electronics Show (CES) 2020 conference in Las Vegas yesterday, according to The Telegraph.

While speaking at the tech conference, Horvath said that photographs that are backed up to Apple's online storage service are automatically screened for illicit content.
The company has been criticized by law enforcement agencies for allowing criminals to hide behind walls of protective encryption and for refusing to break into the phones of suspected wrongdoers.

Addressing this issue yesterday in Las Vegas, Horvath said that giving criminals nowhere to hide by scrapping encryption was "not the way we’re solving these issues" but added: "We are utilizing some technologies to help screen for child sexual abuse material."

Exactly which technologies Apple is using to screen its customers' digital photographs, and how long it has been doing so, were not specified.

On the company's website it states: "Apple is dedicated to protecting children throughout our ecosystem wherever our products are used, and we continue to support innovation in this space.

"As part of this commitment, Apple uses image matching technology to help find and report child exploitation. Much like spam filters in email, our systems use electronic signatures to find suspected child exploitation."

Companies including Facebook, Google, and Twitter check for images depicting the sexual abuse of minors with Microsoft’s PhotoDNA system. The system uses hashing technology to check images posted online against a database of previously identified photographs.
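The hash-lookup idea described above can be sketched in a few lines. This is a deliberately simplified illustration, not PhotoDNA: the function names, the signature set, and the sample byte strings are all invented, and SHA-256 stands in for PhotoDNA's proprietary perceptual hash, which (unlike a cryptographic hash) is designed to survive re-encoding.

```python
import hashlib

def signature(image_bytes: bytes) -> str:
    # Toy stand-in: a cryptographic hash matches identical bytes only.
    # PhotoDNA computes a perceptual hash instead, so similar images
    # produce similar signatures even after resizing or re-encoding.
    return hashlib.sha256(image_bytes).hexdigest()

# Database of signatures of previously identified photographs (invented data)
known_signatures = {signature(b"previously-identified-image")}

def is_flagged(upload: bytes) -> bool:
    # Compare the upload's signature against the database; the raw
    # image never needs to be viewed, only its hash.
    return signature(upload) in known_signatures

print(is_flagged(b"previously-identified-image"))  # True: exact copy
print(is_flagged(b"previously-identified-imagX"))  # False: one byte differs
```

Note how the second lookup fails: with a plain cryptographic hash, any modification at all changes the digest entirely, which is the brittleness a perceptual hash is built to avoid.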

Paul Bischoff, privacy advocate at Comparitech.com, believes that Apple may be doing something similar.

"Here's what I think is happening: Apple has access to a law enforcement database of child abuse photos. Apple hashes or encrypts those photos with each user's security key (password) to create unique signatures. If the signatures of any encrypted photos uploaded from an iPhone match the signatures from the database, then the photo is flagged and presumably reported to authorities.

"This allows Apple to match photos uploaded to the cloud against the law enforcement database without ever breaking encryption or actually viewing the photos."

If this is the system that Apple is using, then Bischoff warns it has a serious flaw.
He said: "If a child abuse photo is cropped or edited, if it's converted to another type of image file, or if it's compressed, then the encrypted signatures won't match up."
 

Umbra

Level 26
Verified
So much for:

Privacy
"Privacy is a fundamental human right. At Apple, it’s also one of our core values. Your devices are important to so many parts of your life. What you share from those experiences, and who you share it with, should be up to you. We design Apple products to protect your privacy and give you control over your information. It’s not always easy. But that’s the kind of innovation we believe in."
 

Cortex

Level 16
Verified
As a father of four I am strongly against child abuse, but I'm not sure scanning the photographs I upload is Apple's job - Although cloud backup is ever more popular, I'm a firm believer in using physical media, which I use for everything non-Apple, since others may be doing the same as Apple - Interesting how abuse is perhaps skillfully quoted as the reason, but what else does the land of the free feel is illegal? I trust few people, apart from the Cat :)
 

Threadripper

Level 8
So much for:

Privacy
"Privacy is a fundamental human right. At Apple, it’s also one of our core values. Your devices are important to so many parts of your life. What you share from those experiences, and who you share it with, should be up to you. We design Apple products to protect your privacy and give you control over your information. It’s not always easy. But that’s the kind of innovation we believe in."
They're comparing hashes of photos to hashes of... some not very nice photos, if you use iCloud. It's hardly an invasion of privacy. In fact, this is the best way to help combat child abuse compared to breaking encryption and backdoors which governments are suggesting - this could become the standard, or a compromise, to stop governments gagging companies and complaining about encryption. So we can hope.
 

Umbra

Level 26
Verified
They're comparing hashes of photos to hashes of... some not very nice photos, if you use iCloud. It's hardly an invasion of privacy. In fact, this is the best way to help combat child abuse compared to breaking encryption and backdoors which governments are suggesting - this could become the standard, or a compromise, to stop governments gagging companies and complaining about encryption. So we can hope.
And it will be easy for this to drift. It starts with photos, and in the end they may say they want to fight piracy, so they will check your docs and audio/video files.
Personally I don't mind much, I will probably never buy an iPhone.
 

Threadripper

Level 8
And it will be easy for this to drift. It starts with photos, and in the end they may say they want to fight piracy, so they will check your docs and audio/video files.
Personally I don't mind much, I will probably never buy an iPhone.
Google and Microsoft use similar tech :p

Don't get me wrong, I understand the point you're making and it's a real concern, but this (as it stands) specifically isn't a cause of concern.
 

Arequire

Level 24
Verified
Content Creator
I fail to see why this is a problem. It's stated in the article:
The system uses hashing technology to check images posted online against a database of previously identified photographs.

Paul Bischoff, privacy advocate at Comparitech.com, believes that Apple may be doing something similar.

"This allows Apple to match photos uploaded to the cloud against the law enforcement database without ever breaking encryption or actually viewing the photos."
So unless you have literal child porn sitting in your iCloud, the impact on user privacy is insignificant.

If this is the system that Apple is using, then Bischoff warns it has a serious flaw.
He said: "If a child abuse photo is cropped or edited, if it's converted to another type of image file, or if it's compressed, then the encrypted signatures won't match up."
This line is rather ignorant though and shows he didn't research the fact that the system is designed to be resistant to image editing.
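To illustrate what edit-resistance means: perceptual hashes derive their bits from coarse image structure rather than raw bytes, so small edits leave the signature mostly unchanged. Here is a toy "average hash" sketch (nothing like PhotoDNA's actual, proprietary algorithm; the pixel arrays are made up):

```python
def average_hash(pixels):
    # Toy perceptual hash: one bit per pixel, set when the pixel is
    # brighter than the image's mean. Similar images -> similar bits.
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    # Number of differing bits; small distance means "probably the same image"
    return sum(x != y for x, y in zip(a, b))

img = [[10, 200, 30], [220, 40, 250], [15, 230, 20]]
# Slight brightness tweak, standing in for a re-compressed copy
tweaked = [[12, 198, 33], [219, 41, 248], [14, 233, 22]]

print(hamming(average_hash(img), average_hash(tweaked)))  # 0: signatures match
```

Real systems use far larger hashes and smarter features, but the principle is the same: re-compression or minor edits nudge pixel values without flipping which regions are bright or dark, so the signature survives.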
 

Threadripper

Level 8
I am really not surprised by what's happening with Apple. They are making many mistakes in their business. It's hard to understand how so many people can think they are reliable and trust them. For example, many users reported that Apple was draining their battery life. And what did the company do? Committed. Sad...
You clearly haven't read any of the replies in this thread, and the article you linked refers to a bug they patched in a few days. Why on earth would any phone company intentionally worsen the batteries of their brand new phones?