Governments planned to misuse CSAM scanning tech even before Apple’s announcement

CyberTech

Thread author
Governments were already discussing how to misuse CSAM scanning technology even before Apple announced its plans, say security researchers.

The biggest concern raised when Apple said it would scan iPhones for child sexual abuse material (CSAM) was scope-creep: governments insisting the company scan for other types of images too, and there now seems to be good evidence for this …

Background
Apple insisted that it had solid safeguards in place to protect privacy and prevent misuse: it would only match images against known CSAM databases; it would check at least two databases and require an image to appear in both; action would be triggered only after 30 matching images; and there would be a manual review before law enforcement was alerted.
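
To make that matching policy concrete, here is a minimal sketch in Swift. It is purely illustrative and the type and property names are hypothetical: Apple's actual design relies on NeuralHash perceptual hashes, private set intersection, and threshold secret sharing, none of which is reproduced here. The sketch only shows the policy described above, i.e. a match must appear in every database and review is triggered only past a threshold.

```swift
// Illustrative sketch only. Apple's real system uses NeuralHash perceptual
// hashes, private set intersection, and threshold secret sharing; the plain
// string hashes and names below are hypothetical simplifications.

struct ScanningPolicy {
    let databases: [Set<String>]   // known-CSAM hash databases (at least two)
    let matchThreshold: Int        // e.g. 30 matching images before any action

    // A hash counts as a match only if it appears in every database,
    // so a single supplied list cannot trigger a match on its own.
    func isKnownImage(_ hash: String) -> Bool {
        databases.allSatisfy { $0.contains(hash) }
    }

    // Human review is triggered only once the number of matching
    // images reaches the threshold.
    func shouldTriggerReview(imageHashes: [String]) -> Bool {
        imageHashes.filter(isKnownImage).count >= matchThreshold
    }
}

// Toy usage: two overlapping databases and a threshold of 30 matches.
let policy = ScanningPolicy(
    databases: [["a1", "b2", "c3"], ["a1", "c3", "d4"]],
    matchThreshold: 30
)
print(policy.shouldTriggerReview(imageHashes: ["a1", "c3", "x9"]))  // false: only 2 matches
```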

However, I and others were quick to point out that such promises are impossible to keep.
As the company has said in regard to previous controversies, Apple complies with local laws in each of the countries in which it operates. Indeed, it recently gave in to blackmail before a new law even comes into force.
Any government could pass a law requiring tech companies to use their available capabilities (e.g., the CSAM scanning system) to look for images they say are associated with terrorism, or any type of political opposition.
