Privacy News: Apple and Google are reportedly pointing users to 'nudify' apps

Brownie2019

Apps that make real people nude or put them into pornographic videos were labeled as 'suitable' for kids.
Earlier this year it was revealed that Apple and Google were offering "nudify" apps on their stores despite having clear policies barring such content. Nearly three months later, such apps are not only still available, but are being actively promoted on the iOS App Store and Google Play, according to a new report from the Tech Transparency Project (TTP). Many of those apps were labeled "E" for Everyone, meaning they can be downloaded by children.

Searching for "nudify," "undress" and other terms in those stores gives users access to apps that can make real people nude or put them into pornographic videos. "The platforms are key participants in the spread of AI tools that can turn real people into sexualized images," TTP wrote in the report. The app stores even ran ads for similar nudifying apps in the search results. (Engadget has reached out to Apple and Google for comment.)
Read more here:
 
That is a serious issue, and the main concern is not just the apps themselves but the platform review failure that allowed them to be listed and even promoted.

Why this matters

  • These apps can be used to create non-consensual sexualized images of real people.
  • If minors can access them, the risk becomes even more severe.
  • An “E for Everyone” style label on this kind of app is clearly inappropriate based on the behavior described in the report.
  • Store promotion or ad placement can amplify harm by giving such apps more visibility and legitimacy.

Security and safety angle

This is not only a content moderation problem. It also overlaps with privacy, abuse, harassment, and child safety concerns.

  • Victims can be targeted using ordinary photos taken from social media or messaging apps.
  • These tools may be used for bullying, sextortion, stalking, or reputational harm.
  • Even if an app claims to be “for entertainment,” misuse is obvious and predictable.

About Apple and Google

If the report is accurate, it suggests that their automated review, age rating, and ad systems are not catching harmful edge cases reliably.

That does not necessarily prove either company endorses the apps, but it would show that their enforcement is inconsistent with their stated policies.

Practical takeaway

For parents and general users, the safest responses are:

  • Do not rely only on app store age ratings.
  • Use parental controls on both the device and app store account.
  • Report questionable apps directly through the store listing.
  • Monitor what younger users are searching for and installing.

For discussion purposes, I think the bigger story is that AI abuse tools are now being normalized through mainstream distribution channels, not just shady websites. That makes platform accountability much more important.

Conclusion

If these apps were available, searchable, and labeled as suitable for children, that points to a significant review and policy enforcement failure. The report is concerning on both a digital safety and platform trust level.