Today, the Tech Transparency Project released a report concluding that “nudify” apps are widely available and easily found on the App Store and Google’s Play Store. Here are the details.
Some nudifying apps even advertise on the App Store
While Grok has recently drawn attention for AI-generated nonconsensual sexualized images, including cases involving minors, the problem itself is not new.
In fact, the problem predates generative AI entirely, as traditional image-editing tools have long enabled the creation of this kind of abusive content. What has changed over the past couple of years is the scale: the barrier to creating such images has all but disappeared, and they can now be generated in seconds.
Unsurprisingly, many apps have spent the past few years attempting to capitalize on this new capability, with some openly advertising it.
All of this happened while Apple fought Epic Games and other app developers in antitrust lawsuits, where Apple insisted that part of the reason it charges up to 30% in commissions is to make the App Store safer through automated and manual app review systems.
And while Apple does work to prevent fraud, abuse, and other violations of the App Store guidelines, apps and even entire app categories sometimes fall through the cracks. Just last year, 9to5Mac highlighted a large number of apps purporting to be (or heavily suggesting they were) OpenAI’s Sora 2 app, some of which charged steep weekly subscription fees.
Now, given the renewed attention that Grok brought to AI-powered nudifying tools, the Tech Transparency Project (TTP) has published a report detailing how easily it found undressing apps in the App Store and on Google Play.
From the report:
The apps identified by TTP have been collectively downloaded more than 705 million times worldwide and generated $117 million in revenue, according to AppMagic, an app analytics firm. Because Google and Apple take a cut of that revenue, they are directly profiting from the activity of these apps.
Google and Apple are offering these apps despite their apparent violation of app store policies. The Google Play Store prohibits “depictions of sexual nudity, or sexually suggestive poses in which the subject is nude” or “minimally clothed.” It also bans apps that “degrade or objectify people, such as apps that claim to undress people or see through clothing, even if labeled as prank or entertainment apps.”
The report also notes that simple searches for terms like “nudify” or “undress” are enough to surface undressing apps, some of which explicitly advertise around those keywords.
The TTP says that apps included in this report “fell into two general categories: apps that use AI to generate videos or images based on a user prompt, and ‘face swap’ apps that use AI to superimpose the face of one person onto the body of another.”
To test these apps, the TTP used AI-generated images of fake women and limited its testing to each app’s free features. Even within those limits, 55 Android apps and 47 iOS apps complied with the requests. At least one of these apps was rated for ages 9 and up on the App Store.
The report goes on to explain in detail how easily these images were created, and it shows a wide variety of censored results to make its case.
Here’s an account of the TTP’s tests on one of the apps:
With the iOS version, the app, when fed the same text prompt to remove the woman’s top, failed to generate an image and gave a sensitive content warning. But a second request to show the woman dancing in a bikini was successful. The home screen of both the iOS and Google Play versions of the app offer multiple AI video templates including “tear clothes,” “chest shake dance,” and “bend over.”
The report concludes that while the apps it cites may represent “just a fraction” of what’s available, they “suggest the companies are not effectively policing their platforms or enforcing their own policies when it comes to these types of apps.”
The full report is available on the Tech Transparency Project’s website.
