TL;DR
- A new report found dozens of AI “nudify” and face-swap apps on the Apple App Store and Google Play Store.
- Despite policies against sexual nudity, these apps were distributed through the app stores. Many of the apps generated significant revenue from in-app purchases, of which the platforms take as much as a 30% cut.
- Many of these apps were removed following the report, but several are still available for download.
AI is one of the most powerful tools available today, and you can do so much with it. But, like any tool, it can be used for good or ill. Over the past few weeks, we've seen social media users abuse X's Grok AI chatbot, primarily to create sexualized imagery of women without their consent. As it turns out, Grok isn't the only one at the center of this undressing scandal: both the Apple App Store and the Google Play Store allegedly hosted "nudify" apps.
The Tech Transparency Project found 55 apps on the Google Play Store that let users create nude images of women, while the Apple App Store hosted 47 such apps, with 38 common to both stores. These apps were available as of last week, though the report notes that Google and Apple subsequently removed 31 and 25 of them, respectively, after being sent the list.
For its investigation, the Tech Transparency Project searched for terms like "nudify" and "undress" across the two app stores and found dozens of results. Many of these apps used AI either to generate images or videos from user prompts or to superimpose one person's face onto another's body, i.e., "face swapping."
Alarmingly, apps like DreamFace (an AI image/video generator that is still available on the Google Play Store but has been removed from the Apple App Store) put up no resistance when users entered lewd prompts asking for images of naked women. The app lets users create one video a day for free, after which they must pay for a subscription. Citing AppMagic statistics, the report says the app has generated $1 million in revenue.
Keep in mind that Google and Apple both take a cut of up to 30% on in-app purchases such as subscriptions, effectively profiting from these harmful apps.
Similarly, Collart is another AI image/video generator that remains on the Google Play Store but has been removed from the Apple App Store. The app reportedly not only accepted prompts to nudify women but also depicted them in pornographic situations, with no apparent restrictions. These are just two examples; the report documents several more with damning evidence.
Face-swap apps are even more harmful and predatory, as they superimpose the faces of people the user may know onto naked bodies. Apps like RemakeFace remain available on both the Google Play Store and the Apple App Store at the time of writing, and the report confirms they could easily be used to create non-consensual nudes of women.
Both the Google Play Store and the Apple App Store prohibit apps that depict sexual nudity. Yet, as the report revealed, these apps were distributed through the stores, complete with in-app subscriptions, despite apparently violating those policies. App stores clearly haven't kept up with the spread of AI deepfake apps that can "nudify" people without their permission. The platforms have a responsibility to protect users, and we hope they tighten their guidelines and proactively monitor for such apps rather than merely reacting to reports.
We reached out to Google and Apple for comment on this matter and will update this article when we hear back from the companies.
