A new report says dozens of AI-based “nudify” apps are still available on Google Play and Apple’s App Store, even though both platforms have rules against non-consensual sexual content. These apps use artificial intelligence to generate nude or sexualised images of people from regular photos, raising concerns around safety, consent, and enforcement on major app platforms.
What the report found
The findings come from a January review conducted by the Tech Transparency Project (TTP), which searched both app stores using terms such as “nudify” and “undress.” The group identified 55 such apps on Google Play and 47 on Apple’s App Store at the time of the review.
According to TTP, the apps fall into two broad categories. Some use AI to digitally remove clothing from images, while others rely on face-swap techniques, placing a person’s face onto explicit images. The testing was carried out using AI-generated images of clothed women.
TTP said the apps were clearly designed to generate non-consensual sexual content rather than simple image edits.
Apple and Google respond
After the report was shared with the companies, Apple said it removed 28 apps identified by TTP and warned other developers that their apps could be removed if they violated guidelines. Apple also confirmed that two apps were later restored after developers submitted updated versions that addressed policy concerns.
Google said it had suspended several apps named in the report for violating Play Store policies, but it has not disclosed how many were taken down, saying only that its review of the apps mentioned in the report is still underway.
Both companies maintain that apps claiming to undress people or generate explicit content are not allowed under their rules.
Downloads, revenue, and wider concerns
TTP estimates that the identified apps have collectively surpassed 700 million downloads worldwide and generated around $117 million in revenue, based on data from app analytics firm AppMagic. Apple and Google both take a commission on app store revenue, which has added to criticism of the enforcement gaps.
The report also says that a number of these apps are developed by companies based in China, raising additional concerns about how user data from these apps is collected, stored, and handled.
Spotlight on AI deepfakes
The findings follow recent attention on AI-generated deepfakes. In the past few weeks, several cases in which AI tools such as Grok were used to create explicit images without consent have been flagged, drawing responses from regulators and lawmakers in multiple countries.
TTP has said Apple and Google need to apply their existing app store rules more strictly. The report notes that despite repeated complaints and policy guidelines already in place, apps of this kind continue to appear on both platforms, suggesting gaps in enforcement.
