Meta Platforms Inc. announced today that it's suing a company that advertised generative artificial intelligence apps on Meta's platforms that enabled users to "nudify" people, creating fake nude images from clothed photos.
The lawsuit, filed in Hong Kong against Joy Timeline HK Ltd., alleges that the company repeatedly tried to circumvent Meta's ad review process to promote its CrushAI app, which lets users virtually undress anyone, to "see anyone naked," usually without that person's consent.
“This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it,” Meta said. “We’ll continue to take necessary steps, which could include legal action against those who abuse our platforms like this.”
The lawsuit came after the social media giant investigated so-called nudify apps, which, according to that investigation, are being sold and advertised across the internet. It has long been known that people abuse deepfake technology to create sexualized images, but nudify apps have so far proliferated without much pushback.
According to research by Alexios Mantzarlis, director of Cornell Tech's Security, Trust and Safety Initiative, CrushAI ran more than 8,000 ads on Facebook and Instagram between fall last year and the start of this year. Meta doesn't allow "nonconsensual intimate imagery" on its platforms, but it says such apps are difficult for its nudity detection technology to catch because their ads often appear benign, and because their operators register new domain names after their websites are blocked.
“We’ve worked with external experts and our own specialist teams to expand the list of safety-related terms, phrases, and emojis that our systems are trained to detect with these ads,” Meta said.
It's been reported that the apps are often used to virtually undress celebrities, but a more pressing concern is that they could be used on images of children. In 2024, two teens in Florida were arrested after they were found to have used generative AI to create sexualized images of their classmates. Nudify apps, which are reportedly still sold in app stores, make such abuse easy to carry out.
In the U.S., the Take It Down Act was passed in the House in April this year and was later signed into law by President Donald Trump. The law criminalizes the publication of nonconsensual sexually explicit deepfake videos and images, while making it easier for victims to have the images removed.