A coalition of parents and lawyers is preparing an onslaught of lawsuits against child gaming platform Roblox, following the filing of a federal case accusing the site of failing to protect children from sexual exploitation. It is not the first time the company has faced such a legal battle, but it may be the most daunting.
Last week, Louisiana Attorney General Liz Murrill filed a lawsuit accusing Roblox of “knowingly and intentionally” failing to institute appropriate safety protocols to protect young users from predatory behavior and child sexual abuse material (CSAM). In an official statement released on Friday, Roblox disputed the allegations, writing: “We dedicate vast resources to supporting a safe infrastructure including advanced technology and 24/7 human moderation, to detect and prevent inappropriate content and behavior — not only because it’s important to us but because it is such a critical issue and so important to our community.”
The first wave of the successive lawsuits is being filed on behalf of parents and their underage children by Dolman Law Group, which has already submitted five such complaints. One, filed in the Northern District of California, argues that the company’s moderation choices, including allegedly suggestive avatar customization options and a failure to catch usernames containing hidden pedophilic phrases, allowed sexually exploitative games and predatory behavior to proliferate on the platform.
Recent criticism of the site’s safety policies centers on the effectiveness of the platform’s new open-source AI moderation system, known as “Sentinel,” which is designed to proactively monitor chats and detect potential signs of child endangerment, including grooming. According to Roblox, Sentinel flagged around 1,200 attempts at child exploitation in the first half of 2025, each of which was reported to the National Center for Missing and Exploited Children (NCMEC).
A representative of the Dolman Law Group told Wired the firm is currently investigating around 300 additional allegations of sexual exploitation submitted to it; a working group of seven law firms is reportedly investigating hundreds more. Of the complaints under review by the Dolman Law Group, the majority concern children under the age of 16, and many involve young girls, the representative told the publication. Other law firms are reportedly investigating the messaging platform Discord as well.
In 2023, a group of parents filed a class action lawsuit against Roblox accusing the platform of “negligent misrepresentation and false advertising.” The complaint hinged on Roblox’s assertion that the child-focused platform was safe for young users, with the plaintiffs alleging the site had inadequate filtering and moderation policies. Other lawsuits have taken issue with purchases made through Roblox’s in-game currency, known as Robux, a system that has been likened to “illegal child gambling.”
In the wake of roughly a dozen other cases, Roblox began implementing a series of heightened security measures, including parental monitoring tools, in-game chat limitations, and even age verification for teen users.