A federal appeals court has revived a lawsuit alleging that X was negligent in its handling of child sexual abuse material (CSAM) and “slow-walked its response” to reports.
The Ninth US Circuit Court of Appeals, in San Francisco, said X must face a claim that it failed to promptly report a video containing explicit images of two underage boys to the National Center for Missing and Exploited Children (NCMEC). The lawsuit was originally filed in 2021, before Elon Musk’s 2022 takeover of the platform.
In one example, the plaintiffs claim that a 13-year-old boy was tricked into sharing explicit images of himself via Snapchat, which were then shared on X. These images were later reported by the boy and his mother, who supplied ID and complied with X’s reporting processes. However, X allegedly took nine days to take the offending content down, at which point it had already racked up over 167,000 views and 2,000 retweets and had circulated around his high school.
The suit alleged that X “passed on opportunities to develop better tools” to stop the spread of this type of content, “despite the inadequacy of its existing infrastructure.” It went on to claim that due to X’s business model, “it receives significant advertising revenue from hosting sought-after or popular posts, including those that depict pornographic content featuring minors.”
The plaintiffs also pointed to numerous limitations in X’s child abuse material reporting processes. These included not allowing users to report child pornography sent via private messaging, requiring anyone filing a report to supply an email address, and requiring them to have, and be logged into, a Twitter account.
Judge Danielle Forrest found that Section 230 of the federal Communications Decency Act, which protects online platforms from liability over user content, didn’t grant X immunity from the negligence claim once it learned about the offending material. However, the court found X immune from allegations that it benefited from sex trafficking.
Adult content represented a large share of X at the time of the case. A 2022 Reuters report, citing internal documents, found that 13% of all content on the platform was adult material. Meanwhile, problems with CSAM have persisted on X. According to X’s transparency report covering January through June 2024, 2.78 million accounts were deactivated for child sexual abuse material violations, though that figure fell to 132,155 in the October 2024 to March 2025 period.
X has yet to comment on the court’s decision.