Meta allegedly gave accounts engaged in the “trafficking of humans for sex” 16 chances before suspending them, according to testimony from the company’s former head of safety and well-being, Vaishnavi Jayakumar. The testimony — along with several other claims that Meta ignored problems if they increased engagement — surfaced in an unredacted court filing related to a social media child safety lawsuit filed by school districts across the country.
“That means that you could incur 16 violations for prostitution and sexual solicitation, and upon the 17th violation, your account would be suspended,” Jayakumar said during her deposition. She added that this “is a very high strike threshold” by “any measure across the industry,” according to the lawsuit. Internal documentation also “confirms” this policy, lawyers claim.
As reported by Time, the unredacted filing reveals other disturbing accusations, including that Meta “did not have a specific way” for Instagram users to report child sexual abuse material (CSAM) on the platform. When Jayakumar learned about this, she reportedly “raised this issue ‘multiple times,’ but was told that it would be too much work to build” such a reporting option and to review the reports it would generate.
The filing reveals multiple instances in which Meta is accused of downplaying the harms of its platforms in favor of boosting engagement. In 2019, Meta considered making all teen accounts private by default in order to prevent them from receiving unwanted messages; however, the company allegedly rejected the idea after the growth team found it would “likely smash engagement.” Meta started putting teens on Instagram into private accounts last year.
The lawsuit also claims that while Meta researchers found that hiding likes on posts would make users “significantly less likely to feel worse about themselves,” the company walked back its plan to do so after finding it was “pretty negative to FB metrics.” Meta is similarly accused of reinstating beauty filters in 2020, even after finding that they were “actively encouraging young girls into body dysmorphia.” Taking away the filters could have a “negative growth impact, simply because any restriction is likely to reduce engagement if people go elsewhere,” Meta said, the lawsuit alleges.
“We strongly disagree with these allegations, which rely on cherry-picked quotes and misinformed opinions in an attempt to present a deliberately misleading picture,” Meta spokesperson Andy Stone said in an emailed statement to The Verge. “The full record will show that for over a decade, we have listened to parents, researched issues that matter most, and made real changes to protect teens — like introducing Teen Accounts with built-in protections and providing parents with controls to manage their teens’ experiences.”
