Meta, the parent company of Facebook and Instagram, apologized Thursday for a technical error that resulted in some users’ Reels feeds being filled with graphic or violent content.
“We have fixed an error that caused some users to see content in their Instagram Reels feed that should not have been recommended,” a Meta spokesperson said in a statement shared with The Hill Thursday. “We apologize for the mistake.”
The videos, which appeared in some users’ Reels tabs, showed people apparently being shot to death or run over by vehicles, the Wall Street Journal reported. Some users continued to see the content even after setting Instagram’s “Sensitive Content Control” to its highest moderation level, CNBC reported.
Under its current policy, the platform typically removes most graphic content and places warning labels on potentially sensitive content. The most graphic prohibited content includes videos of dismemberment, “visible innards” such as exposed organs, burning or charred individuals, and throat-slitting.
Certain sadistic remarks and livestreams of capital punishment are also prohibited.
Users under 18 face additional restrictions on viewing such content, according to the company.
More than 15,000 reviewers around the world help detect and review potential violations on Facebook and Instagram, according to Meta’s website.
Most prohibited content is removed automatically by machine learning models, while potentially violating content is sent to review teams for further evaluation.
The temporary error was not related to any recent content policy changes, Meta confirmed.
Last month, Meta announced it would eliminate its fact-checking program and replace it with a “Community Notes” feature that relies on users to flag and add context to posts they believe contain misleading or false information.
The move sparked backlash from several tech advocacy groups concerned it would lead to an increase in misinformation and disinformation on the platforms.