Social media platforms operating in the UK “will have to assess the risk of any changes” made to content moderation and fact-checking policies, Ofcom told UKTN following a controversial strategy shift announced by Meta CEO Mark Zuckerberg.
The Meta boss revealed in a video announcement on Tuesday that the company would be abandoning the use of independent fact-checkers, which he claimed had become “too politically biased”.
He also said the company’s previously established content moderation system had led to “too much censorship”, which he in part put down to a changing political climate.
“Governments and legacy media have pushed to censor more and more. A lot of this is clearly political,” Zuckerberg said.
In statements reminiscent of X owner Elon Musk, Zuckerberg said he would be “restoring free expression” on his platforms in the face of political censorship.
The comments were likely in part a response to legislative moves from the UK to combat harmful content in digital spaces via the Online Safety Act.
Ofcom, the regulator tasked with enforcing the act, told UKTN it will be “gathering information from various companies in the coming months on a number of matters regarding their safety processes”.
A spokesperson for the media watchdog said: “We’ve already been speaking to many tech firms – including Meta – about what they do now and what they will need to do as new duties in the UK come into force.
“Meta informed us of these changes, some of which we understand apply to its US operations at this stage. In the UK, platforms will have to assess the risk of any changes they make and ensure they comply with their duties.”
The Online Safety Act – passed in late 2023 – has given Ofcom new powers to punish platforms that host harmful content, including hate speech, harassment and references to self-harm and suicide, or that fail to adequately protect children from viewing inappropriate material.
The regulator has not yet begun enforcing the act in full, with enforcement expected to begin in spring 2025.
The spokesperson added: “Providers have until 16 March to assess the risks of illegal harms on their sites and apps. They’ll then need to start implementing safety measures to mitigate those risks, and we’ve set out measures they can take.”
Through the act, Ofcom can fine companies up to 10% of their global revenue if proper safeguards are not implemented.
Iona Silverman, a partner at the law firm Freeths, said the announcement from Zuckerberg “appears to fly in the face of the Online Safety Act which requires tech companies to prevent UK users, particularly children, from accessing harmful content”.
Silverman added: “The Online Safety Act was passed with the best of intentions: to protect people. However, it seems doomed to fail unless the regulators can move more quickly.”
Many of Meta’s existing content moderation practices were shaped by former British deputy prime minister Nick Clegg in his role as president of global affairs at Meta. Zuckerberg’s announcement comes just days after Clegg announced his resignation.
Technology Minister Baroness Jones previously told UKTN in response to controversial comments made by Musk: “Whatever the owner of X thinks, they can’t ignore UK legislation, it’s simply not possible. They may not be happy, but the act is going to be the act and Ofcom is going to implement it.”