The Facebook founder Mark Zuckerberg announced on Tuesday his company, Meta, would be scrapping factcheckers in the US, accusing them of making biased decisions and saying he wanted to enable greater free speech. Meta uses third-party independent factcheckers around the world. Here, one of them who works for the Full Fact organisation in London, explains what they do and reacts to Zuckerberg’s “dispiriting” allegation.
I have been a factchecker at Full Fact in London for a year, investigating suspicious content on Facebook, X and in newspapers. Our bread and butter includes a lot of video disinformation about wars in the Middle East and Ukraine and AI-generated fake video clips of politicians, which are getting harder to disprove. Colleagues work on Covid disinformation, cancer cure hoaxes and there’s a lot of climate stuff as we’re seeing more hurricanes and wildfires.
As soon as we log on at 9am, we’re assigned something to look at. Our access to Meta’s systems shows us which posts are most likely to be false. Sometimes there are 10 or 15 different things that seem harmful and it can feel overwhelming. But we can’t check everything.
If a post is slightly wild but not harmful, like the AI-generated image of the pope in a gigantic white puffer coat, we might leave it. But if it’s a fake image of Mike Tyson holding a Palestine flag, we are more likely to tackle it. We pitch those in our morning meeting and then get commissioned to start checking.
Yesterday I was working on a deepfake video of Keir Starmer saying that a lot of the claims around Jimmy Savile were frivolous, and that’s why he didn’t prosecute at the time. It was getting a lot of engagement. Starmer’s mouth didn’t look right and it didn’t seem like something he would say. It looked like misinformation. I got straight into reverse image searching and discovered the video was taken from the Guardian in 2012. The original was much higher quality. You can see exactly what he’s saying compared with the one being shared on social media, which is very blurry around the mouth. We contacted the Guardian to verify the original and Downing Street for comment, and we can also get in touch with various media forensics and deepfake AI experts.
Some disinformation keeps resurfacing. There’s a particular video of a petrol station explosion in Yemen last year that is reused again and again, presented as either a bombing in Gaza or a Hezbollah attack on Israel.
The factchecker gathers examples of where it’s appeared on social media in the last 24 hours or so, often noting how many likes or shares it has, and sets out how we know it isn’t right.
There are two levels of review before we can attach a fact check to a Facebook post. Senior colleagues question every leap of logic we’ve made. If something is a repeat claim this process could be done in half a day. New, more complex cases could take nearly a week. The average is about a day. Sometimes the back and forth can feel frustrating, but we need to be as near to 100% sure as possible.
It was quite difficult to hear Mark Zuckerberg say on Tuesday that factcheckers were biased. So much of the work we do is about being impartial and that is instilled in us. It feels like a very important job where I am making a difference and providing good information for people.
It’s what I wanted to do in my previous work in local journalism, going down the rabbit hole, tracing the sources, but there wasn’t a lot of opportunity. It was a lot of churnalism. As a local reporter I was worried by the amount of conspiracy theories that people genuinely engage with and believe on Facebook groups and I felt powerless.
At the end of the working day it can be hard to switch off. I’m still thinking about how I can prove something as quickly as possible. Watching the shares and the likes on the piece of content going up all the time is a bit concerning. But when the fact check is published it’s a satisfying feeling.
Zuckerberg’s decision was dispiriting. We put a lot of work into this, and we think it really matters. But it has also given us a renewed determination to fight the good fight. Misinformation isn’t going to disappear. We’re still going to be here, working against it.