The big lie of artificial intelligence (AI) is one of omission – its egregious impact on both people and place.
The convenient narrative the Broligarchy sells to us is that it’s all about those clever machines, the ones and zeros of code and those super-clever algorithms that are creating a new world order from which all will benefit.
Just like they sold us the dream of our data in the cloud – a soft and floaty, ephemeral sort of nowhere that we could no more reach for than our real clouds in the sky.
The reality is that the “cloud” is decidedly physical, with a carbon footprint larger than the airline industry’s, manifested in water-guzzling datacentres and extractive mining in environmentally vulnerable locations.
It’s about the people
But let’s return to the people issue. Data labelling is the ground floor of AI, without which it simply could not exist – and it’s done exclusively by humans, not machines, in gruelling conditions, some in the US but primarily in the Philippines and the rest of the Global South.
ChatGPT relies on these humans to label its training data accurately. Critical though these people are to the business models of companies like Meta, Google and OpenAI, they are not considered “tech workers” or employees of those companies, but rather outsourced workers with none of the benefits normally enjoyed by the tech sector. Not for them the on-campus food, the free transport, the wellness days and the exorbitantly high salaries. No, many of these people work in their homes or in internet cafes when they can’t afford to own a computer themselves.
For years the tech giants have made sure these workers sign non-disclosure agreements forbidding them from speaking about the nature of their work and their terms and conditions. More recently, though, some of these content moderators and data labellers have started to agitate to improve their plight – as evidenced by an open letter from Kenyan moderators to President Biden at the end of his term in 2024. Their letter is a chilling insight into the conditions of their lives and work:
“As tech workers, we are proud to play a role in the development and training of world-class emerging technologies – and, crucially, making them safe. We do this work at great cost to our health, our lives and our families. US tech giants export their toughest and most dangerous jobs overseas. The work is mentally and emotionally draining. We scrub Facebook, TikTok and Instagram to ensure these important platforms do not become awash with hate speech and incitement to violence. We label images and text to train generative AI tools like ChatGPT for OpenAI. Our work involves watching murder and beheadings, child abuse and rape, pornography and bestiality, often for more than eight hours a day. Many of us do this work for less than $2 per hour.”
Hundreds of beheadings
The cleaners, a brilliant documentary by first-time directors Moritz Riesewieck and Hans Block, premiered at the Sundance film festival in 2018. It focuses on five content moderators in the Philippines, interspersing their human stories with the geopolitical implications of their work.
It’s tragic to hear one woman in the documentary calmly describe having witnessed hundreds of beheadings, as if that were something any human being should have to endure for $2 an hour. Fascinating, too, is the intercutting of the content moderators’ stories with cheesy Facebook internal videos showing Mark Zuckerberg warbling on about how brilliant Facebook is for the world.
And it’s not as if Zuckerberg is unaware of these appalling work practices and abuses. When asked what he thought about them, he replied, in a leaked recording of a company meeting: “Some of the reports, I think, are a little overdramatic”.
Writer and researcher Sarah T Roberts spent more than 10 years researching the lives and working conditions of content moderators around the globe, and her book, Behind the screen: Content moderation in the shadows of social media, documents their lives in detail. One of her interviewees, named only as “Melinda”, described her work as that of a “sin-eater”, a term that beautifully encapsulates the economics of a situation in which vastly wealthy tech owners in the US exploit the labour of the poorest people in the world:
“A figure of folklore thought to be prevalent in Wales and England, the sin-eater was seen as a purifier in that he or she would eat, through means of bread or ale passed over the corpse of a recently deceased person, that person’s mortal sins. Often a poor member of the community, the sin-eater would be compensated with financial remuneration for the taking on of another’s sins through this eating. In this way, those who were economically precarious were more likely to ingest the sins of others. It was this forgotten and tragic personage of British legend, and not with anyone from the tech or social media sectors, with whom Melinda strongly identified through her work as a commercial content moderator.”
Racial and social justice
The lack of racial and social justice is not confined to the Global South. This work is done in the shadows of big tech in the US too, and it’s far from a recent development. As far back as 2014, a videographer named Andrew Norman Wilson was fired by Google after he exposed the work of a shadow workforce tasked with scanning and digitising books for Google Books. In his YouTube video, he describes a hierarchy at Google delineated by different coloured badges.
Contractors such as himself wore red badges. Interns wore green badges. Full-time Google staffers wore white badges and were entitled to all the perks and advantages of working there.
But he discovered another class of worker – mostly people of colour, Latino and Black, working in the building next to his and wearing yellow badges. Their shifts started at 4am and lasted until 2.15pm, very much like factory workers’ – the antithesis of the freewheeling, work-from-home, come-in-when-you-like tech workers. These workers were not allowed to set foot anywhere else on the Google campus except the office they worked in, called Building 31459.
They were not given corporate backpacks, mobile devices or thumb drives, nor afforded any opportunity for social interaction with other Google employees. In fact, most Google employees didn’t know about the yellow-badge class in Building 31459. The workers were expressly forbidden from talking about their work to any other Googlers – something Wilson found out to his cost when he was fired for trying to film them leaving work and for trying to speak to some of them.
When he approached one worker, she quickly used her mobile to call Google security – as she had been instructed to do. Hours later, Wilson’s contract was abruptly terminated and he was escorted from the building. His dilemma, as a philosophy and politics major, was to understand why people of colour, marked out by their yellow badges, were treated as third-class citizens by Google while doing critical work for the company – work that went both unacknowledged and diminished.
AI hype
Authors Emily M Bender and Alex Hanna explore these issues in their book, The AI con: How to fight Big Tech’s hype and create the future we want, a deep dive into AI hype. According to the authors, “AI hype at work is designed to hide the moves employers make towards the degradation of jobs and the workplace behind the shiny claims of techno-optimism.
“It spins a vision of the future where automation means that people are freed up to do interesting work while machines take over tedious tasks. But when we look behind the curtain, we see instead that automation is being wielded as a cudgel against workers and trotted out as a cost-saving device for employers, leaving workers the tasks of cleaning up after it, tasks that are devalued and more poorly paid while also being less creative, engaging, and fulfilling – or at worst, outright traumatic to carry out.”
And while it’s convenient for us to blame Big Tech – and they certainly bear most of the burden – to be human is surely to care for our fellow human beings?
We cannot pretend that our shiny platforms, our memes and mementos, come free of charge: they come at a very real human cost to the people least able to defend their rights.
We cannot claim not to know this – it has been going on for more than two decades. The question you need to ask yourself is: what are you going to do about your part in it?