‘In the end, you feel blank’: India’s female workers watching hours of abusive content to train AI

News Room · Published 5 February 2026

On the veranda of her family’s home, with her laptop balanced on a mud slab built into the wall, Monsumi Murmu works from one of the few places where the mobile signal holds. The familiar sounds of domestic life come from inside the house: clinking utensils, footsteps, voices.

On her screen, a very different scene plays: a woman is pinned down by a group of men, the camera shakes, there is shouting and the sound of breathing. The video is so disturbing that Murmu speeds it up, but her job requires her to watch to the end.

Murmu, 26, is a content moderator for a global technology company, logging on from her village in India’s Jharkhand state. Her job is to classify images, videos and text that have been flagged by automated systems as possible violations of the platform’s rules.

On an average day, she views up to 800 videos and images, making judgments that train algorithms to recognise violence, abuse and harm.

Monsumi Murmu in the forest near her home. Photograph: Anuj Behal

This work sits at the core of machine learning’s recent breakthroughs, which rest on the fact that AI is only as good as the data it is trained on. In India, this labour is increasingly performed by women, who are part of a workforce often described as “ghost workers”.

“The first few months, I couldn’t sleep,” she says. “I would close my eyes and still see the screen loading.” Images followed her into her dreams: of fatal accidents, of losing family members, of sexual violence she could not stop or escape. On those nights, she says, her mother would wake and sit with her.

Now, she says, the images no longer shock her the way they once did. “In the end, you don’t feel disturbed – you feel blank.” There are still some nights, she says, when the dreams return. “That’s when you know the job has done something to you.”

Researchers say this emotional numbing – followed by delayed psychological fallout – is a defining feature of content moderation work. “There may be moderators who escape psychological harm, but I’ve yet to see evidence of that,” says Milagros Miceli, a sociologist leading the Data Workers’ Inquiry, a project investigating the roles of workers in AI.

“In terms of risk,” she says, “content moderation belongs in the category of dangerous work, comparable to any lethal industry.”

Studies indicate content moderation triggers lasting cognitive and emotional strain, often resulting in behavioural changes such as heightened vigilance. Workers report intrusive thoughts, anxiety and sleep disturbances.

A study of content moderators published last December, which included workers in India, identified traumatic stress as the most pronounced psychological risk. The study found that even where workplace interventions and support mechanisms existed, significant levels of secondary trauma persisted.

A slab extending from the mud wall of her house serves as Murmu’s desk. She uses a secondhand laptop to do content moderation work. Photograph: Anuj Behal

As early as 2021, an estimated 70,000 people in India were working in data annotation, a market then valued at about $250m (£180m), according to the country’s IT industry body Nasscom. About 60% of revenues came from the US, while only 10% came from India.

About 80% of data-annotation and content-moderation workers are drawn from rural, semi-rural or marginalised backgrounds. Firms deliberately operate from smaller cities and towns, where rents and labour costs are lower and a growing pool of first-generation graduates is seeking work.

Improvements in internet connectivity have made it possible to plug these locations directly into global AI supply chains, without relocating workers to cities.

Women form half or more of this workforce. Companies see women as reliable, detail-oriented and more likely to accept home-based or contract work regarded as “safe” or “respectable”. These jobs offer rare access to income without migration.

A sizeable number of workers in these hubs come from Dalit and Adivasi (tribal) communities. For many of them, digital work of any kind represents an upward shift: cleaner, more regular and better-paid work than agricultural labour or mining.

A data annotation office in Ranchi, Jharkhand. Tech firms often set up offices in smaller cities. Photograph: Anuj Behal

But working from or close to home can also reinforce women’s marginal position, according to Priyam Vadaliya, a researcher working on AI and data labour, formerly with the Bengaluru-based Aapti Institute.

“The work’s respectability, and the fact that it arrives at the doorstep as a rare source of paid employment, often creates an expectation of gratitude,” she says. “That expectation can discourage workers from questioning the psychological harm it causes.”

Raina Singh was 24 when she took up data-annotation work. A recent graduate, she had planned to teach, but felt she needed the certainty of a monthly income before she could afford to pursue it.

She returned to her home town of Bareilly in Uttar Pradesh and each morning logged on from her bedroom, working through a third-party firm contracted for global technology platforms. The pay – about £330 a month – seemed reasonable. The job description was vague, but the work felt manageable.

Her initial assignments involved text-based tasks: screening short messages, flagging spam, identifying scam-like language. “It didn’t feel alarming,” she says. “Just dull. But there was something exciting too. I felt like I was working behind the AI. For my friends, AI was just ChatGPT. I was seeing what makes it work.”

But about six months in, the assignments changed. Without notice, Singh was moved to a new project tied to an adult entertainment platform. Her task was to flag and remove content involving child sexual abuse.

“I had never imagined this would be part of the job,” she says. The material was graphic and relentless. When she raised concerns with her manager, she recalls being told: “This is God’s work – you’re keeping children safe.”

Raina Singh working on her laptop: ‘It didn’t feel alarming, just dull. But there was something exciting too.’ Photograph: Anuj Behal

Soon after, the task shifted again. Singh and six others on her team were instructed to categorise pornographic content. “I can’t even count how much porn I was exposed to,” she says. “It was constant, hour after hour.”

The work affected her personal life. “The idea of sex started to disgust me,” she says. She withdrew from intimacy and felt increasingly disconnected from her partner.

When Singh complained, the response was blunt: “Your contract says data annotation – this is data annotation.” She left the job, but a year on, she says, the thought of sex can still trigger nausea or dissociation. “Sometimes, when I’m with my partner, I feel like a stranger in my own body. I want closeness, but my mind keeps pulling away.”

Vadaliya says job listings rarely explain what the work actually involves. “People are hired under ambiguous labels, but only after contracts are signed and training begins do they realise what the actual work is.”

Remote and part-time roles are promoted aggressively online as “easy money” or “zero-investment” opportunities, and circulated through YouTube videos, LinkedIn posts, Telegram channels and influencer-led tutorials that frame the work as flexible, low-skilled and safe.

Hyderabad is home to India’s AI industry – far removed from the scattered rural locations where data is actually labelled. Photograph: Anuj Behal

The Guardian spoke to eight data-annotation and content-moderation companies in India. Only two said they provided psychological support to workers; the rest argued that the work was not demanding enough to require mental healthcare.

Vadaliya says that where there is support, the individual has to seek it out, shifting the burden of care on to workers. “It ignores the reality that many data workers, especially those coming from remote or marginalised backgrounds, may not even have the language to articulate what they are experiencing,” she says.

The absence of legal recognition of psychological harm in India’s labour laws, she adds, also leaves workers without meaningful protections.

Monsumi Murmu walks in the forest to help deal with the stresses of work. ‘I sit under the open sky and try to notice the quiet around me.’ Photograph: Anuj Behal

The psychological toll is intensified by isolation. Content moderators and data workers are bound by strict non-disclosure agreements (NDAs) that bar them from speaking about their work, even with family and friends. Violating NDAs can lead to termination or legal action.

Murmu feared that if her family understood her job, then she, like many other girls in her village, would be forced out of paid employment and into marriage.

With just four months left on her contract, which pays about £260 a month, the spectre of unemployment keeps her from flagging concerns about her mental health. “Finding another job worries me more than the work itself,” she says.

In the meantime, she has found ways to live with the distress. “I go for long walks into the forest. I sit under the open sky and try to notice the quiet around me.”

Some days, she collects mineral stones from the land near her home or paints traditional geometric patterns on the walls of the house. “I don’t know if it really fixes anything,” says Murmu. “But I feel a little better.”

This article was amended on 5 February 2026. The top image was replaced with one that better reflected the editorial content.
