‘In the end, you feel blank’: India’s female workers watching hours of abusive content to train AI
In a quiet village in Jharkhand, India, 26-year-old Monsumi Murmu sits on her family’s veranda, laptop balanced on a mud slab built into the wall—one of the few spots where the mobile signal holds steady. Around her, the familiar sounds of rural life echo: clinking utensils, footsteps, voices. But on her screen unfolds a very different reality.
A woman pinned down by a group of men. The camera shakes. Shouting. Heavy breathing. The video is so disturbing that Murmu speeds it up—yet her job requires her to watch until the end.
Murmu is a content moderator for a global technology company, working from her village in one of India’s most remote states. Her daily task: classify images, videos, and text flagged by automated systems as potential violations of platform rules. On an average day, she reviews up to 800 pieces of content, making split-second judgments that train algorithms to recognize violence, abuse, and harm.
This work sits at the core of machine learning’s recent breakthroughs. AI is only as good as the data it’s trained on, and in India, this labor is increasingly performed by women—part of a workforce often described as “ghost workers,” invisible hands that make the internet function.
“The first few months, I couldn’t sleep,” Murmu recalls. “I would close my eyes and still see the screen loading.” The images followed her into her dreams: fatal accidents, the loss of family members, sexual violence she could neither stop nor escape. On those nights, her mother would wake and sit with her.
Now, she says, the images no longer shock her the way they once did. “By the end, you don’t feel disturbed—you feel blank.” There are still some nights when the dreams return. “That’s when you know the job has done something to you.”
Researchers say this emotional numbing—followed by delayed psychological fallout—is a defining feature of content moderation work. “There may be moderators who escape psychological harm, but I’ve yet to see evidence of that,” says Milagros Miceli, a sociologist leading the Data Workers’ Inquiry, a project investigating the roles of workers in AI.
“In terms of risk,” she says, “content moderation belongs in the category of dangerous work, comparable to any lethal industry.”
Studies indicate content moderation triggers lasting cognitive and emotional strain, often resulting in behavioral changes such as heightened vigilance. Workers report intrusive thoughts, anxiety, and sleep disturbances. A study of content moderators published last December, which included workers in India, identified traumatic stress as the most pronounced psychological risk. The study found that even where workplace interventions and support mechanisms existed, significant levels of secondary trauma persisted.
By 2021, an estimated 70,000 people in India were working in data annotation, an industry then valued at about $250 million, according to the country’s IT industry body Nasscom. About 60% of revenues came from the US, while only 10% came from India.
About 80% of data-annotation and content-moderation workers are drawn from rural, semi-rural, or marginalized backgrounds. Firms deliberately operate from smaller cities and towns, where rents and labor costs are lower and a growing pool of first-generation graduates is seeking work.
Improvements in internet connectivity have made it possible to plug these locations directly into global AI supply chains, without relocating workers to cities.
Women form half or more of this workforce. Companies view women as reliable, detail-oriented, and more likely to accept home-based or contract work regarded as “safe” or “respectable.” These jobs offer rare access to income without migration.
A sizeable number of workers in these hubs come from Dalit and Adivasi (tribal) communities. For many of them, digital work of any kind represents an upward shift—cleaner, more regular, and better-paid jobs than agricultural labor or mining.
But working from or close to home can also reinforce women’s marginal position, according to Priyam Vadaliya, a researcher working on AI and data labor, formerly with the Bengaluru-based Aapti Institute.
“The work’s respectability, and the fact that it arrives at the doorstep as a rare source of paid employment, often creates an expectation of gratitude,” she says. “That expectation can discourage workers from questioning the psychological harm it causes.”
Raina Singh was 24 when she took up data-annotation work. A recent graduate, she had planned to teach, but the certainty of a monthly income felt necessary before she could afford to pursue it.
She returned to her hometown of Bareilly in Uttar Pradesh and each morning logged on from her bedroom, working through a third-party firm contracted for global technology platforms. The pay—about £330 a month—seemed reasonable. The job description was vague, but the work felt manageable.
Her initial assignments involved text-based tasks: screening short messages, flagging spam, identifying scam-like language. “It didn’t feel alarming,” she says. “Just dull. But there was something exciting too. I felt like I was working behind the AI. For my friends, AI was just ChatGPT. I was seeing what makes it work.”
But about six months in, the assignments changed. Without notice, Singh was moved to a new project tied to an adult entertainment platform. Her task was to flag and remove content involving child sexual abuse.
“I had never imagined this would be part of the job,” she says. The material was graphic and relentless. When she raised concerns with her manager, she recalls being told: “This is God’s work—you’re keeping children safe.”
Soon after, the task shifted again. Singh and six others on her team were instructed to categorize pornographic content. “I can’t even count how much porn I was exposed to,” she says. “It was constant, hour after hour.”
The work affected her personal life. “The idea of sex started to disgust me,” she says. She withdrew from intimacy and felt increasingly disconnected from her partner.
When Singh complained, the response was blunt: “Your contract says data annotation—this is data annotation.” She left the job, but a year on, she says the thought of sex can trigger a sense of nausea or dissociation. “Sometimes, when I’m with my partner, I feel like a stranger in my own body. I want closeness, but my mind keeps pulling away.”
Vadaliya says job listings rarely explain what the work actually involves. “People are hired under ambiguous labels, but only after contracts are signed and training begins do they realize what the actual work is.”
Remote and part-time roles are promoted aggressively online as “easy money” or “zero-investment” opportunities, and circulated through YouTube videos, LinkedIn posts, Telegram channels, and influencer-led tutorials that frame the work as flexible, low-skilled, and safe.
The Guardian spoke to eight data-annotation and content-moderation companies in India. Only two said they provided psychological support to workers; the rest argued that the work was not demanding enough to require mental healthcare.
Vadaliya says that where there is support, the individual has to seek it out, shifting the burden of care onto workers. “It ignores the reality that many data workers, especially those coming from remote or marginalized backgrounds, may not even have the language to articulate what they are experiencing,” she says.
The absence of legal recognition of psychological harm in India’s labor laws, she adds, also leaves workers without meaningful protections.
The psychological toll is intensified by isolation. Content moderators and data workers are bound by strict non-disclosure agreements (NDAs) that bar them from speaking about their work, even with family and friends. Violating NDAs can lead to termination or legal action.
Murmu feared that if her family understood her job, she, like many other girls in her village, would be forced out of paid employment and into marriage.
With just four months left on her contract, which pays about £260 a month, the specter of unemployment keeps her from flagging concerns about her mental health. “Finding another job worries me more than the work itself,” she says.
In the meantime, she has found ways to live with the distress. “I go for long walks into the forest. I sit under the open sky and try to notice the quiet around me.”
Some days, she collects mineral stones from the land near her home or paints traditional geometric patterns on the walls of the house. “I don’t know if it really fixes anything,” says Murmu. “But I feel a little better.”