Data Workers Fight Back

February 21, 2026

by Adele

Worldwide, over 500,000 content moderators work for global internet corporations. On platforms, especially social media, they view and evaluate images, videos, and text that users and automated systems have flagged as possibly violating the rules. Workers can delete violent, abusive, obscene, or offensive content. They also classify it to train the algorithms that recognize harmful imagery. Social media platforms use these algorithms, a form of artificial intelligence (AI), to determine how content is displayed to users. The dictionary defines AI as the ability of a digital computer to perform tasks commonly associated with intelligent beings. The idea is for AI to eventually perform content moderation and other tasks without human help, but the extent to which this is possible is controversial.

AN OPPRESSIVE OCCUPATION

[Image caption: Content moderation is performed by an increasingly female and young adult workforce in low-income countries, especially in the Global South. CC BY 4.0]

There are data workers training AI for numerous uses ranging from medicine to engineering. Content moderation, as a specific type of data work, is performed by an increasingly female and young adult workforce in low-income countries, especially in the Global South. These include India, the Philippines, and countries in Latin America, the Middle East, and Africa. These workers experience extreme post-traumatic stress that impacts their lives long after their employment ends. Workers sign non-disclosure agreements forbidding them to discuss the content with family or friends. This is an obstacle both to organizing and to informing the public about their oppression.

Milagros Miceli, a sociologist, created and runs the Data Workers’ Inquiry (DWI). Its website says it is a “joint project of DAIR Institut, Weizenbaum Institut, and TU Berlin.” DWI “brings together experts in sociology, political science, philosophy, and labor studies to uncover the hidden labor behind AI, to challenge the status quo and push for change.” It “is a global, radically participatory research initiative spanning nine countries across five continents. Here, data workers themselves become community researchers, identifying urgent issues, formulating their own questions, and choosing the formats that best tell their stories: zines, documentaries, comics, essays, podcasts, and animations can be downloaded from the project repository.”

DWI adapts Karl Marx’s 1880 Workers’ Inquiry to data workers. The original, a questionnaire of 100 questions, was a tool for industrial and agricultural workers to describe their work and their oppression. The answers were published and used to help workers organize for change. DWI’s website includes workers’ articles describing harrowing working conditions as well as the impact of constantly viewing violent material. These are also described in Miceli’s 2024 documentary “Humans in the Loop” and her 2025 study on the psychological effects of this job on workers.

‘EXPLOITATION MASQUERADES AS OPPORTUNITY’

Fasica Berhane Gebrekidan, a DWI member, was illegally fired for starting a union. After she described the psychologically damaging work for a television documentary, people contacted her asking how to be hired. She stated, “That’s how desperate joblessness in Africa has become, so dire that people are willing to risk their wellbeing for a paycheck. When unemployment hits 40% in Africa, people don’t hear warnings, they hear salaries in USD. That’s the calculus of late-stage capitalism: exploitation masquerades as opportunity.”

In the past few years, content moderators and other data workers have organized unions despite tech companies’ attempts to stop them. They use the same internet technologies to inform the public through websites, media, and newsletters. New social justice organizations like DWI and Foxglove, composed of academics and legal experts, do the same.

Their activism exposes new issues. Moderators describe management pressure not to classify material showing child abuse and trafficking as harmful. Social media content incites violence worldwide, including in wars. A petition before the Kenyan High Court challenges Meta/Facebook for allowing incitement of violence during the war in Ethiopia.

Content moderators demand acknowledgement that their job is hazardous, just as other work can be physically dangerous. Along with higher pay, they want psychological counseling. Some view their work as necessary to protect their communities from harmful images. Others emphasize the trauma that affluent countries inflict on them and their communities.

The new computer technology creates new types of work and new forms of oppression. Workers still fight to control their work and how it is performed. They still find community and new solutions by organizing. The work we perceive as performed by computers is often performed by humans. It is important to listen to their experiences and keep up with scientific developments that affect all our lives.

2 thoughts on “Data Workers Fight Back”

  1. It’s ridiculous that data workers would have a gag order against discussing the content they have to moderate.
    This article revealed a new angle of global capitalism for me. Tech companies are booming (or bubbling) on the stock market and they go to the global South to employ the most vulnerable workers and pay them cheaply to do the daily grind work to keep social media functioning. And workers are eager to take these jobs to keep up with the prices of basic goods needed to remain fed.

  2. Yes, I’m sad to say I once again learned more about the horrible things people do to other people while writing this. And, also learned how capitalism makes the whole thing even worse. Then, their managers retaliate against the workers and try to stop them from talking. A good thing in all of this is that the workers are organizing and putting Marx’s ideas into practice.
