Content moderators are organizing against Big Tech

Content moderators who comb through harmful material uploaded to online platforms have formed a global trade union alliance in a bid to improve working conditions. The Global Trade Union Alliance of Content Moderators (GTUACM), announced today in Nairobi, Kenya, says it aims to “hold Big Tech responsible” for failing to address workers’ issues like low wages, trauma, and lack of union representation across the industry.

Companies like Meta, ByteDance, and Alphabet often outsource content moderation on their platforms to contract workers. The job requires these workers to analyze and flag violent videos, hate speech, child-abuse imagery, and other harmful content. GTUACM says that many moderators in the industry experience “depression, post-traumatic stress disorder, suicidal ideation, and severe mental health consequences” from being exposed to such content without adequate support. Workers also often face unrealistic performance targets, precarious employment, and fear of punishment for speaking out about these issues.

“The pressure to review thousands of horrific videos each day – beheadings, child abuse, torture – takes a devastating toll on our mental health, but it’s not the only source of strain. Precarious contracts and constant surveillance at work add more stress,” said Michał Szmagaj, a former Meta content moderator who is now helping workers to unionize in Poland. “We need stable employment, fair treatment, and real access to mental health support during work hours.”

GTUACM says it aims to provide a global platform for bargaining with tech companies, as well as coordinating collective campaigns and researching occupational health. Content moderators will join the alliance through their trade unions, with unions in Ghana, Kenya, Turkey, Poland, Colombia, Portugal, Morocco, Tunisia, and the Philippines forming the alliance at launch. Unions from other countries, including Ireland and Germany, are expected to join in the near future.

The US is notably absent from that list, but that doesn’t mean US unions won’t be involved. Benjamin Parton, Head of UNI Global Union’s ICTS Sector, told The Verge that “not all unions who are supporting content moderator organizing were able to attend the event, but we work closely with our member unions in the United States, such as the CWA, to demand justice in the Big Tech supply chain.”

“Kenya has become a global hub for [content] moderation, and we welcome investors to Kenya to invest in this sector, but it must not be against the health of workers in this country,” said Benson Okwaro, the General Secretary of the Communication Workers Union of Kenya (COWU). “That is why we are organizing on the ground and alongside unions worldwide. Together we are sending a clear message to investors in this sector, including Meta, TikTok, Alphabet, and Amazon, that moderators everywhere will no longer stay silent while platforms make profit from their pain.”

Meta is notably being sued by former content moderators in Ghana and Kenya over psychological distress suffered in the contracted role. A group of former content moderators who flagged graphic and violent videos on TikTok have also filed a lawsuit against their former contractor, Telus Digital, claiming they were fired for trying to unionize and improve their working conditions.

“The content we see doesn’t just disappear at the end of a shift. It haunts our sleep and leaves permanent emotional scars,” Özlem, a former Telus worker, said in a statement to UNI Global Union. “When we raise it with our managers, they say these are the conditions TikTok, the client, requires. When we stand up for better conditions at our jobs, our coworkers get fired.”

We have reached out to Meta, TikTok, and Google for comment on the formation of the GTUACM.

“Companies like Facebook and TikTok can’t keep hiding behind outsourcing to duck responsibility for the harm they help create,” said Christy Hoffman, General Secretary of UNI Global Union. “This work can – and must – be safer and sustainable. That means living wages, long-term employment contracts, humane production standards, and a real voice for workers.”