Content moderators – How much does it cost to clean up the Internet?

“The first video I saw was a live beheading,” Daniel Motaung said Tuesday at a news conference hosted by Super Facebook, an NGO that campaigns against the platform. “Imagine what it takes for a normal person to see this kind of content every day,” continued the young man, who says he suffers from post-traumatic stress disorder.

The South African worked for six months for Sama, a subcontractor of Meta (formerly Facebook) in charge of moderating social media content for countries in Eastern and Southern Africa. His job: removing content from the platform that is illegal because of its violence, constitutes harassment, or spreads false information. The man is not exactly anonymous: last February, his testimony made the cover of Time magazine.

Daniel Motaung and his lawyers say they represent 240 Sama content moderators at the company’s office in Nairobi, Kenya. They first protest the opacity of the subcontractor’s job postings: when hiring, the company mentions only “administrative tasks”. Candidates, often of modest background, are chosen for their native language so that they can review the wide variety of content posted on the platform. Meta and Sama “recruit moderators through fraudulent and deceptive methods, an abuse of power that takes advantage of the vulnerability of young, poor and desperate candidates,” the lawyers say.

Dignity. Their complaint describes “degrading” working conditions, irregular and insufficient pay, a lack of psychological support, performance pressure, and attacks on privacy and dignity. Sama did hire “wellness coaches” to support the moderators, the lawyers acknowledge, but they were poorly trained and, above all, not bound by any confidentiality commitment to employees. Daniel Motaung was fired after trying to form a union. “We were told: we are doing you a favor. (…) Take what they give you and shut up,” said the former worker.


A spokesman for Meta, contacted by AFP, said the company takes seriously its responsibility towards those who review its content. “We expect our partners to provide industry-leading salaries, benefits, and support. We encourage moderators to raise issues when they arise, and we regularly conduct independent audits to make sure our partners meet the high standards we expect of them,” he said, adding that Meta is currently reviewing the contracts linking the platform to its subcontractors.

Post-traumatic stress. This lawsuit by “Internet cleaners” against Facebook is not the first. In a decision upheld by the U.S. courts last September, the company was ordered to pay $52 million to 11,250 American content moderators. By May 31, 2022, each was to receive a minimum of $1,000, with additional compensation for victims of more serious disorders.

The case is named after a moderator, Selena Scola, who sued Facebook after developing symptoms of post-traumatic stress. She was hired in 2016, in the wake of the U.S. presidential election, when Facebook was criticized for letting illegal content proliferate. During her nine months working for the platform, the young woman regularly viewed videos of rapes and murders.

Last November, before lawmakers in several Western countries, Frances Haugen, the former Facebook whistleblower, stressed the crucial role of moderators who speak the language of the countries they cover, which is not always the case. The Digital Services Act (DSA), the text that will regulate platforms within the European Union from next year, echoes the American whistleblower’s concerns: it requires stricter moderation on all platforms, with sufficient staff. For some, the challenge will be daunting: last December, Twitter employed just 1,867 moderators worldwide.
