[Opinion] Digital Sisyphus | Le Devoir


After announcing the dismissal of nearly half of its employees worldwide, Twitter, newly acquired by Elon Musk, has just let go more than 4,000 “content moderators”, the people responsible for cleaning up the Web every day.

They wade silently through the sewers of the Internet, working day and night to unclog its pipes of videos of every kind of atrocity and of hate speech. With a simple click, the Web’s “garbage collectors” sweep away the ravings of a certain humanity. An estimated 100,000 of them worldwide clean up the platforms of Facebook, Google, YouTube, Twitter and company.

Since the beginning of the year, some of these invisible hands of “content moderation” have been raising their fists in Dublin, the Irish capital that has become Europe’s Silicon Valley. Suffering from post-traumatic stress disorder (PTSD), around 30 of them are suing Facebook for failing to provide a healthy and safe work environment.

Singing karaoke can help them wash away all the filth of the Web, or so Mark Zuckerberg’s social network believes. But, as one of its employees put it, “honestly, you don’t always feel like singing after seeing someone in pieces”.

The testimony of 27-year-old Isabella Plunkett before a committee of the Oireachtas (the Irish Parliament) made headlines in the Dublin media on May 12, 2021. Two months later, more than 200 Irish moderators wrote a letter to Mark Zuckerberg, reminding the boss of the Californian giant that without their work (paid an average of twenty dollars an hour) his empire would collapse.

Content moderator horror stories are far from new. Journalistic investigations by Wired (2014) and The Verge (2019) had already lifted the veil on these shadow censors. What is new are the lawsuits.

The first took place in California in September 2018, a lawsuit filed on behalf of Selena Scola, a former moderator who claimed to have developed PTSD after watching violent images for nine months. Two years later, in May 2020, Facebook reached an out-of-court settlement with its 11,000 American moderators, agreeing to pay them $52 million in damages.

Without really acknowledging the lasting mental and psychological harm associated with the work, the social network agreed to provide support sessions with therapists and better tools to improve working conditions.

Facebook is a “money pump” for the Irish economy, with 3,000 full-time employees and as many outsourced workers. One in eight Irish people works for a multinational attracted by one of the lowest corporate tax rates in the European Union.

The Emerald Isle has also become an El Dorado for biotechnology and IT. No wonder it is on Celtic Tiger soil that Facebook finds itself in legal trouble. The Internet heavyweight may keep repeating that it cares about the well-being of its moderators, but that did not stop some thirty of them from going on the offensive and suing Facebook over PTSD and trauma of all kinds.

Two moderator profiles

A “click worker” receives two weeks of training before viewing several hundred messages and images a day, seven hours at a stretch, with reportedly less than ten seconds to decide whether or not to throw each item in the trash.

There are said to be two profiles of moderators: the fast ones, who make more errors, and the slow ones, who make fewer. Facebook reportedly relies on both.

They remove photos with a sexual connotation, videos of murder, suicide or self-harm, and hateful statements that do not comply with the “community standards” of Facebook, which is said to employ 15,000 “content moderators” worldwide.

Sometimes there are errors in judgement. The photo of Phan Thi Kim Phuc, the naked girl burned by napalm during the Vietnam War, was once deleted. The reason? “According to our guidelines, the genitals of minors are to be avoided,” Facebook said, before reversing, in 2016, its decision to censor the famous, Pulitzer-winning shot by Associated Press photojournalist Nick Ut.

There is also the risk of over-moderating and being seen as censors. Keep or throw away? That is the question… Stay cool and neutral. Always.

First major ethnographic study

In recent years, Facebook has reportedly spent half a billion dollars hiring moderators, always through outsourcing firms, to supplement the algorithm that polices its pages.

In the United States and Ireland, they are often university graduates. “You have to be very cultured to be a good moderator; many have studied literature, economics or history, sometimes at prestigious universities,” notes Sarah T. Roberts, author of Behind the Screen, the result of eight years of research on moderators and the first major ethnographic study of these “cleaners”.

The American researcher opens the black box of “commercial content moderation” by showing how platforms do everything possible to protect their brand image “at any time of day” against all “disturbing images”.

The digital giants do not like to talk about the moderators’ dirty work, which is tightly bound by productivity imperatives.

The lawsuit in California and the one under way in Dublin, which also involves Spanish and German moderators, bring to light what happens on the B side of the platforms: the invisible side, where moderators toil to keep the A side as clean as possible. A perfect portrait of Janus.

The trouble is that keeping a platform sanitized often comes with post-traumatic stress disorder. No wonder a few digital Sisyphuses are now seeking to lighten their daily burden in the Dublin courts.
