Facebook reportedly froze in the face of vaccine misinformation

In March, as claims that coronavirus vaccines were dangerous or ineffective swept across social media and undermined the vaccination campaign, Facebook employees believed they had found a solution.

By subtly changing how posts about vaccines were ranked in users’ news feeds, company researchers realized they could stem disinformation about COVID-19 vaccines and instead offer users content from credible sources, such as the World Health Organization.

“Given these results, I imagine we’ll move forward as soon as possible,” a Facebook employee wrote in March, reacting to an internal memo about the study.

Instead, Facebook chose to shelve some of the study’s suggestions. Other changes were not made until April.

When another Facebook researcher suggested in March that comments on vaccination posts be turned off until the company was better equipped to counter the anti-vaccine information lurking there, the proposal was ignored.

“These people are selling fear and anger”

Critics accuse Facebook of delaying action for fear of hurting its profits.

“Why don’t we remove the comments? Because engagement is all that matters,” said Imran Ahmed, head of the Center for Countering Digital Hate, a web monitoring group. “It generates attention, and attention means eyes, and eyes mean advertising revenue.”

In a statement sent by email, Facebook countered that it has made “considerable progress” this year in fighting vaccine disinformation in users’ feeds.

These internal Facebook discussions were brought to light largely thanks to whistleblower Frances Haugen, a former product manager at the social network.

The “Facebook Papers” show that in the midst of the COVID-19 pandemic, Facebook carefully studied how its platforms were used to spread misinformation about vaccines. They also reveal that employees regularly suggested ways to combat that disinformation on the site, without success.

This inaction raises questions about whether Facebook prioritized controversy and division over the health of its users.

“These people are selling fear and anger,” said Roger McNamee, an early Facebook investor who is now one of its harshest critics. “This is not an isolated case. It’s a business model.”

Facebook typically ranks posts based on the reactions they generate, the number of “likes” or comments they elicit, or the number of times they are shared. This system works well for harmless topics like recipes or videos of dancing dogs, but internal Facebook documents show that for thorny topics like vaccination, this strategy only stirs up dissent, polarization and doubt.

A conclusive test

To study how to combat disinformation, Facebook researchers altered the ranking of posts for more than 6,000 users in the United States, Mexico, Brazil and the Philippines. Instead of seeing posts ranked by the reactions they generated, these users were shown posts chosen for their reliability.

The results were striking: a nearly 12% drop in content containing false claims and an 8% increase in content from authoritative sources such as the WHO and the US Centers for Disease Control and Prevention.

Company employees were euphoric, according to internal documents.

“Is there a reason we wouldn’t do this?” asked an employee.

Facebook said it implemented many of the study’s findings – but only a month later, a delay that came at a crucial moment in the vaccination campaign.

A spokesperson said in a statement that the internal documents “do not represent the considerable progress we have made since that time in providing reliable information about COVID-19 and in neutralizing misinformation about COVID-19 and vaccination”.

The company added that it took time to study and roll out the changes.

The urgency to act was, however, obvious. By this time, the United States was offering the vaccine to the most vulnerable, only 10% of the population had received a first dose, and a third of Americans were considering not being vaccinated at all.

A poorly understood problem

Even so, Facebook employees admitted that they had not “realized” how virulent anti-vaccine messaging was on their site. Research conducted by the company in February found that up to 60% of the comments on vaccination posts were anti-vaccine.

Worse yet, employees admitted they did not know how to find or remove these comments.

“Our ability to detect (vaccine hesitancy) in comments is poor in English, and essentially non-existent elsewhere,” reads another internal document dated March 2.

Employees proposed disabling comments on all vaccination posts until the company could find a solution. The suggestion was deemed “very interesting” but was never implemented.

Instead, Facebook CEO Mark Zuckerberg announced on March 15 that the company would begin labeling posts about vaccines that described them as safe.

This allowed Facebook to continue to generate reactions – and therefore profits – from anti-vaccination comments, said Mr. Ahmed of the Center for Countering Digital Hate.

“People have received disinformation and died because of decisions made by Facebook,” he said. “There should be a murder investigation.”
