Trapped by the algorithm: concerns over social media bias

Do you feel like you always see the same content on social networks? That is no accident: it is pushed to you by algorithms, the code that decides what each platform shows a given user, at the risk of locking that user inside a bubble.

The Assises du journalisme de Tours devoted a conference on Thursday to these “information bubbles”, the “set menus” imposed on us by the algorithms of YouTube, Facebook or Twitter.

It is a major challenge for the future, since young people increasingly get their news from social networks. According to the Kantar – La Croix barometer published in January, these platforms are the second-leading source of news for French people aged 18 to 24, behind television news.

Worldwide, while Facebook remains the most widely used social network for getting news, young people are turning massively to image-based apps such as TikTok and Instagram, according to the Reuters Institute’s 2022 report.

All of these networks operate according to their own algorithm, a sort of computer recipe that defines what each user sees based on their profile and browsing history.

“Each time we like, retweet or comment on a piece of content, we signal to the machine: ‘that’s interesting, that makes me react, that keeps me connected’,” explained Mathilde Saliou, a journalist at the specialist outlet Next INpact.

From then on, it is that type of content that the platform is more likely to serve up to the user.

That is “how social networking platforms are built,” Mathilde Saliou continued. Their “most common business model” is “to make money by showing advertising”, which requires that the user “stay connected for as long as possible”.
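As a rough illustration of the feedback loop Saliou describes, here is a minimal toy sketch of engagement-based ranking, in Python. It is not any platform’s actual code: every name, signal and weight below is invented for illustration. The point is simply that each interaction raises the weight of a topic in the user’s profile, and items on highly weighted topics then rank first.

```python
from collections import defaultdict

# Hypothetical engagement weights -- invented for illustration,
# not taken from any real platform.
SIGNAL_WEIGHTS = {"like": 1.0, "comment": 2.0, "share": 3.0}

class ToyFeedRanker:
    """Toy model of engagement-based ranking: interactions on a topic
    make future items on that topic rank higher."""

    def __init__(self):
        self.topic_affinity = defaultdict(float)

    def record_interaction(self, topic, signal):
        # "Each time we like, retweet or comment... we signal to the machine."
        self.topic_affinity[topic] += SIGNAL_WEIGHTS.get(signal, 0.0)

    def rank_feed(self, items):
        # items: list of (item_id, topic). Highest-affinity topics come
        # first, so the feed drifts toward what the user reacted to before.
        return sorted(items, key=lambda it: self.topic_affinity[it[1]], reverse=True)

ranker = ToyFeedRanker()
ranker.record_interaction("flat-earth", "share")    # one strong reaction...
ranker.record_interaction("flat-earth", "comment")
feed = ranker.rank_feed([("a", "gardening"), ("b", "flat-earth"), ("c", "news")])
print(feed)  # ...and similar content now tops the feed: [('b', 'flat-earth'), ...]
```

Real recommendation systems are vastly more complex, typically machine-learned models over thousands of signals, but the loop is the same: reactions feed the profile, and the profile shapes the feed.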

Enough to lock the user inside a “filter bubble”, a concept coined by the American Eli Pariser in 2011?

Whether these bubbles really exist “is not a matter of consensus”, Mathilde Saliou cautioned. In her view, it is often an “effect you can get out of” by deliberately seeking out other content.

On the other hand, “there are concerns about certain specific effects” that can draw the user into “spirals of radicalization”.

Start watching conspiracy videos on YouTube claiming, for example, that the Earth is flat, and you risk being bombarded with similar videos.

On a lighter note, Xavier Eutrope, a journalist with INA’s media review, recounted the case of a friend who found herself swamped with videos of “Turkish masons” on TikTok, without understanding why.

A further cause for concern: the content most likely to keep users on a platform is often the most divisive and controversial.

“Emotions sell and create engagement, from simple clicks to likes, comments and shares”, “especially” when they are “negative”, noted Cyrille Frank of the digital consulting agency CosaVostra.

These “algorithms of negativity” are a “danger”, particularly for young people, who “get their news on social networks”, warned David Medioni of the Fondation Jean-Jaurès during another debate at the Assises.

With the academic Guénaëlle Gault, he has just published an essay entitled “Quand l’info épuise”, devoted to the “information fatigue” that many French people say they suffer from when faced with a flood of news.

Finally, all observers point to the opacity of these algorithms, whose workings are secrets jealously guarded by the platforms.

The DSA, the new European regulation on digital services due to take effect soon, provides that states will have access to the algorithms of the major platforms. But it remains to be seen how far the platforms will agree to comply.

A few weeks ago, Twitter’s new owner, the controversial billionaire Elon Musk, promised to make the network’s algorithm public. So far, that promise has remained a dead letter.

For Cyrille Frank, it is desirable that citizens understand “what sauce they are going to be eaten with” on social networks, as the French idiom puts it. After that, “everyone makes their own choices”.

