"True or False Junior" answers questions about recommendation algorithms

What is recommended to us on social networks and how does it work? Students ask us questions about recommendation algorithms and we answer them with Arthur Grimonpont, author of “Algocracy, living in the age of algorithms”.

Reading time: 6 min

Recommendation algorithms are programs that select what is most likely to hold our attention on social networks (MASKOT / DIGITAL VISION)

Is it true that algorithms choose what we watch based on what we like? Can algorithms create dependency on an application? Is it true that algorithms push people to watch fake news? Students from the André Derain middle school in Yvelines and from the Eyrieux middle school in Ardèche put their questions to us. To answer them, we turned to Arthur Grimonpont, an engineer, head of artificial intelligence issues at Reporters Without Borders (RSF) and author of "Algocracy, living in the age of algorithms", published by Actes Sud.

Algorithms don’t just show us what we like

Recommendation algorithms are programs that select, at a given moment, what is most likely to hold our attention among the billions of pieces of content on the Internet and on social networks. All social networks use them, with the aim of capturing as much of the attention of billions of users as possible and converting it into advertising revenue.

Camille asks us if it is true "that algorithms often choose posts based on what we like and what we watch".

It's "partially true", replies Arthur Grimonpont, "since indeed, what you like is likely to retain your attention, but there are also many things that you do not like and which can still retain your attention, and for an algorithm it makes no difference whether you like them or not, as long as they catch your attention." To illustrate this, Arthur Grimonpont takes an example: "when you are in a car and you pass an accident, your attention is almost irresistibly drawn to it, and you look at it; from this type of behavior, a recommendation algorithm would conclude that you love watching road accidents and would recommend a new one every kilometer." And that is partly why we find very shocking, violent things on social networks.

Finally, to determine what they will offer you, the algorithms analyze what you watch, the time you spend on a video, and whether you like, comment on or share certain content.
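To make the idea concrete, here is a minimal, purely illustrative sketch of how such signals could be combined into an "engagement score" used to rank a feed. The field names, weights and example posts are assumptions invented for the illustration and do not reflect any real platform's system.

```python
# Toy illustration only: rank posts by a crude "predicted engagement" score
# built from the kinds of signals mentioned above (watch time, likes,
# comments, shares). Weights and fields are invented, not any real algorithm.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    avg_watch_seconds: float  # how long similar users kept watching this post
    likes: int
    comments: int
    shares: int

def engagement_score(post: Post) -> float:
    # Arbitrary weights: watch time dominates, shares count more than likes.
    return (1.0 * post.avg_watch_seconds
            + 2.0 * post.likes
            + 4.0 * post.comments
            + 8.0 * post.shares)

def recommend(posts: list[Post], k: int = 2) -> list[Post]:
    # Keep the k posts most likely to hold the user's attention.
    return sorted(posts, key=engagement_score, reverse=True)[:k]

feed = [
    Post("Cute cat video", 45.0, 1200, 80, 30),
    Post("Shocking road accident", 90.0, 300, 400, 250),
    Post("School science project", 20.0, 150, 10, 5),
]
for post in recommend(feed):
    print(post.title)
```

Note that nothing in such a score asks whether the user actually likes the content: shocking posts that people stare at rank just as high, which is exactly the point made above.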

Most of the content we watch on social networks comes from algorithmic recommendations

Camille wonders if it’s true “that sometimes algorithms can show popular videos to encourage more views”.

"In the vast majority of cases, it is quite the opposite: videos become popular because the algorithms recommend them to people," explains Arthur Grimonpont, who gives a concrete example. "On YouTube, if you add up all the time humans spend on YouTube every day, 120,000 years of video are watched every day, and of that time, three quarters is the result of a recommendation from the algorithm." He points out that "this means that it is indeed the algorithm which chooses what each of us watches, and if we go to TikTok or Instagram, almost 100% of the content we watch has been selected by an algorithm."
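As a rough sanity check on that order of magnitude, and assuming the widely reported figure of about one billion hours of YouTube watched per day (a number that is not given in the article), the conversion to "years of video per day" looks like this:

```python
# Back-of-the-envelope conversion: total daily watch time -> "years of video
# watched per day". The one-billion-hours figure is an assumption taken from
# YouTube's own widely reported statistic, not from the article itself.
daily_watch_hours = 1_000_000_000
hours_per_year = 24 * 365
years_per_day = daily_watch_hours / hours_per_year
print(f"{years_per_day:,.0f} years of video watched every day")  # ~114,000
```

The result, roughly 114,000 years per day, is consistent with the ~120,000 figure quoted above.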

Yes, algorithms encourage overconsumption and therefore potentially dependence

Sacha wonders if it is true that an algorithm can lead to overconsumption, and Eythan wonders about algorithms and their propensity to create "an addiction to an application".

Arthur Grimonpont confirms that algorithms do encourage overconsumption, because there is a commercial goal behind them, with advertising revenue, and studies clearly show that excessive consumption of social networks can lead to addiction, caused in part by the algorithms. He explains that we must not forget these platforms' interest in keeping us as long as possible, even to the detriment of our health, as evidenced by this quote from Netflix boss Reed Hastings to his shareholders: "We are competing with sleep, and we are winning", which means, explains Arthur Grimonpont, "that even a basic physiological need, sleep, without which one is sure to be in poor health, is seen as an obstacle to the pursuit of these platforms' economic model."

False information made popular by recommendation algorithms

Axel wonders if it is true "that algorithms influence people to watch fake news".

Arthur Grimonpont explains that there are "plenty of studies which show that this is the case, not because malicious engineers have decided to widely recommend false information, but quite simply because what is false surprises us, and what surprises us catches our attention." The head of artificial intelligence issues at Reporters Without Borders cites "a study by MIT researchers which, in 2019, showed that on an exactly identical subject, a tweet containing false information spreads on average six times faster than a tweet not containing false information."

Recommendation algorithms use your personal data

Eythan asks if it is true "that algorithms know personal things about Internet users".

"Yes," replies Arthur Grimonpont, "because the business model of these platforms requires them to know you, and even when you have the impression of not giving them any personal information, they categorize you very finely according to your interests, through what we call statistical inference." In this way, they manage to attach thousands of labels to you, indicating your gender, your age, your political preferences, your sexual orientation and things much more precise than that. For example, Arthur Grimonpont explains that "researchers showed that Facebook had, among thousands of others, a label 'breastfeeding your child in public': Facebook knew that some women were breastfeeding their children in public and applied this label to them, among thousands and thousands of others."
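As a very rough sketch of what "statistical inference" means here, the toy example below guesses interest labels for a user purely from the topics of the content they engage with. The topics, labels and threshold are invented for illustration and bear no relation to Facebook's actual systems.

```python
# Toy illustration of inferring user labels from behavior alone.
# Topics, labels and the threshold are invented for the example; real
# platforms use far richer signals and models (this is not Facebook's system).
from collections import Counter

# Content the user watched, liked or shared, tagged by topic.
user_activity = [
    "parenting", "parenting", "baby care", "cooking",
    "parenting", "baby care", "local news",
]

def infer_labels(activity: list[str], min_share: float = 0.25) -> list[str]:
    # Label the user with every topic that makes up at least min_share of
    # their activity - a crude stand-in for statistical inference.
    counts = Counter(activity)
    total = len(activity)
    return [topic for topic, n in counts.items() if n / total >= min_share]

print(infer_labels(user_activity))  # e.g. ['parenting', 'baby care']
```

The point of the illustration is that the user never filled in a form saying "parent of a young child"; the label is deduced from behavior alone, which is why platforms can know "personal things" that were never explicitly shared.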
