Social networks are "virtual pushers," says the Premier of Quebec, François Legault. They are configured to maximize the time users spend online. In many circles, there are calls for measures to combat the harmful effects of the poorly regulated operation of these virtual spaces. Many of the pitfalls observed in the use of social networks stem from the processes that online platforms deploy to capture and monetize users' attention. That is what governments must act on.
In its 2024 report entitled Disruptions on the Horizon, the Centre of Excellence in Foresight, Policy Horizons Canada, identifies the growing difficulty people have in telling what is true from what is false as one of the most significant problems of our time. In France, a report submitted to President Emmanuel Macron documents the harmful effects of excessive screen time, particularly among young people. In Quebec, members of the Coalition Avenir Québec advocate limiting access to social networks for people under 16.
But we live in a world where the ability to interact in a connected environment is essential. Prohibiting individuals from using these networks is unrealistic and an attack on freedoms. Above all, such measures do not target the source of the problem. To respond to the very real concerns raised by the use of social networks, we must instead act on the processes these platforms deploy to maximize user engagement.
Attention capture
The functioning of social networks rests on the capture and monetization of the masses of data that individuals generate through the many connected devices now part of their daily lives. Once captured, this data is compiled to measure what holds individuals' attention. Powerful data-processing systems are then used to maximize online engagement and thus make that attention profitable in various ways, which can now be quantified and analyzed.
Increasingly, we are seeing the harmful effects of commercial strategies aimed at maximizing young people's engagement in social media spaces. Observers highlight the risks of manipulation and the abuses that can result from the poorly supervised use of technologies such as artificial intelligence. The potential of these technologies to erase the ability to distinguish what is true from what is artificial threatens the essential balances of democratic societies.
Laws are needed to prevent such massive data-processing technologies from being misused to mislead or manipulate people. Rules are needed to protect the integrity of the attention of those who use social networks. Faced with the spread of platforms and the multiplication of targeting techniques capable of short-circuiting individuals' attentional defenses, the legal framework protecting freedom of expression, and above all the integrity of attention, must be brought up to date.
The protection of the integrity of attention has long underpinned a vast set of rules, such as those that guarantee the integrity of messages and the integrity and fairness of electoral, political and commercial advertising.
Social media can be dangerous for children and adults alike. Rather than leaving parents and teachers responsible for policing its use, we must act on the configurations that are at the source of the abuses suffered by users.
One path to follow is the one traced by the European Commission. It has asked very large platforms and search engines with more than 45 million active users in the European Union, such as TikTok, Facebook and Google, to put safeguards in place ahead of the European elections, which will take place in a few days. These measures target in particular content generated by artificial intelligence, such as deepfakes, as well as disinformation. The Commission asks very large platforms to monitor their configurations to reduce the risk of spreading electoral disinformation.
These are the types of measures that other states should move quickly to implement to ensure that the integrity of the attention of individuals operating in online spaces is protected.
To effectively protect young and old from the harmful consequences of social networks, it is necessary to regulate the algorithmic processes that are at the source of the abuses we fear. Rather than exhausting ourselves prohibiting connected devices and the use of social networks, we must act on the mechanisms by which online platforms transform the attention individuals pay to what they see or hear into dollars.