Online abuse, intimate images and the toxic internet

Twenty-five years ago, a French court ruled in favour of Estelle Hallyday. Intimate photos of the model had been posted online by an unknown person. The judges found that this unauthorized distribution was wrongful and held liable the company that hosted the files, which had been uploaded by an untraceable user.

The case triggered an outcry, because the decision held responsible a company that had merely provided a digital space in which an Internet user could place files. The analogy of the hotelier was invoked: a hotelier cannot be held responsible for what happens in the rooms he rents.

The Hallyday affair had major repercussions on the legal framework of these digital spaces, which have since become megaplatforms hosting images, videos and texts from any user. Twenty-five years ago, many people worried that companies hosting files posted online by users would be held responsible in the same way as newspapers and radio stations that broadcast illegal content. It was argued that these intermediaries, which merely receive users' publications, should not be subject to the liability regime that applies to the media: such a status would have pushed them to censor in order to protect themselves from possible prosecution.

To respond to these fears and to "promote innovation", laws limiting the liability of intermediaries were put in place. Under European legislation adopted in the 1990s, States were prohibited from holding intermediary platforms responsible unless it could be demonstrated that they were aware of the illegal nature of the images, texts or videos posted online by a user.

Quebec did the same in 2001 by adopting an innovative law on the legal framework for information technologies. Article 22 of that law provides that an intermediary such as a social network or a search engine can be held responsible only if it is aware of the illegal nature of the comment, image or video found on its site.

In the United States, Congress went even further. In 1996, it passed a law that the courts interpreted as granting broad immunity to intermediaries. Even today, these sites, which have since become mega social media platforms, cannot be held responsible for images and comments once they have been posted online by a third party.

Online abuse

The harmful consequences of online abuse have been documented for several years. Regarding the non-consensual distribution of intimate images, several jurisdictions have adopted laws that, like section 28.1 of Quebec's Act on the protection of personal information in the private sector, require platforms to remove intimate images published without the consent of the people depicted.

But on many other questions, the laws governing the activities of platforms remain inadequate. A quarter of a century has shown that the most crucial issue raised by intermediary platforms is not the risk that they will turn into censors of the images and comments users post. The societal risk they create stems instead from the absence of transparency obligations regarding the algorithms and other technical processes they use, among other things, to maximize screen time.

This is what motivated some forty American states to launch a major lawsuit against Facebook, Instagram and TikTok, demanding compensation for the public expenditures they are now forced to make because of the toxic effects of social media use, particularly among young people. Ontario school boards and the government of British Columbia have taken steps in the same direction.

These lawsuits highlight the perverse effects of the business model of intermediary platforms, which profit from maximizing the time users spend online. They monetize the data users produce without being bound by transparency obligations. The delay in recognizing the societal consequences of this business model explains why States have been so slow to put laws in place imposing conditions on online platforms. While governments dither, the platforms continue to reap handsome profits by deploying algorithms about which we know nothing, other than that they optimize those profits.

The limits of the status quo

A quarter of a century after the legal saga of Estelle Hallyday, the limits of the laws that favour Internet platforms are clear. For better or worse, these laws enabled the rise of the sharing economy and of platforms like Uber and Airbnb. The same model also enabled the rise of Pornhub, a platform that originally functioned as a YouTube of eroticism and pornography!

Faced with the power of the processes these companies deploy, merely advocating "education" and parental responsibility is dangerously naive. It delays the adoption of laws that would force the business models of online platforms to adapt so as to reduce the risks to which people living in connected environments are exposed.

By maintaining the belief that all this is merely a matter of education and parental supervision, we are playing into the hands of the Web giants, who are doing everything they can to block laws that would impose transparency and accountability obligations on them.
