What levers do the authorities have to fight the dissemination of violent videos on social networks?

The government has pointed to the role of social networks in the dissemination of violent videos after the death of young Nahel in Nanterre.

The Minister of Justice, Éric Dupond-Moretti, said on Friday, June 30, that he wanted to “blow up” the accounts of social network users who disseminate violent publications. Singled out for their role in the riots that followed the death of young Nahel, the platforms were asked by the government to limit the spread of violent images. The Keeper of the Seals even asked prosecutors for a “firm criminal response” against the perpetrators.


As of Tuesday, July 4, no sanctions had yet been handed down, but the Pharos platform, which reports to the Ministry of the Interior, had requested the removal of 210 publications since Nahel’s death. Four young people aged 18 to 21 were also arrested in Charente-Maritime over the weekend after broadcasting calls for violence on the social network Snapchat; all of them live in small rural communities, which allowed the police to trace them quickly. Three of them were already known to the police, according to the Saintes prosecutor. In this particular case, the gendarmes did not need to work with Snapchat, and the four young people’s accounts were not suspended.

The platforms’ rather vague answers

When the perpetrators are not so easily identifiable, prosecutors can request information from the social networks to identify them. Platforms must respond within a “reasonable time”, under penalty of a fine of nearly 4,000 euros. At this stage, it is not known whether requisitions of this kind have already been issued.

For their part, the platforms give only very general answers, without providing any recent figures. Snapchat, for example, says it has strengthened its monitoring system since Nahel’s death, adding that it already had “zero tolerance” for content that incites hatred or violence. TikTok specifies that 40,000 “safety professionals” already track hateful videos on a daily basis.

The networks have the status of hosts, not content publishers, so they are not considered responsible for these illicit photos, texts or videos and have no legal obligation to monitor them. They can be held liable only if they are notified of the presence of such content. But some videos, as on Snapchat, last only 24 hours and disappear before the platform can even delete them.

