Disinformation: the European Union puts pressure on Facebook and Instagram

The European Commission on Tuesday opened an investigation into the social networks Facebook and Instagram, suspected of failing to meet their obligations in the fight against disinformation, five weeks before the European elections.

Leaders have recently expressed concern about possible manipulation of public opinion by Russia.

“This Commission has put in place tools to protect European citizens from disinformation and manipulation by third countries,” underlined President Ursula von der Leyen. “If we suspect a violation of the rules, we take action. This is always true, but particularly during elections.”

Brussels has listed four main grievances. The first concerns “insufficient” moderation of advertisements by Meta. The Commission points to the dissemination of a large number of advertisements “which present a risk for the electoral processes”, citing “advertising campaigns linked to manipulation of information from abroad”.

Meta has not directly commented on the proceedings. “We have a well-established process for identifying and mitigating risks on our platforms. We look forward to continuing our cooperation with the European Commission and providing them with further details on this work,” responded a spokesperson.

The Brussels executive also criticizes Meta for reducing the visibility of political content in the recommendation systems of Instagram and Facebook, a practice contrary to the transparency obligations of the DSA.

The Commission also suspects that the mechanism Meta has put in place to allow users to report illegal content does not comply with the regulation, allegedly being too difficult to access and use.

Finally, Brussels criticizes Meta for its plan to remove a tool considered essential for identifying and analyzing disinformation on Facebook and Instagram, without an adequate replacement solution.

Possible sanctions

Meta announced in early April that its “CrowdTangle” tool would no longer be available after August 14, to the great dismay of the many researchers and journalists who use it to monitor, in real time, the spread of conspiracy theories, incitements to violence and manipulation campaigns conducted from abroad.

“In the run-up to the European elections, which will take place from June 6 to 9, 2024, and a series of other elections which will take place in the Member States”, this removal could reduce “the capacity to monitor false information”, the Commission worries.

It asks Meta to inform it, within five days, of “the corrective measures” taken to ensure public scrutiny of the content disseminated, under penalty of possible sanctions.

This is the fifth formal investigation launched by Brussels under the new Digital Services Act (DSA), which came into force last year to combat illegal content online.

The Commission has already opened two investigations targeting TikTok, one of which last week pushed the subsidiary of China's ByteDance to suspend a controversial feature of its new TikTok Lite application that rewards users for screen time. The feature is suspected of creating a risk of addiction among adolescents.

A procedure was launched at the beginning of March against the Chinese e-commerce giant AliExpress, a subsidiary of Alibaba, suspected of not doing enough to combat the sale of dangerous products such as fake medicines.

The first formal investigation already focused on risks linked to disinformation. It was initiated on December 18 against the social network X (formerly Twitter) for alleged failures in content moderation and transparency.

The DSA has applied since the end of August to the most powerful online platforms, such as X and TikTok, as well as the main services of Meta (Facebook, Instagram), Apple, Google, Microsoft and Amazon.

In total, 23 very large Internet players, including three pornographic sites (Pornhub, Stripchat and XVideos), were placed under the direct supervision of the European Commission, which has recruited more than a hundred experts in Brussels to take on its new role as digital policeman.

Violators face fines of up to 6% of their global annual turnover, or even a ban on operating in Europe in the event of serious and repeated violations.

AFP participates, in more than 26 languages, in a fact-checking program developed by Facebook, which pays more than 80 media outlets around the world to use their fact-checks on its platform, on WhatsApp and on Instagram.