[Column] The cybermercenaries of “fake news”

Disinformation is a fuel of war. Today, it is sustained by unprecedented capacities for spreading false and misleading information. The war in Ukraine and the pandemic have exacerbated the problems of misinformation and disinformation. The libertarian utopia that so shaped the imagination of the Internet has now been supplanted by a brutal observation: cyberspace is a battlefield, a virtual territory where extraordinary technical possibilities can be diverted into the service of greed and humanity’s other vices.

Fake news is big business. An international consortium of a hundred investigative journalists from 23 media outlets has shed light on some of the mechanisms of cyberwarfare. Two members of the consortium, the newspaper Le Monde and Radio France, report on the activities of “Team Jorge”, a secret group led by a man known by the pseudonym “Jorge”. Based in Israel, the group is made up of former military and intelligence operatives and other experts in social media, psychological warfare, military affairs and financial information. In short, they are cybermercenaries.

Team Jorge sells its services to those who can afford them, though the company reportedly refuses to work on US domestic politics, Russia and Israel. The firm has, however, taken part in campaigns to promote nuclear power in the State of California. It is said to have intervened in “support for the Senegalese president, Macky Sall, for his re-election in 2019” as well as in the “denigration of a Swiss whistleblower”. In all, the group is reported to have taken part in some thirty political campaigns.

The company has developed software to create fake profiles and deploy them on social networks. The profiles it produces are realistic enough to evade detection by the moderation services of the major platforms. These fake accounts exist on Twitter, Facebook and Instagram, and post comments under YouTube videos. The investigation uncovered other dubious operations exploiting network vulnerabilities.

Undermining trust

Disinformation and misinformation have consequences. A report by an expert panel of the Council of Canadian Academies (CCA), released on January 26, documents how misinformation can undermine trust in our institutions and distort political priorities, delaying action on crucial issues such as climate change. Misinformation about scientific or health matters deepens divisions within society, and its financial and human impacts are very real. Alex Himelfarb, who chaired the expert panel, explained that “the uncontrolled spread of erroneous scientific and health information makes individuals and society vulnerable to exploitation and threatens our ability to work together to address common challenges”.

Faced with such scourges, Normand Baillargeon is right to advocate the development of reflexes and good practices by individuals. But the problems of disinformation and misinformation are structural and also require collective action. Alas, to date, states have done very little to prevent the exploitation of network vulnerabilities by those bent on destabilizing democratic processes to serve their interests.

A report by the Public Policy Forum published in 2018 identified a series of policy avenues for dealing with the rapid emergence of digital threats against democratic institutions and social cohesion. To restore trust and the integrity of the information dissemination process, the report calls for requiring the major platforms to implement measures ensuring advertising transparency. Likewise, algorithms should be subject to regular audits by independent authorities, and the results of those audits should be accessible to the public. The report also calls for international rules to ensure that digital companies apply fundamental rights principles to the management of their sites and the treatment of fraudulent content.

Several disinformation operations are carried out using automated accounts known as bots. States must work together to establish effective, publicly accountable mechanisms for identifying the technical processes used to support fraudulent operations.

Protecting people from disinformation and misinformation requires keeping information spaces open and safe. In this spirit, it is important to support non-governmental actors capable of counterbalancing the dominant actors. Similarly, we must strengthen public service media, such as Radio-Canada and Télé-Québec, without forgetting the community media which, for half a century, have been providing essential local information services here.
