Future Online Harms Bill | Disinformation should be included, expert group says

(Ottawa) Misinformation, including “deepfake” videos and software that spreads falsehoods, should be covered by a future online harms bill, according to a panel of experts appointed by the Minister of Canadian Heritage, Pablo Rodriguez, to help shape the legislation.

Posted at 8:26

Mary Woolf
The Canadian Press

“Deepfakes” are fake videos or photos produced with deep learning technology, which can create highly realistic counterfeit images.

Members of the expert panel, including Bernie Farber of the Canadian Anti-Hate Network and Lianna McDonald of the Canadian Centre for Child Protection, said the legislation should impose an obligation on tech giants to combat the spread of fake news and fake videos.

Some have suggested that Canada should take inspiration from the European Union (EU) Digital Services Act, which allows for stronger action to counter disinformation in times of crisis – for example during elections, international conflicts and public health emergencies.

They said the EU measure was prompted by Russia’s attempts to spread false claims to justify its invasion of Ukraine.

Public Safety Minister Marco Mendicino said in an interview that the technology is now so sophisticated that some doctored content and images are “virtually indistinguishable” from the real thing, making it very difficult for people to tell the difference.

He said a “whole-of-government approach” spanning multiple departments was needed to combat the spread of misinformation in Canada.

“We are at a crucial moment in our public discourse. We are seeing an increasing amount of misinformation and disinformation based on extremist ideology,” he said.

An academic analysis of more than six million tweets, their shares and their origins found that Russia is targeting Canada in an effort to influence public opinion here.

Research conducted this month by the University of Calgary’s School of Public Policy found that a large number of tweets and retweets about the war in Ukraine can be tied to Russia and China, and that many of the tweets expressing pro-Russian sentiment are linked to the United States.

Ministers have announced their intention to introduce an online harms bill that would tackle online abuse – including racial slurs, anti-Semitism and offensive statements aimed at members of the LGBTQ community. It follows an earlier online hate bill, introduced just before last year’s federal election, which did not become law.

The expert panel, which also includes law and politics professors from across the country, said a bill should tackle not only online abuse, including child abuse, but also false and misleading information online.

This could include coordinated disinformation campaigns “used to create, spread and amplify disinformation,” including through bots – software agents that perform repetitive tasks – and their networks, as well as inauthentic accounts and “deepfakes”.

Some experts on the panel said the bill should also tackle false advertising, misleading political communications and content that contributes to “unrealistic body image”.

The group said platforms would have a “duty to act” to tackle “harmful content online, including misinformation, by conducting risk assessments of content that may cause significant physical or psychological harm to individuals”.

Some experts on the panel cautioned that counter-disinformation measures need to be crafted carefully so they cannot be misused by governments to justify censorship of journalism or criticism.

Their warning was echoed by Emmett Macfarlane, a constitutional law expert at the University of Waterloo.

“There are always valid concerns about the potential for overbreadth and unintended consequences arising from this type of law. Our existing criminal laws on hate speech and obscenity have resulted in material being unjustly restricted or blocked at the border, for example,” he said.

According to the 12 experts on the panel, who have just completed their work, misinformation and false messages could pose greater risks to children.

They recommended that the bill place strict requirements on social media companies and other platforms to remove content that depicts or promotes child abuse and exploitation.

A few of the members criticized the platforms for not removing this content immediately, saying that “the current performance of online services in removing child sexual exploitation material is unacceptable”.

The group criticized platforms in general for reporting the percentage of harmful content they remove, but not how long it took to remove it.

Minister Rodriguez thanked the panel for completing its discussions last week, saying “their advice is essential in developing a legislative and regulatory framework to address this complex issue and help create a safe online space that protects all Canadians”.

“Free speech is at the heart of everything we do, and Canadians should be able to express themselves freely and openly without fear of harm online. Our government is committed to taking the time to get it right,” he promised.

The Minister also thanked the Citizens’ Assembly, a group of 45 Canadians studying the impact of digital technology on democracy, for its guidance. At a conference last week, the group stressed the importance of tackling the spread of misinformation online, saying it can manipulate public opinion.
