Bill C-63, tabled last week by federal Justice Minister Arif Virani, proposes to give Canada an Online Harms Act. The legislation would impose targeted obligations on social networks in order to combat some of the worst scourges plaguing the Internet.
Seven categories of content are targeted: content that sexually victimizes a child or re-victimizes a survivor; content that incites violent extremism or terrorism; content that incites violence; content that foments hatred; content used to bully a child; intimate content communicated without consent (including deepfakes); and finally, content that induces a child to harm themselves. Not just any malicious remark is targeted.
For example, the content fomenting hatred that the bill prohibits is content that “expresses detestation or vilification of an individual or group of individuals and that, given the context in which it is communicated, is likely to foment detestation or vilification of an individual or group of individuals” on the basis of a prohibited ground of discrimination: race, national or ethnic origin, colour, religion, age, sex, sexual orientation, gender identity or expression, marital status, family status, genetic characteristics, disability, or a conviction for an offence for which a pardon has been granted or a record suspension ordered.
To dispel any ambiguity, the bill specifies that content “does not express detestation or vilification solely because it expresses disdain or dislike or it discredits, humiliates, hurts or offends”.
This very restrictive definition of the hate speech prohibited by law should satisfy the requirements of compatibility with freedom of expression recognized by the courts. The Canadian Human Rights Commission, empowered to examine complaints on the subject, will not have the discretion to designate as hateful any comment merely perceived as offensive.
Likewise, to reduce confusion on these issues, it would be healthy to adjust the current vocabulary, which too often designates as hate speech hurtful or offensive comments that do not meet the threshold required by law to qualify as hateful. This is an imperative of linguistic precision.
Obligations for social media
The bill imposes obligations on major social media platforms. They will be required to act responsibly and to implement measures that mitigate the risk of users being exposed to harmful content. Platforms will have to submit digital safety plans and give users the means to report harmful content and to block malicious users.
Social networks will also have to protect children by integrating age-appropriate, safer design features into their platforms. Likewise, the bill would impose on social networks an obligation to remove content that sexually victimizes a child or re-victimizes a survivor, as well as intimate content communicated without consent.
Bill C-63 sets up specialized bodies to ensure that the operation of social networks meets the requirements of the law. It would establish the Digital Safety Commission of Canada to administer and enforce the Act, and it creates the position of Digital Safety Ombudsperson. The ombudsperson's mandate will be to support social media users and to advocate for the public interest in online safety.
The main deficiency of Bill C-63 is that it leaves social networks a great deal of leeway in judging whether reported content actually violates the law. In a democracy, deciding what is legal and what is illegal falls to judges, not to commercial companies. It is therefore deplorable that the bill fails to provide for cyber judges able to resolve the disagreements that are sure to arise over whether content reported by users is actually hateful or otherwise contrary to the law.
It is to be hoped that, as a next step, the federal legislator and the provinces will put in place the mechanisms needed for the courts to intervene quickly in the conflicts that arise online. We do not ensure a credible democratic separation between legal and illegal content by relying on the goodwill of commercial companies.
But overall, Bill C-63 is a step in the right direction. It follows the trend of legislation in other democratic states aimed at curbing certain scourges plaguing the Internet. We can no longer simply deplore abusive behaviour online. We must move beyond wishful thinking and naive calls for “education” and for vigilance on the part of victims.