Online Harms Bill: the good, the balanced and the alarming

This preliminary response was originally published on March 29, 2024 in PEN Canada’s monthly newsletter, as part of Grace Wescott’s President’s Message. Subscribe to get monthly updates sent to your inbox.


The long-awaited online harms bill was finally introduced in Parliament on February 26 as Bill C-63. We can anticipate a call for public submissions after the bill receives second reading and is referred to a standing committee for review. PEN Canada intends to participate in that process. We are currently reviewing the bill internally, from the perspective of its implications for freedom of expression. The comments below are preliminary and strictly my own. That said, certain matters concerning the five parts of the bill seem pretty clear:

Part 1 – the Online Harms Act: This proposed legislation looks pretty good, with some qualifications, in particular regarding the extensive powers of the Digital Safety Commission.

Part 2 – amendments to the Criminal Code concerning hate speech are problematic, not to say alarming. These amendments increase the maximum penalty for incitement to hatred from two years to five, and for incitement to genocide, to life imprisonment. They create a new, free-standing crime where a federal offence is motivated by hate; at present, a finding of hate motivation is simply treated as an aggravating factor at sentencing. Perhaps most concerning, the Bill introduces a recognizance or ‘peace bond’ provision that permits restrictions to be imposed on a person where there is a fear on reasonable grounds that that person will commit a hate crime in future. These amendments are seriously problematic, and in my view this part is a candidate for removal from the bill.

Part 3 – amendments to the Canadian Human Rights Act to reintroduce the communication of hate speech as a discriminatory practice. This is also seriously problematic and a candidate for removal.

Parts 4 and 5 – Part 4, which amends An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service, appears uncontroversial on first analysis. Part 5 is a list of coordinating amendments dependent on passage of the prior parts of the bill.

That’s pretty brief. Let me elaborate on parts 1 through 3.

Part 1 – the Online Harms Act

The first part of the bill is encouraging. It lays out the new Online Harms Act, whose express purpose is to “promote the online safety of persons in Canada”, to mitigate the risk that they will be exposed to harmful content, and to reduce harms caused by such content so as to enable them to participate fully in public discourse, while respecting their freedom of expression. A particularly strong focus is protecting the physical and mental health of children, and providing remedies for the non-consensual communication of intimate images. The proposed Act is intended to contribute to the development of standards of online safety, and to ensure online operators are transparent and accountable for their duties under the Act.

So far, so good.

The Online Harms Act is tightly focused on seven kinds of defined harms. It places responsibilities and duties on “regulated operators” — that is, social media services of a size above a given threshold — to mitigate harm to their users. This is to be overseen by a digital safety governance structure consisting of a Commission, an Ombudsperson, and an Office to support the first two.  

The seven kinds of harmful content addressed by the Bill are:

    • Intimate content communicated without consent (including deepfakes)
    • Content that sexually victimizes a child or revictimizes a survivor
    • Content that induces a child to harm themselves
    • Content used to bully a child
    • Content that foments hatred
    • Content that incites violence
    • Content that incites violent extremism or terrorism

You will note that many kinds of deeply unpleasant online content – misinformation, disinformation, harassment, targeted algorithms – are not the subject of the bill. By way of contrast, the EU Digital Services Act, considered by many to be a model, applies to a broader range of content. Under Bill C-63, many disturbing and widespread problems that have a deleterious effect on freedom of expression online remain unaddressed. It must be said, however, that how to address those concerns without unduly impinging on freedom of expression is a conundrum with no easy solution.

Bill C-63’s proposed Online Harms Act imposes three broad duties on regulated operators: 

    1. a duty to act responsibly, taking “measures that are adequate to mitigate the risk that users of the service will be exposed to harmful content on the service.” Such measures would include a required digital safety plan, tools and processes enabling users to easily flag harmful content and to block other users from communicating with them, a mandated content reporting system, the opportunity to speak with a human representative, and an obligation to attempt to identify when a bot is generating harmful content; 
    2. a duty to make inaccessible (take down) content that sexually victimizes a child or revictimizes a survivor, or intimate content communicated without consent, within 24 hours of a user report of such content, subject to appeals by both those posting the content and those reporting it; and
    3. a duty to protect children by building age-appropriate design features into their services, and implementing industry-standard and regulator-specified features, as provided for by regulation.

These duties do not apply to private messaging. And nothing in the Act requires a regulated operator to proactively search content on its service in order to identify harmful content. On balance, that’s good.

The proposed Act sets out principles and duties but, concerningly, leaves their definition and regulation in the hands of a newly created three- to five-person Digital Safety Commission, which is given correspondingly broad powers. As if to recognize the extensive scope of those powers, the Act provides that the Commission, “when making regulations and issuing guidelines, codes of conduct and other documents, must take into account freedom of expression, equality rights and privacy rights”, among other matters. Much that might have been defined in the Act itself is left to the forthcoming Commission to define. And the Act does not provide for oversight of the Commission itself.

Hopefully, these concerns and several others can be addressed by the appointed parliamentary standing committee at the public consultation stage after second reading. Our concerns about the hate speech amendments to the Criminal Code in Part 2 of the Bill and to the Canadian Human Rights Act in Part 3 are another matter.

Part 2 – Criminal Code Amendments

The second part of Bill C-63 amends the Criminal Code by increasing the maximum penalty for the existing crime of incitement of hatred from two years to five years, and for advocating or promoting genocide from five years to a whopping life imprisonment. A penalty of life imprisonment for what is effectively a speech crime is alarming and, in my view, far too extreme, however nasty the speech.

The Bill also updates the definition of “hatred” consistent with the 2013 Whatcott decision of the Supreme Court of Canada, a definition intended to capture the extreme nature of the term. This is fair enough, though despite the honest attempt in the language it seems impossible to remove the element of subjectivity in applying the definition.

More serious yet, this part of the Bill creates a new, stand-alone hate crime, also punishable by life in prison, where an offence under any federal law is found to have been motivated by hatred based on a prohibited ground of discrimination. This hate crime would piggyback on an underlying offence that could itself be relatively inconsequential and very possibly carry a much lighter penalty. Currently, where hate motivation is established in a criminal case, it can be taken into account as an aggravating factor in sentencing. It is not clear what is gained by creating this additional, independent offence on top of the first.

Most egregious from a free speech perspective is the second new measure, a recognizance or ‘peace bond’ provision, whereby an individual, with the consent of the Attorney General, may lay an information against another person if the individual fears on reasonable grounds that that person will commit a hate crime in future. This is a form of prior restraint, a classic red flag for freedom of expression. A person need not have previously been convicted of any crime, involving hate or not, for such a charge to be brought against them, and the conditions a judge may impose as part of the recognizance include measures one would not expect to be associated with a crime involving speech: wearing a tracking device, observing a curfew, abstaining from alcohol, submitting to drug tests, and staying away from particular places, for up to two years.

It’s difficult to see what could justify this kind of pre-emptive legal strike, and it should be opposed.

Part 3 – The Canadian Human Rights Act

The third part of Bill C-63 restores a “communication of hate speech” provision to the Canadian Human Rights Act. This provision is similar to the previous s. 13 of that Act, which was repealed in 2012 (a repeal supported by PEN Canada at the time). The restored provision would make it a discriminatory practice under that Act “to communicate or cause to be communicated hate speech by means of the Internet or any other means of telecommunication in a context in which the hate speech is likely to foment detestation or vilification of an individual or group of individuals on the basis of a prohibited ground of discrimination.” Restoring this provision is a mistake.

Complaints under this provision could be brought by any member of the public and could result in penalties of up to $70,000, with up to $20,000 going directly to the complainant. Given that the complainant would bear no personal costs for frivolous or malicious complaints, and would moreover have a prospect of direct financial gain, there is an incentive to bring a case on spec. The provision risks swamping the system with complaints, and being weaponized for personal gain.

Procedural fairness was a central concern with the original s. 13 prior to its 2012 repeal, and it would be so again here. Bill C-63’s reintroduced s. 13 would allow complainants to remain anonymous and to bring a case risk-free and without cost to themselves. Defendants, however, would incur the legal costs of their defence. And though a case may be thrown out as frivolous, the process could still damage an accused.

There is a growing chorus calling for the removal of Bill C-63’s proposed amendments to the Criminal Code and the Human Rights Act.  I agree. These two parts should be considered separately, if at all, so as to allow for better focus on improving the proposed Online Harms Act.

With my best,
Grace

Grace Wescott
President, PEN Canada 

