Submission to the Department of Canadian Heritage on the Proposed Approach to Address Harmful Content Online  

Submitted by: Ms. Heidi Illingworth, Ombudsman

Office of the Federal Ombudsman for Victims of Crime

September 2021


Submission on the Government’s proposed approach to make social media platforms and other online communications services more accountable and more transparent when it comes to combating harmful content online.


As Federal Ombudsman for Victims of Crime, my mandate is to help ensure the rights of victims and survivors of crime are respected and upheld, and that the federal government meets its obligations to victims. In addition to assisting individual victims, I also have a responsibility to identify and bring forward emerging and systemic issues that negatively affect victims and survivors of crime at the federal level.


In June 2021, the Minister of Justice introduced Bill C-36: An Act to amend the Criminal Code and the Canadian Human Rights Act and to make related amendments to another Act (hate propaganda, hate crimes and hate speech) in the House of Commons. The intention of Bill C-36 was to amend the Criminal Code to create a recognizance to keep the peace relating to hate propaganda and hate crime and to provide a definition for “hatred”.

Now the Government of Canada is proposing a new approach to regulating social media and combating harmful content online. The purpose of the approach is to hold entities accountable for regulating harmful content online. The proposed legislation would apply to online communication service providers (e.g., Instagram, TikTok, Twitter, Facebook, Pornhub) and exclude private communications and telecommunications service providers. It would target five categories of harmful content: terrorist content; content that incites violence; hate speech; the non-consensual sharing of intimate images; and child sexual exploitation content. The legislation would create a new Digital Safety Commission of Canada, comprising three bodies (the Digital Safety Commissioner of Canada, the Digital Recourse Council of Canada, and an Advisory Board), to operationalize, oversee and enforce the new system.


In my view, it is critical that a new legislative and regulatory framework be developed to require Online Content Service Providers to take all reasonable measures to identify harmful content that is communicated on their platform and to make that harmful content inaccessible to persons in Canada. There must be accountability for victims.

In August 2020, the OFOVC provided a submission to the consultation on Online Hate conducted by the Justice Committee. The main premise of the submission was that any proposed solution to the proliferation of hate speech online would need to address five key issues:

  • Lack of respect for diversity;
  • The exponential increase in anti-social behaviour online;
  • The underlying causes motivating perpetrators;
  • The lack of data; and
  • The lack of regulation of online service providers.

These same five issues should be addressed in the context of all harmful content found online.

I also believe any legislation intended to address harmful online content should incorporate measures relative to victims. To that end, the OFOVC has prepared a submission outlining considerations and recommendations to reflect the concerns and needs of victims.


Hate speech

Available data tell us most victims of hate speech do not report it, often because they do not believe the authorities will take their complaint seriously. Reported offences are seldom prosecuted, often because the perpetrator cannot be identified. When prosecutions do take place, many cases take a long time to process, few lead to convictions, and even fewer to a custodial sentence.1

The data also tell us police-reported hate crimes targeting race or ethnicity increased substantially from 2019 to 2020 in Canada.2 The Black, East or Southeast Asian, South Asian and the Indigenous populations were the targets of the majority of reported hate crimes during this period. However, we should treat these data with caution because it is unknown whether the increase is due to an increase in incidents, an increase in reporting, or a combination of the two.

While harmful online content affects everyone in unique ways, we know its effects are unequal. Thus, it is crucial the proposed legislation acknowledge that vulnerable populations (e.g., Indigenous and Black persons, women, and members of the 2SLGBTQ+ community) are disproportionately affected by harmful online content. Misogynist online content is of increasing concern to the OFOVC and has a disproportionate negative impact on persons who identify as women. The online community known as “incels” (short for “involuntary celibates”, a term describing their romantic troubles) consists almost entirely of men and boys who use online forums to blame women for their sexless lives.

They openly call for other incels to follow up with “acid attacks” and “mass rape.” This online community praises mass killers and over the past two decades has grown with members somewhere in the tens of thousands, who have fallen under the sway of a profoundly sexist ideology that they call “the blackpill.” It amounts to a fundamental rejection of women’s sexual emancipation, labeling women shallow, cruel creatures who will choose only the most attractive men if given the choice. The OFOVC believes we must be prepared to confront this hateful ideology that develops online but has the potential to play out in real life, as seen in the domestic terrorist vehicle-ramming attack on April 23, 2018, in Toronto, Ontario, Canada. We recognize the intersection between this age-old misogyny and new information technologies, which can lead to everyday acts of violence ranging from harassment to violent assault.

1 Statistics Canada: Figure 1. In only 7% of reported cases are offenders actually convicted; only 3% serve a custodial sentence.

2 Ibid

Misogynistic online content or hate speech must be included in any definition regarding content that incites violence or hate speech.

It is our view that the proposed definition of hate speech in Bill C-36 is not written in plain language. Since many Canadians do not have English or French as their first language and citizens have differing levels of education, it is important for new legislation to make sense to lay persons. If people are to obey the rules, they first need to be able to understand them. This applies to those who are victims of hate speech as well: to make a complaint, they need to be able to understand what behaviour constitutes the offence. This issue should be addressed before reintroducing the Bill in the House of Commons.

Proliferation of child sexual exploitation content and the non-consensual sharing of intimate images

According to Statistics Canada, in 2020 there were over 7,200 cybercrime-related child pornography violations, up 35% from 2019.3 Statistics Canada also reported a 10% increase in the non-consensual sharing of intimate images, sometimes known as “revenge porn”, from 2019 to 2020.4 Again, it is unclear whether this reflects a true increase, an increase in reporting, or a combination of the two.

The social media industry has developed filters to screen content before uploading, thus providing an opportunity to identify images portraying child sexual exploitation and prevent it from ever reaching public view. This was the recommended approach in a 2020 report focused on the proliferation of child sexual exploitation material on the internet in the United Kingdom.5

As an example, in 2019 Facebook (which includes Instagram) instituted a quarterly “Community Standards Enforcement Report”.6 The report includes information about what the company is doing to protect children and data on how much content depicting child sexual exploitation it detected and removed. Facebook has reported that it detects the majority of this content before it comes to the attention of users.

However, the data cited above from Statistics Canada on reported incidents of cyber-related child sexual exploitation imply the filtering mechanisms may not be fully effective or that they are not universally applied. There may also be other channels transmitting this material.

Pre-upload screening places the burden of policing the internet squarely on the industry profiting from it. Were it fully effective, this approach could help to avoid many of the negative effects experienced by victims of child sexual exploitation and/or the publishing of intimate images without consent, simply by filtering such images out of the stream before they can be uploaded. Similarly, hate speech or content promoting or supporting terrorism would simply not appear on regular industry channels.

3 Statistics Canada:

4 Ibid

5 The Internet: Investigation Report, March 2020: Recommendation 1, p. 102. https://news-sophos.go-

6 Facebook:

Social media companies and law enforcement agencies should work closely together to both improve the performance of the companies in this regard and to identify and prosecute offenders.

For those concerned about the issue of freedom of expression, there are precedents in other media: newspapers do not publish material that does not conform to community standards and radio and television broadcasters can utilize technology to delay live feeds for similar purposes.

Finally, the COVID-19 pandemic has not only had an impact on Canada’s economy but may also have played a role in increases in some police-reported cybercrime. Stay-at-home orders and lockdowns across the country have meant more people were at home. Children and youth were spending more time online, which increased their vulnerability to online harmful content. Authorities should make strenuous efforts to alert parents to the risks their children face online and inform them as to how to reduce those risks. Furthermore, it is of note that school staff make 90 percent of all reports of child abuse. As children have been out of school due to the pandemic, there is a risk that children may sometimes be trapped at home with the person who is exploiting them, and unable to report abuse to a trusted adult such as a teacher. My office remains concerned about the increased vulnerability of children – especially those already at risk of experiencing abuse.

Need for training on implicit bias, cultural humility, and victim-centred, trauma-informed approaches for the proposed Digital Safety Commissioner, Digital Recourse Council of Canada and the Advisory Board

Implicit bias is a mental process resulting in feelings and attitudes about people based on factors such as race, age and appearance that may influence perceptions and actions. It is an unconscious process, thus we are not aware of the negative biases we develop over the course of our lifetime.7 Implicit bias supports stereotypes. It is important to understand the causes of implicit bias and intentionally work to bring it to the conscious level in order to mitigate the negative consequences. Cultural humility requires individuals to self-reflect on their own personal and cultural biases and to take note of the significant cultural realities of others.8

Using a gender-based analysis plus (GBA+) tool can help identify how different populations are affected by government policies, programs and services, taking into account intersecting identity factors (age, disability, education, language, geography, culture, income, and sexual orientation).9 This type of analysis may help the proposed new regulatory bodies identify whether there are some groups that may benefit from the proposed initiatives more than others.

Since the role of the proposed Digital Safety Commissioner, Digital Recourse Council and the Advisory Board would be to oversee online content moderation, a GBA+ lens should be used as a guide. The particular needs of and barriers faced by groups disproportionately affected by harmful online content, such as people who identify as women and girls, Indigenous peoples, members of racialized communities, religious minorities, 2SLGBTQ+ and gender-diverse communities, and persons with disabilities should be considered.

7 Workplace strategies for mental health. (2020, January 3). Implicit Bias. Workplace strategies for mental health. Retrieved from

8 Yeager, K. A., & Bauer-Wu, S. (2013). Cultural humility: essential foundation for clinical researchers. Applied Nursing Research: ANR, 26(4), 251–256.

9 Department of Justice Canada: plus/what-gender-based-analysis-plus.html#about

A victim-centred, trauma-informed approach is also necessary to empower victims and survivors of online harmful content. The 2019 General Social Survey (GSS) informs us most crime goes unreported.10 It is important that victims of harmful online content not only feel safe reporting their victimization, but also feel confident they will be supported afterwards. Using a trauma-informed approach will help to avoid re-traumatization, and put the focus on victims’ rights, safety, well-being, expressed needs and choices, while ensuring the empathetic and sensitive delivery of services.

The proposed mechanism to regulate harmful online content

Creating a new, separate, administrative process under the proposed new Digital Recourse Council of Canada to adjudicate complaints regarding harmful online content may not be the answer. A bureaucratic process can be both lengthy and expensive to operate, with no guarantee of efficacy.

If the Government decides to move forward with the proposed regulatory mechanism, legislation in clear, plain language should be incorporated into every aspect of this approach, especially the complaints process. The design should incorporate tools to help complainants understand whether their issue meets the established criteria for harmful online content. The purpose is not so much to screen out frivolous complaints (although it will facilitate such screening); rather, it is to enable self-screening to reduce the number of complaints that do not meet the criteria. There should be consequences for filing frivolous or abusive complaints, to deter attempts to misuse the complaint mechanism.

While the Digital Recourse Council of Canada may be sufficient to address complaints such as hate speech, it should refer other harmful online content to the competent authorities, as is stated in the current proposal:

  • If content falls within the criminal sphere (child sexual exploitation; sharing of intimate images without consent), then the criminal justice system has jurisdiction.
  • If content falls within the security sphere (terrorism), then the security service has jurisdiction.

Complaint process

Another important consideration in making the proposed legislation and the regulatory/administrative regime accessible is developing a complaint process which is both easy to understand and easy to use. Canadians must be able to have confidence in the system. However, referring complaints to an administrative tribunal can make the process bureaucratic and place a heavy burden on the complainant to prove their case.

10 Statistics Canada:

As an example, in 2020, the Canadian Human Rights Commission’s annual report indicated that 49,000 people contacted the Commission to complain. The Commission accepted 1,030 complaints.11 There is no hard information about how long it takes to resolve a complaint. These data imply:

  • The criteria used by the Commission to screen complaints are not well understood by the general public; and
  • The process is resource-intensive.

Given the performance of the Human Rights Commission in adjudicating human rights complaints, applying such a bureaucratic process to hate speech complaints may not be any more effective than the current criminal process. Additionally, it could prove a very time-consuming and expensive mechanism.

The potential importance of restorative justice in reconciling differences

The Government of Canada has indicated its dedication to furthering the use of restorative justice practices in Canada. Restorative justice is an alternative approach to traditional justice, with a focus on reparations and addressing the harm caused by the crime, while holding the offender accountable.

Importantly, restorative justice allows victims and survivors to play a central role in the justice process, as opposed to the traditional role of the victim as a mere witness for the state in criminal proceedings. It also allows offenders to identify and address their needs for resolution, which can help to give context to the crime and highlight areas for improvement within the community. Providing offenders with the opportunity to address the reasons for their offending behaviour and offer their perspective on the crime allows them to take responsibility for the harm done to the victim and the greater community.12 This can result in psychological benefits for the victim, such as decreased fear and anxiety about re-victimization, decreased anger, increased sympathy towards the offender, and even decreased post-traumatic stress symptoms, which have positive implications for their overall well-being and ability to heal.13

Additionally, power dynamics often play an important yet undervalued role in restorative justice practices.14 Variables such as age, gender, socioeconomic status or race can create explicit and implicit biases amongst the facilitators and participants, leading to a power imbalance that may be disadvantageous to one or more of the parties.15 Restorative justice can be an effective tool in addressing harmful content but, ultimately, regard for power dynamics and avoiding victim re-traumatization must be at the forefront. Attention must be paid to avoid re-creating the imbalances and negative experiences seen elsewhere in the criminal justice system. It is important that experienced, trained professional mediators approach restorative justice measures delicately to avoid re-traumatization, especially for some types of crime, such as child sexual exploitation and terrorist content.

11 Canadian Human Rights Commission: AR-2020-ENGLISH-WEB-FINAL.pdf, p. 35

12 Department of Justice Canada,

13 Evans et al. (n.d.). Restorative Justice: The Experience of Victims and Survivors. Victims of Crime Research Digest No. 11. Retrieved from

14 Lyubansky, M. and Shpungin, E. Challenging the power dynamics in restorative justice.

15 Ibid

Prevention: the best cure

The Department of Canadian Heritage’s anti-racism initiative, Building a Foundation for Change: Canada’s Anti-Racism Strategy 2019–2022,16 is investing millions of dollars to combat racism and discrimination at the grassroots level via a grants and contributions program which provides funding to community organizations. According to the Departmental Plan, an evaluation of the program will take place in 2022. The results of the departmental evaluation of the program could be an important element to inform any future strategy intended to address harmful online content.

However, racism and other forms of intolerance, and their corollary, discrimination, are not local issues. While supporting the efforts of community groups is a positive step, such initiatives tend to be a localized remediation tactic rather than a general preventative strategy. I suggest a more pan-Canadian approach is necessary to address the larger issues.

A targeted educational component is an important element of a public health strategy. Focusing on emphasizing similarities between groups, encouraging acceptance of differences, and fostering critical thinking to reduce reliance on myths and stereotypes are just a few techniques to counter the effects of intolerance and discrimination.

As an example, the OFOVC has frequently pointed to the importance of providing cultural humility training for all justice system employees across the country. The training program should include elements such as:

  • Raising awareness of the issues of racism, intolerance and discrimination and the harm they do;
  • Focusing on cultural awareness and humility. Cultural humility is a relationship-based framework intended to address and invite equity into spaces where there has traditionally been inequity and privilege, such as the criminal justice system. Cultural humility invites those who embrace it to consider others as experts of their own lived experiences, which changes relationship dynamics to remove ego and prioritize humility. It emphasizes that we are more alike than we are different;
  • Highlighting risks inherent for affected persons to make complaints about justice personnel based on:
    • Human rights violations;
    • Victims’ rights violations.


Develop clear, plain-language definitions of the categories of harmful content

It is essential to make the legislation and any regulatory/administrative regime accessible to all Canadians by defining the categories of harmful online content using clear, plain language, and to include misogynistic online content or hate speech in any definition regarding content that incites violence or hate speech. Examples of harmful content should be included for illustrative purposes.

16 Canadian Heritage: engagement/anti-racism-strategy.html

Develop common terms of use for users of online services

The purpose of common terms is to describe a consistent standard of acceptable behaviour across platforms. The terms of use should be limited in number and the language should be clear. They would include users agreeing to the screening of content and acknowledging that any attempt to upload potentially illegal content would result in notification of the appropriate authority. Penalties should include temporary or permanent suspension of user privileges, depending on the seriousness of the offence. Potential users would need to read and agree to these terms before accessing the service.


Focus on prevention: Adopt a public health approach with a strong educational component

Hate speech and other harmful online content are not only harmful to the direct targets of those acts. They also negatively affect our whole society, just as infectious disease pandemics do. Adopting a public health approach means focusing on prevention in addition to response and treatment, in order to reduce the number of hateful incidents and improve the overall health of society.

Explore the options for restorative justice measures

Incorporating restorative justice practices may ultimately produce more satisfactory outcomes for victims and offenders alike.



I encourage the Government of Canada to develop new legislation from a victim-centred, trauma-informed perspective to respond to the needs of victims and survivors of harmful online content, who deserve swift action by corporations to remove harmful content from online platforms. Victims also deserve accountability when harmful content is propagated. Any accompanying complaint system must be accessible and victim-centred, and the use of plain language is essential. A public health model should be instituted alongside the legislative and regulatory framework to help prevent the proliferation of harmful online content, and consideration should be given to the use of restorative justice measures to respond, where warranted.