Session Eight: Disinformation
What is a Worksheet?
Each advisory group session will be supported by a worksheet, like this one, made available to the group in advance of each session. The goal of these worksheets is to support the discussion and organize feedback and input received. These worksheets will be made public after each session.
Each worksheet will have a set of questions to which group members will be asked to submit written responses. A non-attributed summary of these submissions will be published weekly to help conduct the work in a transparent manner.
The proposed approach in each worksheet represents the Government’s preliminary ideas on a certain topic, based on feedback received during the July-September 2021 consultation. It is meant to be a tool to help discussion. The ideas and language shared are intended to represent a starting point for reaction and feedback. The advice received from these consultations will help the Government design an effective and proportionate legislative and regulatory framework for addressing harmful content online. Neither the group’s advice nor the preliminary views expressed in the worksheets constitute the final views of the Department of Canadian Heritage or the Government of Canada.
Discussion Topic
Should the Government’s legislative and regulatory framework for harmful content online address disinformation?
Objectives
- Obtain views on the Government’s role in addressing disinformation. Disinformation is a growing and serious concern. Like other online harms, it is driven and exacerbated by new digital technologies. But it is different from other forms of harmful content online, partly because of the way in which it implicates freedom of expression. Disinformation likely cannot be addressed in the same way as other online harms.
- Explore new ways to address and mitigate the effects of disinformation. There is a range of possible responses to disinformation, including legislation and regulation, programs, policy, funding, convening, strategic communications and norm-setting.
Starting Points
- Disinformation poses unique challenges. Disinformation is often grouped with other forms of harmful content online but is uniquely challenging compared to other categories of content in at least three respects. First, the evidence base for harm is not as well established, in academic research and jurisprudence alike, as it is for other categories of harm like hate speech. Second, the specific harm associated with disinformation has its basis in determining whether the content in question is true or not. Third, unlike other kinds of harm, such as hate speech, there is no corresponding crime of disinformation in the Criminal Code.
- A Canadian public policy response to disinformation must be guided by the limits posed by the Charter of Rights and Freedoms and by Canadian jurisprudence, conventions and norms. Canada has an established judicial tradition when it comes to limitations on freedom of expression, whereby limits to freedom of expression are justifiable under the Charter only to the extent that they 1) are based on a reasonable apprehension of harm, and 2) are reasonable and proportionate. Given the above, the range of responses to disinformation is predicated on the extent to which evidence of harm can be established and on whether a prospective response (legislative, regulatory, policy or programmatic) is reasonable and proportionate.
- There are multiple initiatives in place targeting disinformation, and additional paths forward are sought. These initiatives include research and digital media/civic literacy through the Digital Citizen Contribution Program, monitoring foreign-based disinformation campaigns through the Global Affairs Rapid Response Mechanism, and preventing disinformation from affecting elections through the Critical Election Incident Public Protocol and the Canada Declaration on Electoral Integrity. Additional programs could be used, expanded, or updated to help directly address disinformation or to support other measures that may help tackle it. For instance, the Government has made several investments in supporting news and journalism in recent years, including tax benefits to support the industry. At the Department of Canadian Heritage, programs like the Canada Periodical Fund and the Local Journalism Initiative are critical to supporting a diversity of newspapers, magazines, and community broadcasters across the country.
- An effective response to disinformation must involve social media companies. The Department of Canadian Heritage acknowledges that social media platforms have attempted to respond to disinformation through a variety of means. However, despite having the best view into the real and potential harm posed by disinformation and into what can be done to address it, social media platforms remain reluctant to share information. Social media companies have more to contribute to the response to disinformation. This could include sharing more of what they know about the issue and how it manifests on their services; collaborating with governments and civil society to develop tools to address disinformation with the public good in mind; and partnering with governments to set norms and standards for how disinformation is identified, monitored and moderated.
Overview of Proposed Approach
- Articulate a comprehensive account of the harm posed, based on evidence as it exists – and identify gaps for further research and analysis. A comprehensive articulation of the harm posed by disinformation and rationale for action could consider disinformation’s more recent effects (like anti-vaccination campaigns and climate change denialism) as well as its more insidious effects on Canada’s social fabric and public discourse.
- Explore if and how to include disinformation in online safety legislation and regulation. A central challenge in doing so is defining disinformation for regulatory purposes. One commonly used definition is that disinformation consists of false, misleading or untrue content that is shared with harmful intent. But given the challenges posed by ascertaining intent with respect to online content, it will be important to determine whether disinformation can be defined for legislative and regulatory purposes absent knowledge of intent.
- It will also be important to explore which obligations could be put on regulated online services regarding this type of harmful content, with an emphasis on requirements for social media companies to report on the extent and moderation of disinformation on their services. Transparency and reporting requirements could provide much-needed tools to understand and develop responses to disinformation while respecting the bounds of justifiable Government intervention. They could:
- Allow the public to make more informed decisions when choosing and using online platforms. Developing a publicly available knowledge base on disinformation will help promote media literacy on disinformation among Canadians, facilitate consistent industry standards across platforms, and help refine and target media literacy programming.
- Contribute to Canada’s understanding of how disinformation spreads. Most platforms already have policies in place to counter disinformation. Through regulation, they could be required to categorize and track the volume of disinformation on their services. Reporting and transparency like this would deepen the understanding of how disinformation narratives gestate and spread in Canada for government, civil society, and academia alike.
- Develop norms on how best to stop the spread of disinformation. Norms are underutilized tools when discussing mitigation of disinformation. Over time it is expected that a greater amount of information and data, support for industry standards, and boosts to media literacy would result in norms for the monitoring, moderation and response to disinformation.
Supporting questions for discussion
- Obtain views on the Government’s role in addressing disinformation
- Is there a role for the Government to play in helping Canadians manage the effects of disinformation in their lives? Is there a legislative response? If not in legislation, how else might the Government respond to disinformation?
- How could the Government define disinformation in a legislative context?
- If a required element of a definition is a potential for harm, to what extent would legislation need to define harm?
- How might a definition for online disinformation handle the challenge of assessing intent in the digital space?
- Should a legislative definition of disinformation include things like phishing scams, fraud or leaks?
- Explore new ways to address and mitigate the effects of disinformation.
- What are the policy benefits of including disinformation in the Government’s legislative framework to address harmful content online? What are the risks of excluding it?
- Is the principle of “falsehood” that is characteristic of disinformation too subjective or too difficult to determine to be placed in legislation?
- How would the benefits of transparency and reporting requirements on disinformation outweigh the potential risks to privacy and freedom of expression, if at all?
- Beyond transparency and reporting requirements, what other regulatory obligations related to disinformation could be considered?