Government Expert Panel Suggests Regulating Private Communications Through ‘Online Harms’ Legislation

Canadian Heritage Minister Pablo Rodriguez rises during question period in the House of Commons on Parliament Hill in Ottawa on June 16, 2022. (Screenshot from ParlVu).

By Noé Chartier

Many experts on a panel handpicked by the federal government to lay the groundwork for a future “online harms reduction” bill say private communications should be included under the framework, a Heritage Canada document indicates.

“Some experts highlighted that a lot of times a high level of harmful content, such as terrorist content or child pornography, are shared in private communications instead of on public forums—and that excluding these types of communications would leave a lot of harmful content on the table,” said a summary of discussions held by the expert panel.

“Many experts supported the notion that private communications should be included under the scope of the legislative framework.”

Heritage Minister Pablo Rodriguez announced in March that 12 experts would be holding discussions, also attended by bureaucrats from different agencies, to provide advice on the drafting of an internet content regulation bill.

Ten sessions were held from April to June, and government-provided summaries have been posted online.

While the experts pointed to terrorist content and child exploitation as needing to be countered in private communications, in a separate session they identified “disinformation” as “one of the most pressing and harmful forms of malicious behaviour online.”

While calling for the government to tackle “disinformation,” the experts nevertheless said the issue would be hard to define in legislation. They also said the government should not be deciding what is true or false.

As for regulating private communications, the experts suggested that platforms use tools that would “mitigate the risk before it emerges” or have reporting mechanisms to address “harmful content.”

“In this way, regulations wouldn’t need to impose a proactive monitoring obligation on platforms to monitor private communications to mitigate harms,” the panel said.

‘Legal yet Harmful’

Freedom of expression is protected in Canada, and hate speech is prohibited under existing laws, but the panel explored how to counter content online that could be lawful yet deemed “harmful.”

“Some experts asserted that a balance would need to be struck between preserving Charter rights while also addressing legal yet harmful content,” reads a session summary.

“It was also stated that lawful but harmful content cannot legally be banned but could be regulated by means other than take-down measures.”

Some experts argued that the law should be left ambiguous to incentivize platforms to “do more to comply” in regulating content, whereas others argued that such ambiguity would give platforms too much leeway.

Despite disagreements, Heritage Canada said there was a consensus among the experts that a regulatory regime is needed to tackle “harmful content” online.

The experts said the way the government communicates its efforts to regulate content is “important” because “such a framework has the potential to contribute to, erode, or reinforce the public’s faith in Government and other democratic institutions.”

Regarding platforms’ compliance with regulation, the experts said that “public shaming or profit incentives” would be “key to a successful framework.”

Politicization

A previous analysis of the panel of 12 experts showed they mostly share the government’s positions on issues such as COVID-19 measures, with members advocating for more vaccine mandates, labelling alternative viewpoints as “conspiracies,” and criticizing the recent freedom-themed protests.

Some experts warned during the discussions that any legislation introduced to regulate content “must not be susceptible to misuse by future governments.”

Included in the content they seek to regulate is “misleading political communications,” along with false advertising and propaganda.

The discussion summaries are often steeped in progressive jargon.

“Many experts stated that it would be important to find a way to define harmful content in a way that brings in lived experiences and intersectionality,” said one summary.

“They explained that a number of harms online are exemplified by issues like colonization and misogyny, and a regulatory framework would need to recognize these factors.”

Some experts expressed concern that platforms could become over-policed, “labeling activist content like material from Black Lives Matter campaigns as extremist content.”

Digital Safety Commissioner

Another area of consensus among the experts was the need to create a digital safety commissioner.

They said the commissioner should have powers to audit, inspect, administer financial penalties, and launch investigations.

There was disagreement, though, on the scope of powers that should be afforded to this new position. Some experts said it should have “teeth” in order to force compliance.

The idea of creating a digital safety commissioner was already proposed by the Liberals last year, and it features in a technical paper published by Heritage Canada in April.

Complementing the new commissioner position, some experts said there should be a “cyber-judge” to determine the legality of content posted online, arguing that platforms lack the “legitimacy” to make those decisions.

Noé Chartier
Noé Chartier is an Epoch Times reporter based in Montreal.
