The Liberal government has proposed a sweeping new system designed to ensure that unlawful content, such as child pornography and hate speech, does not appear on popular social media platforms like Facebook, YouTube, Pornhub and Twitter. The broad plan was released on Thursday and posted online for Canadians to comment on until the end of September.
The proposal identifies five categories of criminal content that would have to be kept off the web: child pornography, terrorist content, incitements to violence, hate speech, and the non-consensual sharing of intimate images.
Platforms that do not comply could face severe penalties, and the proposal opens new avenues for police and Canada’s intelligence agency to intervene in cases of potentially unlawful activity carried out online.
While the plan covers prominent social media platforms, it would not apply to private communications, such as messages carried by telecom providers like Rogers, Telus and Bell, or private chats on apps like WhatsApp and Facebook Messenger.
In an interview with the Star, Heritage Minister Steven Guilbeault said, “What we’re presenting to Canadians reflects what we feel is the best way forward. We’re asking platforms to take their responsibilities in ways that they haven’t so far, and so I think that for the vast majority of Canadians, what we’re proposing will make a lot of sense.”
Under the new rules, platforms would be expected to remove illegal content flagged by users from their websites and to maintain sophisticated monitoring systems. They would also be required to be transparent about these actions in order to comply with the law, or face fines of up to $10 million or three per cent of their gross global revenue, whichever is greater.
A new commissioner, supported by an advisory council and the “Digital Recourse Council of Canada,” would oversee the regime. The recourse council would administer a tribunal system allowing users to appeal platforms’ decisions once they have exhausted the platforms’ own appeal processes, and it would have the authority to order content removed. For non-compliance, the commissioner could impose a penalty of up to $25 million on a firm.
Vivek Krishnamurthy, a law professor at the University of Ottawa who has concerns about the proposal, said it “feels very much like a court that is going to be passing judgment on the legality of speech.”
“We want to make sure that there are strong procedural safeguards and that the decisions that are made … conform with rule of law principles.”
In the end, he said, the proposal is “not really going to change much when it comes to mainstream technology platforms.”
“In some ways, we should view this as an illegal-content bill rather than an online-harms bill,” Krishnamurthy said. “Many kinds of online harms are just completely unaddressed by this, you know. Disinformation, for example.”
Under two of the proposed rules, platforms could notify the Canadian Security Intelligence Service (CSIS) or police: if content poses a threat to public order or safety, law enforcement could be alerted.
Guilbeault said the proposal simply gives law enforcement the tools it already has in the physical world and allows them to be used in the digital one.
However, he added that no decisions have been finalized and that “this is a genuine consultation process.”
“We want to hear people’s views on this.”