Prime Minister Scott Morrison announced on Sunday that Australia will draft legislation requiring social media companies to share information about users who make defamatory comments.
After the country’s top court determined that publishers can be held liable for public comments posted on their online pages, the government has been examining how far platforms such as Twitter and Facebook should be held accountable for defamatory content uploaded to their sites.
In the wake of that ruling, some news organizations, including CNN, have blocked Australians from accessing their Facebook pages.
A media release outlining the proposed amendments to the country’s defamation legislation describes the expanded court powers as a means of combating bad actors online.
However, the announcement raises questions about how the new laws would protect Australians in practice, and how platforms would be compelled to comply.
Speaking at a press conference alongside Attorney-General Michaelia Cash, the prime minister said the reforms to the country’s defamation laws would deliver “some of the strongest powers to tackle online trolls in the world,” rhetoric in line with Morrison’s previous statements about the country’s approach to regulating big tech companies.
The prime minister also reiterated his view that anonymity is what allows trolls to operate freely online, suggesting that the proposed legislation will address the issue.
Morrison claimed that social media was far too often a place where “the anonymous can bully, harass, and ruin lives without consequence.”
“The online world should not be a wild west where bots and bigots and trolls and others are anonymously going around and can harm people,” Morrison said at a televised press briefing. “That is not what can happen in the real world, and there is no case for it to be able to be happening in the digital world.”
The new legislation would include a complaints process, allowing people who believe they have been defamed, bullied, or insulted on social media to request that the content be removed from the platform.
If the content is not removed, a court may order a social media platform to reveal the identity of the commenter.
“Digital platforms – these online companies – must have proper processes to enable the takedown of this content,” Morrison said.
“They have created the space and they need to make it safe, and if they won’t, we will make them (through) laws such as this.”