According to a Facebook executive, the internet industry “needs regulation” since it should not be permitted to create its own standards on matters such as harmful online content.
“Government regulation can establish standards that all companies should meet,” says Monika Bickert, Facebook’s vice president of content policy.
Her remarks come as digital behemoths and some of their most ardent critics convene in Parliament this week to discuss new laws for dealing with dangerous online content.
Following the killing of MP Sir David Amess in his constituency last week, Culture Secretary Nadine Dorries stated that online hate has “poisoned public life” and that the Government has been prompted to re-examine the planned Online Safety Bill.
“While there will no doubt be differing views, we should all agree on one thing: the tech industry needs regulation,” Ms Bickert said in the Sunday Telegraph.
“At Facebook we’ve advocated for democratic governments to set new rules for the internet on areas like harmful content, privacy, data, and elections, because we believe that businesses like ours should not be making these decisions on our own.”
According to Ms Dorries, a crackdown on online harassment could not have prevented Sir David’s death, but his killing underlined the dangers that public figures face.
There have also been calls for social media companies to hand over data more swiftly and to remove harmful content more promptly. The bill would also require platforms to stop their algorithms from amplifying hostile content.
“Once Parliament approves the Online Safety Bill, Ofcom will ensure that all technology companies are held to account,” Ms Bickert wrote in the newspaper.
“Companies should also be judged on how their policies are enforced,” she wrote.
For the past three years, Facebook has been publishing data on how it handles harmful content, including how much of it is viewed and removed. The firm is also independently audited.
Ms Bickert wrote: “I spent more than a decade as a criminal prosecutor in the US before joining Facebook, and for the past nine years I’ve helped our company develop its rules on what is and isn’t allowed on our platforms.
“These policies seek to protect people from harm while also protecting freedom of expression.
“Our team includes former prosecutors, law enforcement officers, counter-terrorism specialists, teachers and child safety advocates, and we work with hundreds of independent experts around the world to help us get the balance right.
“While people often disagree about exactly where to draw the line, government regulation can establish standards all companies should meet.”
Facebook has a financial incentive to remove harmful content from its platforms, she argues, since “people don’t want to see it when they use our apps, and advertisers don’t want their advertisements next to it.”
As detection of hate speech has improved, its prevalence has fallen to around five views of hate speech for every 10,000 content views on Facebook.
“Of the hate speech we removed, we found 97 percent before anyone reported it to us, up from only 23 percent a few years ago,” Ms Bickert said.
“While we still have a bit of a way to go, enforcement reports demonstrate that we are progressing.”