Mon 8 Apr 19
The UK government’s publication of a plan to make internet companies responsible for harmful content posted on their sites has been welcomed by our internet law expert, Professor Lorna Woods, but with concerns at the proposal's limited scope.
The Online Harms white paper, which seeks to establish a social media duty of care, includes elements identified in her work with former government advisor Will Perrin, for the Carnegie UK Trust.
Social media can be a force for good, but recent high-profile cases, such as the death of 14-year-old Molly Russell and the live streaming of the Christchurch mosque attacks, have put the spotlight back on how best to regulate online content and protect users.
The proposal, which ministers claim would make the UK a "world leader" in the protection of social media users, places responsibility at the door of internet firms. A regulator would have the power to fine companies that fall short and could make senior managers personally liable for failings. Ultimately, websites that fail to comply could be blocked from operating within the UK.
Writing in the Daily Telegraph, which launched a Duty of Care campaign last year in response to her research, Professor Woods welcomed the white paper as "a major step forward" which, if passed into law, would force companies to "fix the plumbing, not just mop up after a leak."
She did, however, express concern about the limited scope of the proposal.
While the white paper notes the harm caused by online abuse of public figures, it fails to give the proposed regulator power to crack down on sexist abuse. "Given the terrible abuse of women online, we feel that the Government should have included misogyny as a harm for the regulator to tackle. This would have sent a strong signal to the tech companies, which largely ignore it," she said.
The white paper also fails to address economic crimes such as online fraud, the theft of intellectual property and scams.
The pair’s original proposal was published in 2018, with an amended version published earlier this year. It seeks to balance the reduction of harm with a number of other considerations, including free speech and the need to encourage technical innovation. Its recommendations included a statutory duty of care backed by an independent regulator.