Three Key Provisions In The New Online Safety Bill 2021
The Government recently announced the tabling of the new Online Safety Bill 2021 (the Bill). It is one of the biggest overhauls of internet law in the world, bringing legislation and criminal liability up to date with how the internet is used in the 2020s. The Government describes the Bill as globally “groundbreaking”, promising “a new age of accountability for tech” that will “bring fairness and accountability to the online world”.
At 145 pages plus 123 pages of explanatory notes and a 146-page impact assessment, the Bill is not exactly light summer reading. Therefore, we have distilled the contents into three key provisions that all internet providers and users should be aware of.
One – Companies in scope of the Bill will have a duty of care towards their users
On 15 December 2020, the Government published its full response to the Online Harms White Paper consultation. The White Paper was designed to ensure internet companies take responsibility for the online safety of their users. In line with it, companies in scope of the Bill will have a duty of care towards their users, so that what is unacceptable offline will also be unacceptable online.
The duty of care will encompass:
- Ensuring quick action is taken against hate crimes, threats, and harassment of online users, and requiring companies to remain true to the standards they promise their users they will uphold.
- Large social media sites such as Facebook (referred to as Category 1 services) must act on content that is lawful but nevertheless harmful, for example content encouraging self-harm or suicide, abusive trolling that falls below the threshold of a criminal offence, and mis- and disinformation. All Category 1 services must set out in their Terms and Conditions how such harms will be addressed, and Ofcom will hold them to account for doing so.
- Ofcom will have the power to prosecute senior managers of Category 1 services who refuse to provide requested information to the Regulator. This is a reserved power that can be introduced following a review two years after the Bill comes into force if the Category 1 tech companies are not complying with their new responsibilities.
Provisions requiring companies to report child sexual exploitation and abuse (CSEA) content identified on their services will be presented to Parliament in the final draft of the legislation.
Ian Russell, who established the Molly Rose Foundation after his 14-year-old daughter, Molly, took her own life having viewed content that, according to the Coroner, “was too disturbing for an adult to look at”, said:
“The Molly Rose Foundation and Molly’s family say government internet regulation can’t come soon enough and welcome this important step towards a safer internet for all.
It is vital to focus the minds of the tech platforms, to change their corporate culture and to reduce online harms, especially for the young and the vulnerable. Now is the time for the platforms to prioritise safety rather than profit; it is time for countries to change the internet for good.”
Two – Internet companies will be forced to take responsibility for fraudulent user-generated content
The internet provides a mind-boggling array of opportunities for fraudsters, who can operate in the web’s murky corners, commit crimes, and slip away into the darkness. Unfortunately, innocent people and businesses caught up in online fraud can also find themselves facing criminal liability.
The Bill provides that, for the first time, online companies will have to take responsibility for stopping fraudulent user-generated content, such as social media posts. This includes romance scams and fake investment opportunities posted by users in Facebook groups or sent via Snapchat.
Three – Freedom of expression must be protected
In-scope organisations will use algorithms to meet many of their compliance obligations under the Bill. The danger is that these algorithms will remove innocent content and suppress freedom of expression. All in-scope companies will therefore need to consider this consequence and put in place the safeguards set out in Ofcom’s Code of Practice to protect the right to freedom of expression.
As well as establishing an easily accessible appeals procedure for users whose content has been removed, Category 1 services will be required to publish up-to-date assessments of the impact that complying with the Bill has had on freedom of expression, together with the mitigating actions they have taken to ensure this cherished human right is protected.
Final thoughts
Given the robust regulatory provisions contained in the Bill, and the fact that Big Tech has been notoriously reluctant to cooperate with the world’s governments regarding the extent of its liability for user content, Parliamentary debates are likely to be lively. There is already concern around the scope of the provisions: some claim that freedom of speech will be restricted regardless of any safeguarding measures, while others argue that, when it comes to matters such as online pornography, the Bill does not go far enough to protect the vulnerable, especially children.
We will keep you updated as the Bill progresses through Parliament.
If you require a criminal defence for online fraud, please call us on 02476 231000 or email enquiries@askewslegal.co.