With the average British student reportedly spending 55 hours online a week, the risk of suffering harm via the internet has increased, whether in the form of online abuse, scams, or being misled by misinformation.
It’s timely, then, that growing public calls for regulation of the online sphere have finally been heard by government in the form of the draft Online Safety Bill, which has the ambitious aim of making the UK “the safest place in the world to go online”.
Published last year and currently being reviewed following a Joint Committee report, the draft Bill heralds a new era of regulation for social media companies and other online platforms.
In a departure from the existing system of self-regulation, widely judged to have failed, the Bill will impose a duty of care on social media platforms and other providers of electronic content to prevent harm to users.
It will be enforced by the communications regulator Ofcom, which currently regulates the broadcasting sphere and some online content.
It will force online platforms to stamp out unlawful or “lawful but harmful” content, or face fines of up to 10 per cent of global turnover.
This signals a major departure from the status quo in which social media platforms have typically tried to deny legal liability for user-generated content and its consequences.
Platforms will become responsible for content relating to crimes such as child abuse, fraud, racist abuse and the promotion of self-harm or violence against women, all of which have so far typically carried no legal consequences for the platforms on which they occur.
Providers will be required to have systematic measures in place to identify and assess harmful content and to remove or otherwise appropriately deal with it, i.e. to improve their take-down procedures.
They will have to take reasonable steps to prevent the proliferation of illegal content and to remove it once identified. Illegal content is defined as content which is illegal under UK law, or in respect of which the provider “reasonably believes” that it amounts to such an offence. It includes terrorist content and child sexual abuse and exploitation content.
Providers will also have to ensure that children who use their services are not exposed to harmful content, even if the content is not illegal.
There will be a further obligation to tackle “lawful but harmful” content which may be seen by adults.
“Harm” is defined as physical or psychological harm. Harmful content is expected to include pornography, violent content, and any content where a provider reasonably believes there is a risk of significant adverse physical or psychological impact.
The Bill is also expected to make provision to protect freedom of speech including specific duties to protect journalistic content and “content of democratic importance”.
When deciding on safety policies and procedures, providers will have to carry out and publish an assessment of the impact on freedom of expression and user privacy.
They must also “take into account” the importance of freedom of expression when determining what action to take in relation to content intended to contribute to UK democratic political debate, as well as journalistic content (broadly defined).
What this will all mean in practice remains to be seen, with the Bill due to be put to Parliament for approval this year. Whatever its final form, it will hopefully result in a safer online environment for all, students and staff included.
If you require further information about anything covered in this blog, please contact Athalie Matthews or your usual contact at the firm on +44 (0)20 3375 7000.
This publication is a general summary of the law. It should not replace legal advice tailored to your specific circumstances.
© Farrer & Co LLP, March 2022