The Online Safety Act: tackling online abuse in sport
Online abuse needs no introduction. Amidst the ever-expanding use of social media, there has been a worrying rise in the abuse levelled at athletes, players, and officials. From the torrent of racist abuse directed at England’s footballers after the Euro 2020 final, to the homophobic trolling of Olympic boxer Nicola Adams and the seemingly constant abuse directed at officials and referees, the sports industry faces an uphill battle to protect its players, athletes, managers, and anyone else connected with the profession. There is, correspondingly, a growing urgency for online platforms to take a proactive stance in protecting users from online hate.
As a result of increasing pressure on governments to regulate the digital world (increasingly intertwined with the “real world”), the Online Safety Act 2023 (the OSA) received Royal Assent on 26 October 2023. Although a range of established legal and practical avenues already exists to help players and athletes protect their reputations and wellbeing online, the OSA adds an extra layer of regulation specifically targeting abuse and harmful content on the internet.
In targeting illegal and harmful content, the OSA imposes statutory duties on services which host user-generated content (such as social media platforms) and on search engines to have adequate systems and processes in place to protect users. For athletes and others involved in the sporting profession, the OSA should therefore translate into reduced exposure to harmful content, helping to preserve players’ and athletes’ mental and professional wellbeing.
The Online Safety Act’s new duties of care
To protect adult users, the OSA takes a “triple shield” approach to illegal and harmful content, which involves:
- Preventing services being used for illegal activity,
- Imposing obligations on the most high-risk service providers to remove content banned under their own terms and conditions, and
- Giving users greater control over the content they see and engage with.
Under the OSA, platforms will also be obliged to specify in their terms of service how users are protected from illegal content, and must allow users (and others) to easily report illegal content (or content that is harmful to children) to the platform.
Additionally, the OSA introduces the concept of “priority illegal content”, which includes terrorism and child sexual abuse content, as well as content amounting to certain other specified criminal offences under existing statutory provisions.
These duties are to be implemented via Ofcom Codes of Practice, which Ofcom is in the process of drafting, consulting on and finalising. Ofcom plans to publish its next Codes, on illegal content and illegal harms, in December 2024, with implementation expected by March 2025.
Practical implementation of the Online Safety Act
The OSA has been designed to regulate the major players in the social media landscape (Facebook, Instagram, and X). It also captures thousands of smaller platforms, including websites, online forums and smaller social media services on which information is readily shared and users interact with one another, provided those entities meet the criteria specified in the Act and have significant reach in the UK.
Under the OSA, the burden will be on the platform itself to assess the risk of harm and put in place the necessary mechanisms to keep its users safe. Online platforms must now also put in place adequate complaints procedures. For example, Instagram and Facebook operate a two-tiered system aimed at reducing harmful content: the platforms purportedly review all content regularly using automated technology, and a user can also report abusive content through the ‘report’ link. The platforms assert that this reporting tool triggers a swift review of the content; in practice, however, major online platforms can be extremely slow to react both to blatantly abusive content and to reports by individual users. This is particularly problematic where the content is extremely abusive and potentially harmful, in which case urgent legal action may be required to spur the relevant platform to act.
Understanding how best to use these tools, and knowing when to escalate complaints to legal action, is important for agents, clubs and governing bodies. Equally, it will be important for high-profile athletes to receive adequate training on the tools these platforms provide to minimise the hurtful content they are exposed to online, and on who to turn to in the face of particularly abusive or harmful online content. This is especially true ahead of major competitions, international tournaments and events such as the Olympics, when online abuse can spike.
The OSA will also require social media companies to regularly review their platforms to identify abusers who repeatedly create anonymous profiles, and to take steps to ban repeat offenders. This could allow athletes and their representatives to take pre-emptive measures by flagging repeat offenders to the relevant platform via its reporting tools. In addition, platforms will be required to address the risks that algorithmic tools and hashtags may carry in promoting the availability of illegal content (such as racist abuse). The upcoming Ofcom Code on illegal content and illegal harms should provide more clarity on how platforms must review for and respond to illegal content.
Alternative legal routes
The OSA is, of course, not the only avenue for athletes and others facing online abuse. Alternative routes include:
- Taking action under the Defamation Act 2013 or under privacy law,
- Taking action under the Protection from Harassment Act 1997, and
- Using relevant provisions under existing data protection legislation, such as the “Right to be Forgotten” regime, to have inaccurate or outdated content removed or de-listed from websites and online search providers.
How effective will the Online Safety Act be in addressing abuse?
The OSA looks to be a positive development for individuals concerned about online abuse. Criminalising harmful online behaviour may reduce the abuse athletes face if online abusers (or “trolls”) believe there could be criminal consequences for their actions.
However, many of the relevant provisions are yet to be clarified and enforced via Ofcom Codes. It remains to be seen how effective Ofcom will be as the regulator enforcing these provisions, particularly in a landscape where technology is changing rapidly and the OSA must prove flexible enough to keep pace. In addition, significant responsibility for implementation of the OSA rests with the “Big Tech” platforms themselves, and Ofcom’s ability to regulate these online giants remains untested.
We therefore recommend that all high-profile athletes and their representatives remain vigilant in their monitoring of relevant online content and remain mindful of the options available to them in the face of online abuse, misinformation or online defamation, both under the OSA and the additional legal routes identified above.
This publication is a general summary of the law. It should not replace legal advice tailored to your specific circumstances.
© Farrer & Co LLP, November 2024