Last month George Galloway was granted permission to serve proceedings on Google (the owner of YouTube) for taking 23 days to block a YouTube video in which he was described as a "tramp" who "encourages terrorism" and the beheading of American citizens. The Judge said these allegations were obviously defamatory and that 23 days was an unacceptable delay in the context.
This is a positive development for individuals who find themselves struggling to get online content removed from large platforms. The Defamation Act 2013 provides some protection for Internet Service Providers (ISPs) as passive hosts of content, but this judgment confirms the common law position that they cannot rely on this protection once they have been put on notice.
The Galloway case
While the Order was made in Northern Ireland, where defamation law is traditionally more claimant-friendly than elsewhere in the UK, it is significant in recognising Google's liability as the publisher of the video, owing to its failure to take urgent action once on notice. The Judge said he was satisfied that "Google is a necessary and/or proper party" and that damages awarded against the first defendant (who uploaded the video) could alternatively be enforced against Google, which would inevitably be able to pay.
Mr Justice Horner said that "accusing an elected politician of being a supporter of terrorism and of the people who are 'beheading American citizens' is going to alarm anyone so accused and to cause him distress." He went on to say, "Google should have acted more swiftly given the serious and alarming nature of the libel." The company may now have to find a way to deal with serious requests in a more urgent manner, if it is to avoid incurring further liability as a publisher of defamatory statements.
The Court reminded claimants of the importance of citing the exact timing and meaning of offensive content in a video: Mr Galloway was criticised for failing to specify which parts of two other videos he complained about were defamatory. A further video was removed by Google within five minutes of being posted, which the Court accepted as a wholly satisfactory response.
The Court's decision follows the ruling in Godfrey v Demon Internet Limited [1999] 4 All ER 342, where an ISP was held liable as a publisher after taking ten days to remove a defamatory statement. The Court held that Demon was a common law publisher of the material and that because the defendant was on notice for ten days and took no action, it could not rely on the statutory defence.
Section 10 of the Defamation Act 2013 provides a protection for ISPs in so far as it prevents claimants from taking action against a person who is not the author, editor or publisher of the statement complained of, unless it is not reasonably practicable for an action to be brought against the author, editor or publisher. This means that where ISPs can show they were not a 'publisher', in that they were not actively allowing the dissemination of a defamatory statement, a claimant can only take action against them by showing that it is not reasonably practicable to pursue those actively responsible.
However, these cases make it clear that once an ISP is on notice that it is hosting a defamatory publication, and it does not take immediate action, it can no longer claim it was not an active publisher. A delay in responding, once on notice, appears key to establishing an ISP's liability. In Bunt v Tilley [2007] 1 WLR 1243, the Court held that the ISP had played a purely passive role as an intermediary and could not, therefore, be seen as a 'publisher' in common law terms.
What this means for claimants
This is good news for claimants who are struggling to get an urgent response from Google and other social media platforms, which can often seem impenetrable to complainants. When reporting offensive material it is common to wait weeks before receiving a response, or for platforms to suggest that complaints be made to the uploader of the video instead. Locating and pursuing the uploader is not always an easy task, particularly if their identity or whereabouts are unknown. It is also likely that Google is inundated with such requests, not all of which raise relevant legal issues, making it hard to respond quickly.
If Mr Galloway is successful, the case could pave the way for other large social media platforms (such as Instagram and Facebook) to face similar requests, making it easier for individuals to manage their online reputations.
Where the identity and location of the content uploader are traceable, it is usually more effective to pursue a claim directly against the individual. Offensive content is not necessarily just defamatory: other causes of action to consider include privacy, harassment and data protection, as well as offences under the Computer Misuse Act 1990.
Another option is to block searches leading to the offensive content, using a Google search removal request form, which was made possible after the Google Spain ruling in 2014 introduced the 'right to be forgotten'. This option will be even more attractive now that Google has confirmed it will remove results from all Google domains when the search is made in Europe. Previously, results were blocked only on European versions of Google and not Google.com. Now, individuals in Europe will enjoy greater protection. The content may still be on YouTube, but it will be harder to find for those who don't know what they are looking for.
If you require further information on anything covered in this briefing please contact
Alicia Mendonca (email@example.com; +44(0)20 375 7614), or your usual contact at the firm on 020 3375 7000.
This publication is a general summary of the law. It should not replace legal advice tailored to your specific circumstances.
© Farrer & Co LLP, February 2016