When advising data controllers on cyber security and their obligations under the seventh data protection principle, one message keeps coming through from the ICO: encryption, encryption, encryption. Whether for big companies or small businesses, schools or charities, encrypting employees' devices is – according to the Information Commissioner's express guidance – the difference between a lost smartphone or stolen laptop being classed as a reportable breach or merely an unfortunate fact of life.
Likewise, when sending or receiving sensitive personal data electronically, organisations should consider whether their email systems – or (shudder) third party file transfer services – are sufficiently secure, or whether an encrypted transfer would be more appropriate.
But it seems not all governmental organisations share the ICO's enthusiasm for the e-word. Encryption is the present battleground between the FBI and Apple in the US, for example, leading this week to Tim Cook's already-famous 'open letter' to iPhone customers and the general public (refusing to allow the government a back-door key to the device's operating system). And that dispute only reflects a similar tension on this side of the Atlantic.
There was a classic moment of joined-up government in this country back in October of last year when the TalkTalk breach debacle blew up – and the company was forced to admit its customers' data was not generally encrypted on its storage systems (interestingly, data breach is one area where Europe is still catching up with the US in terms of tougher sanctions for organisations). When embattled chief executive Dido Harding pleaded with the public that encryption was not a legal requirement, culture minister Ed Vaizey thundered back that perhaps it should be. "Companies should encrypt their information and there has been some misinformation that the Government is somehow against encryption," Vaizey told Parliament.
That same week, however, Theresa May had been busily defending the powers mooted in the draft Investigatory Powers Bill – which many observers felt amounted to a de facto encryption ban for companies processing communications and other personal data which might be of interest to law enforcement. May insisted on "the ability to intercept the contents of communications... [and] use of equipment interference powers to obtain data covertly from computers", which many read as incompatible with adequately sophisticated encryption. This commentary was officially rebuffed, or in any event there was a partial backtrack – no doubt once the different departments and the ICO had actually talked to each other. But mixed messages persist.
Last month, May clarified the position again: "The government doesn't need to know what the encryption is, doesn't need to know the key to the encryption, but if there's a lawful warrant requesting certain information then it is about that information being readable." Part of the problem is that different security considerations apply to different data issues: encryption of stored data is one thing, and the ability to intercept messages quite another. Moreover, for all the widespread "Big Brother" concerns, government can be well behind the curve in the relevant skillset compared to the sophisticated hacking and anti-hacking frontline – which is perhaps another reason why obtaining court orders is such an inconvenience.
Self-proclaimed "cyber security legend" and tech libertarian John McAfee (yes, he of the intrusive anti-virus software that keeps asking you for updates) certainly takes that view. In response to the Apple letter, this week he published a scorching op-ed offering the use of his own crack team to do the job of the FBI, so that big government need never get the keys to break the code (or as he put it: "if the government succeeds in getting this back door [to the latest iPhone operating system], it will eventually get a back door into all encryption – and our world, as we know it, is over"). By his argument, there are enough 'bad apples' out there to mean a universal encryption key to popular systems, once created, would open the Pandora's Box of privacy once and for all.
Despite the faint whiff of publicity stunt, there is no doubt that Apple's concerns are real – whether they have commercial interests or higher principles at their heart. The arguments, as ever, tend to polarise into extremes on both sides: national security hawks versus cyber security hawks, statists versus libertarians, the digital generation and the people they consider Luddites. The FBI claim they only want Apple to develop a one-use tool, but the tech community insist – just as they did about Theresa May – that the Bureau simply doesn't know what it's talking about.
Whether that is more or less reassuring than a surveillance-hungry government that knows exactly what it is doing depends, perhaps, on your politics. But for businesses in the UK, especially those handling large quantities of personal data (or any sensitive or financial data), cyber security remains the organisation's legal responsibility. Encryption, as well as password protection, of mass storage devices is always to be recommended until the unlikely event that the law tells us to stop – and brings the warrant to back it up.
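For organisations taking that advice, the underlying principle is simple: data encrypted at rest is unreadable to anyone who does not hold the key. The following is a minimal Python sketch of that idea – a toy one-time-pad illustration only, not a production technique; real deployments should rely on full-disk encryption (BitLocker, FileVault and the like) or a vetted cryptographic library rather than anything hand-rolled:

```python
# Toy illustration of encryption at rest. NOT for production use:
# real systems use vetted ciphers (e.g. AES) via established tools or
# libraries. This one-time pad merely demonstrates the legal point above:
# stored ciphertext is useless to a thief without the key.
import secrets


def encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """XOR the data with a random key of equal length (one-time pad)."""
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key


def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    """Reverse the XOR using the same key."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))


record = b"customer: J. Doe, card ending 1234"  # hypothetical stored record
ciphertext, key = encrypt(record)
# The stored ciphertext reveals nothing without the key;
# only the key-holder can recover the original record.
assert decrypt(ciphertext, key) == record
```

The practical corollary for data controllers is key management: if the key is stored alongside the device (or is a weak password), the encryption offers little protection, which is why the ICO's guidance pairs encryption with sensible access controls.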
If you require further information on anything covered in this briefing please contact Owen O'Rorke (firstname.lastname@example.org; 020 3375 7196) or your usual contact at the firm on 020 3375 7000.
This publication is a general summary of the law. It should not replace legal advice tailored to your specific circumstances.
© Farrer & Co LLP