
The NHS is this week piloting a mobile phone app designed to help prevent the spread of coronavirus, if and when the lockdown is eased.

The “NHS COVID-19” app uses Bluetooth technology to alert users who have come into close contact with a symptomatic or infected person, so that they can take steps to prevent further spread, such as being extra vigilant for symptoms, getting tested or self-isolating.

If the Isle of Wight pilot succeeds and enough people go on to download and use the app, it could – together with “manual” contact tracing, increased testing and ongoing social distancing – help normal life to resume. (Around 60 per cent of the population will need to engage with it for it to be effective.)

However, despite the obvious potential public health benefits, some privacy advocates have voiced concerns that the NHS app is – in contrast to similar apps in some other countries – linked to a central, government-controlled database where some of the information collected would be stored.

Italy, Germany, Switzerland and Austria have chosen apps which, instead of linking to a central database, use a decentralised “peer to peer” model supported by Apple and Google, where the information gathered is stored primarily on users’ handsets. The UK, France and Norway have chosen the centralised model.

With the NHS app due to be available in the coming weeks, we take a look at its key features and status in data protection law and compare it with the models in use in other countries.

How does the app work?

Use of the app is voluntary, i.e. it is up to the user whether or not to install it on their mobile. If you don’t like the idea of it, you simply don’t install it.

Once installed, the app uses Bluetooth to detect if other phones are also running the app nearby.

It then builds a log of other devices running the app which come into proximity with it for long enough.

The app calculates how close the device it is installed on has been to other phones running the app and for how long, allowing it to build up an idea of which of these phones’ owners are most at risk.
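NHSX has not published the exact distance calculation, but proximity apps typically infer distance from the received Bluetooth signal strength. A minimal sketch, assuming a standard log-distance path-loss model (the constants and function name here are illustrative, not the NHS app’s actual values):

```python
def estimate_distance_m(rssi_dbm: float,
                        tx_power_dbm: float = -59.0,
                        path_loss_exponent: float = 2.0) -> float:
    """Rough distance (metres) from Bluetooth received signal strength.

    Uses a simple log-distance path-loss model. tx_power_dbm is the
    expected RSSI at one metre, which varies by handset; this is one
    reason a proximity app needs to know each phone's make and model.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

# A weaker (more negative) signal implies a larger estimated distance,
# e.g. estimate_distance_m(-79.0) > estimate_distance_m(-69.0).
```

In practice such estimates are noisy (walls, pockets and bodies all attenuate the signal), which is why duration of contact matters as much as estimated distance.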

It does this by generating a cryptographic token which is sent via Bluetooth to the devices of other app users who have come into close contact with that device.

The tokens, which are not associated with other data that would identify the device user, are stored in each app, so that each device has a way of “remembering” which other devices it has been in close contact with and when.

If a user reports via the app that they are experiencing coronavirus symptoms, the app will use the stored tokens to alert recent close contacts that they may be at risk of infection. (“Close contact” means either face to face contact or spending more than 15 minutes within two metres of an infected person, rather than someone the user has passed in the street or a shop.)
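The alerting step described above amounts to filtering the stored encounter log against the close-contact thresholds. A sketch under assumed names (the `Encounter` type and `contacts_to_alert` function are illustrative, not the app’s actual code; the face-to-face condition is omitted for brevity):

```python
from dataclasses import dataclass

# Thresholds from the "close contact" definition: more than 15 minutes
# spent within two metres of the other person.
CLOSE_CONTACT_MAX_METRES = 2.0
CLOSE_CONTACT_MIN_MINUTES = 15.0

@dataclass
class Encounter:
    token: str           # anonymous token received from the other device
    distance_m: float    # estimated from Bluetooth signal strength
    duration_min: float  # how long the devices stayed in proximity

def contacts_to_alert(encounters):
    """Return the tokens whose owners meet the close-contact threshold."""
    return {
        e.token
        for e in encounters
        if e.distance_m <= CLOSE_CONTACT_MAX_METRES
        and e.duration_min > CLOSE_CONTACT_MIN_MINUTES
    }

log = [
    Encounter("tok-a", distance_m=1.5, duration_min=30),  # close contact
    Encounter("tok-b", distance_m=1.0, duration_min=2),   # passed in the street
    Encounter("tok-c", distance_m=8.0, duration_min=60),  # same shop, far apart
]
```

Only `tok-a` above would be alerted: the brief street encounter and the distant shopper both fall outside the definition.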

The person who thinks they may have coronavirus can also request a test and upload the result to the app.

If the result is positive, the app will then notify the user’s recent contacts that they have recently been in contact with someone who has coronavirus, so that they can take appropriate steps to prevent spread.

The app does not identify who the infected person is, although if the recipient has not recently been in contact with many people, the identity of the infected person may be obvious. The app is therefore unlikely to be popular with people who wouldn’t want anyone to know or suspect they have developed the virus – but given that most people in the UK are relatively open about this, this is not expected to act as a major deterrent.

What kind of data does the app capture about the user?

According to Matt Gould, the chief executive of NHSX, the digital arm of the NHS that has developed the app: “It doesn’t know who you are, it doesn’t know who you’ve been near, it doesn’t know where you’ve been.”

In legal terms this means that the data collected is not “personal data”, because it cannot be used to identify the user.

The app does not require users to enter information which identifies them, such as their name or email address, and does not track or record their location.

Instead, the app records the make and model of the user’s phone, which is needed to accurately measure the distance between the phones of people who have installed the app.

It also requests the first half of the user’s postcode, which could be the same for between 8,000 and 12,000 households.

The app then assigns a random installation ID, which changes daily so that the people a user encounters cannot link the ID back to them over time.
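A daily rotating identifier of this kind can be sketched as follows; the 128-bit token size and rotate-on-date-change behaviour are assumptions for illustration, not the app’s documented scheme:

```python
import secrets
from datetime import date

def new_daily_id() -> str:
    """Generate an unlinkable random daily ID (an assumed 128-bit token)."""
    return secrets.token_hex(16)

class InstallationId:
    """Rotates the broadcast ID each calendar day, so that other users
    cannot link today's sightings of a device to yesterday's."""

    def __init__(self) -> None:
        self._day = date.today()
        self._id = new_daily_id()

    def current(self) -> str:
        if date.today() != self._day:  # new day: rotate the ID
            self._day = date.today()
            self._id = new_daily_id()
        return self._id
```

Because the ID is random rather than derived from the user’s details, seeing it tells another device nothing about who is carrying the phone.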

The data collected by this particular app is therefore much more minimal and less invasive than that routinely collected by hundreds of other apps used daily by millions of people around the world.

Apps such as Facebook, Instagram, Google and Google Maps, which have long featured on millions of phones around the world, are therefore likely to represent a much greater threat to privacy than NHS COVID-19 – and don’t have the capacity to contain a pandemic.

What can other people see?

Using Bluetooth, the app automatically searches for nearby phones that are using the same app. If it finds one, the apps will exchange and securely log:

  • the other app’s daily ID
  • the date of the encounter
  • the Bluetooth signal strength (used to estimate the distance between the phones)
  • the length of time the phones were in contact.
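The four logged items map naturally onto a small record type; the type and field names below are illustrative, not taken from the app’s source:

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class ContactLogEntry:
    """One securely logged record of a nearby phone running the app.
    Field names are illustrative, not taken from the app's source."""
    other_daily_id: str   # the other app's daily ID, not a person's identity
    encounter_date: date  # the date of the encounter
    rssi_dbm: float       # Bluetooth signal strength, used to estimate distance
    duration_min: float   # how long the phones were in contact

entry = ContactLogEntry("9f3c2ab1", date(2020, 5, 5),
                        rssi_dbm=-65.0, duration_min=20.0)
```

Note that nothing in the record names, locates or otherwise identifies either user.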

What happens to the data collected by the app?

The NHS app is connected to a central database.

If the user remains asymptomatic, no data is transferred to the central database.

However, if the user notifies the app that he or she has developed symptoms or tested positive, the app uploads the postcode extract to the central database and the user is assigned a numerical code.
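That upload rule amounts to a single conditional. A sketch, in which the function name and return shape are hypothetical:

```python
from typing import Optional

def data_to_upload(symptomatic: bool, postcode: str) -> Optional[dict]:
    """Nothing leaves the phone unless the user reports symptoms; even
    then, only the postcode district (the first half of the postcode)
    is sent. Assigning the numerical code happens server-side."""
    if not symptomatic:
        return None                  # asymptomatic: no data transferred
    district = postcode.split()[0]   # e.g. "SW1A 1AA" -> "SW1A"
    return {"postcode_district": district}
```

A postcode district is shared by thousands of households, so even the uploaded extract narrows the user down only to a broad area.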

In other words, the data recorded on the central database does not identify the user either and, according to experts from the National Cyber Security Centre, it would not be possible to go back and identify a person from the centrally held data.

As the system currently stands, therefore, privacy is built into the system – as required by the General Data Protection Regulation – and is not compromised by choosing to use the app.

Why are privacy campaigners worried?

Given that the NHS has confirmed that the app does not collect any personal data from users and that use is voluntary, it is difficult to see how this particular app poses any threat to personal privacy.

However some campaigners fear that use of contact tracing apps such as this could become mandatory and that this would be a threat to civil liberties and privacy because people could not withhold their consent to be traced in this way (as they presently can by simply not installing the app, or by deleting it if they change their mind).

They are also concerned that – even if apps such as NHS COVID-19 don’t collect or store “personal data” – this might change with time, and contact tracing apps may in future be designed to gather more extensive data which identifies the user, potentially threatening privacy and human rights.

Some fear “mission creep”, for example that contact tracing apps could ultimately be used to monitor compliance with lockdown or self-isolation rules, or eligibility for so-called “immunity passports”, although it seems unlikely, at least in a democracy, that a government would risk alienating the public with such intrusive measures.

Civil liberties campaigners are also uncomfortable with the idea of the data collected being centrally retained by governments, where it could potentially be misused or hacked.

Some also fear the technology could be manipulated by malicious actors – for example to falsely register positive test results to create panic.

What is the alternative to the central database model?

Some countries, including Italy and Germany, have opted for a “peer to peer” model known as “DP-3T”, where the data collected remains on the users’ handsets and does not get uploaded to a central government database. On the one hand this type of model may lead to higher engagement because potential users are not deterred by the idea of their government holding this kind of information. On the other, this model arguably represents a missed opportunity to obtain a valuable dataset for use in future research to better understand the pandemic and how it spread or was ultimately contained.

What does the UK data protection regulator say about contact tracing apps?

The Information Commissioner’s Office has actively endorsed and encouraged their use, stating in a recent Opinion: “The Commissioner is pleased that the hard work, innovation and collaboration of many different parties is enabling these vitally important contact tracing solutions to be developed, while supporting data protection compliance and good practice.”

Summary

However contact tracing apps might evolve in future, with over 3.6 million cases of coronavirus now confirmed globally, they are probably going to be sitting next to social media, news and location apps on our phones for a long time to come. As for the NHS COVID-19 app, it presently looks set to become the pandemic’s poster boy for the data protection mantra of “privacy by design and by default”.

If you require further information about anything covered in this briefing, please contact Athalie Matthews, or your usual contact at the firm on +44 (0)20 3375 7000.

This publication is a general summary of the law. It should not replace legal advice tailored to your specific circumstances.

© Farrer & Co LLP, May 2020


