Amazon, Alexa and healthcare data


16th December 2019

In July 2019 NHS Digital, the body responsible for technology and data in the NHS in England, announced that it was making NHS information available through Alexa, Amazon’s voice-activated personal assistant.

The aim was to ensure that anyone searching for medical information via Alexa would receive information developed and approved by the NHS, so that the health information reaching users is safe, approved and accessible.

Amazon is just one of over 1,500 organisations that have access to NHS information. But it’s undoubtedly the most controversial. Privacy International, a charity which campaigns for greater privacy rights, made the headlines recently when it revealed that Amazon was not paying for any of the NHS information. This story appeared to be the perfect storm, bringing together the recent controversies over Amazon’s payment of taxes in the UK, fears about how the tech giants are using our personal information, and the general election themes of NHS privatisation and potential trade deals with the US. Campaigners saw this as evidence that Amazon was getting preferential treatment, or that this was part of some secret plan to sell off the NHS to US interests. The Good Law Project has launched a state aid challenge to the Department of Health and Social Care, arguing that the Amazon arrangement is ‘excessive’ and that Amazon should pay for the information.

But are we right to be concerned? Well, on the face of it, the answer appears to be ‘no’. The Amazon contract with the NHS is for the provision of information that is already available on the www.nhs.uk website and made available free of charge to other companies. It is clearly a positive thing that those using voice searches can access accurate and reliable information, particularly on something as personal as their health. NHS Digital makes the point that voice searches can actually be more accessible for people with disabilities and may, therefore, bring NHS information to a wider audience.

Looking a little wider, however, there are valid reasons to be concerned about the growing popularity of devices such as Alexa. In common with many new technologies, Alexa devices are designed to capture a vast amount of information about us. Indeed, our data drives a multi-billion dollar industry, and health data is a particularly valuable commodity. The more companies know about us, the better they can target us with services and advertising. Unlike traditional browser-based searches, voice searches are linked directly to known individuals, so Amazon will know what each user is looking for, including when they search for very sensitive information such as the symptoms of a particular illness. Press reports have revealed that Amazon and other tech companies routinely use humans to listen to recordings made by Alexa-type devices. And malicious apps have been discovered which allow others to eavesdrop on conversations in your home without your knowledge. These are real privacy worries.

And while the Amazon contract appears to cover only publicly available information, the same cannot be said of every NHS data-sharing arrangement. Various NHS bodies have licensed patient data to others, including private companies such as Google, for medical research purposes. These partnerships may lead to significant benefits for patients, including new and better treatments. Companies carrying out research generally do not need to know the identity of individuals, and so most research data is anonymised, but even this is not entirely without privacy risks. Medical information is extremely sensitive, and we are entitled to trust medical professionals to keep it safe and not use it for commercial gain. In an age of big data, where the tech giants know so much about all of us, there are concerns about the possibility of re-identifying anonymised data and then making decisions about us based on our health. Left unchecked, this could lead to dystopian consequences, such as higher insurance premiums or denial of access to services for individuals with particular health conditions.

Such concerns may well be overstated. After all, it is a criminal offence under data protection law to re-identify anonymised information without the consent of the relevant data controller, and there are additional safeguards in place to protect personal data relating to an individual’s health. Nevertheless, as the recent Amazon controversy neatly illustrates, many people remain instinctively uncomfortable about the commercialisation of their health data and the potential erosion of their privacy. These sorts of controversies are unlikely to go away.

If you need advice regarding data protection and health data, contact our specialist lawyers.
