The Streams App and Privacy and Informed Consent Issues in the NHS

One of the industries most dramatically reshaped by advances in technology has been healthcare in the developed world. Improved techniques, more efficient ways to detect patient care issues, and better treatment options for a myriad of illnesses and injuries are all apparent benefits of bringing tech-driven tools into hospital and clinic processes. As the population grows and ages, technology firms, including major players like Google, are champing at the bit to expand their presence in the healthcare sector. Through its DeepMind Health subsidiary, Google has been at the forefront of healthcare technology for some time, but some argue the company’s position poses risks to the patient population.

In 2016, DeepMind Health became a hot topic of discussion among households and patient advocacy groups throughout the UK, after the company’s partnership with the Royal Free London NHS Foundation Trust, established in 2015, was brought to light by a glaring issue with data privacy in the healthcare arena. The NHS originally approached DeepMind Health to develop an application known as Streams to help fight the growing problem of acute kidney injury, or AKI. Nearly 40,000 deaths a year are attributed to AKI, largely because the condition’s warning signs are so easy to miss: noticeable symptoms are rare, which leads to delayed diagnosis, missed opportunities for treatment, and potential patient harm. Streams is meant to address this by sending clinicians real-time alerts on their mobile devices, rather than leaving abnormal results to wait in the laboratory reporting queue.
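The article does not describe the alerting logic Streams actually uses, but the general idea can be sketched. Below is a minimal, illustrative check loosely based on KDIGO-style creatinine criteria (a rise of at least 26.5 µmol/L within 48 hours, or a value at least 1.5 times a recent prior result); the thresholds, field names, and structure are assumptions for the sketch, not DeepMind’s implementation.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Illustrative only: field names, thresholds, and structure are assumptions,
# not DeepMind's implementation. Criteria loosely follow KDIGO-style rules.

@dataclass
class CreatinineResult:
    patient_id: str
    value_umol_l: float   # serum creatinine, micromoles per litre
    taken_at: datetime

def possible_aki(current: CreatinineResult,
                 history: list[CreatinineResult]) -> bool:
    """Flag a possible AKI: a rise of >= 26.5 umol/L within 48 hours,
    or a value >= 1.5x any result from the previous 7 days."""
    for prior in history:
        gap = current.taken_at - prior.taken_at
        if timedelta(0) <= gap <= timedelta(hours=48) and \
                current.value_umol_l - prior.value_umol_l >= 26.5:
            return True
        if timedelta(0) <= gap <= timedelta(days=7) and \
                current.value_umol_l >= 1.5 * prior.value_umol_l:
            return True
    return False
```

In a real-time system like Streams, a check of this kind would run as each laboratory result arrives, pushing an alert to a clinician’s mobile device instead of waiting for results to be reviewed by hand.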

The partnership between DeepMind Health and the Royal Free Trust looked promising until it emerged that 1.6 million patient records had been transferred to the company, complete with identifiable, otherwise confidential details. The patients involved were not made aware of the partnership until long after the sharing was complete, raising serious privacy concerns in both the healthcare and technology worlds.

A Breach of Data Privacy with Streams

As industries grow fonder of using big data about customers and competitors to fuel growth and innovation, it is not uncommon for enterprises to have access to millions of data points from a variety of sources. The NHS is no different, except that strict laws govern the use and transfer of data relating to patients and their medical records. When the 1.6 million records were transferred to DeepMind Health to support development of the Streams app, no one was told that they included confidential information that could be used to identify patients. There was no public discussion of the transfer, nor any option for patients to have their records excluded. This poses several threats to the future of healthcare technology.

While most individuals are not opposed to sharing information for research or app development, consent should be obtained first. In the DeepMind and NHS partnership, patient records were shared between the organisations without the informed consent of anyone involved, and it took months for the public to learn of the issue. To this day the data is stored and structured on DeepMind servers owned by Google, and there is still no information on how it might be used beyond the development of the Streams app.

Legal Implications for DeepMind and the NHS

Once it was revealed that patient records had been handed over to DeepMind Health, the Information Commissioner’s Office (ICO) launched an investigation. In its final report, the ICO placed the blame on the Royal Free Trust rather than DeepMind, since the trust remained in control of the data throughout the partnership. Under UK data protection rules, it was the Royal Free Trust’s responsibility to obtain consent and to be transparent with the affected patients about the transfer. The ICO’s report detailed the four data protection principles the trust had breached, including unfairness, the unlawful sharing of information, and a failure to prove the transfer was necessary.

The Royal Free Trust was required to sign an undertaking that any further partnership with DeepMind Health or other technology firms, whether to develop applications like Streams or for other purposes, would be handled with far more care. DeepMind Health faced no disciplinary action, but its leaders promised to be more attentive to privacy issues going forward. While the actions prompted by the ICO’s investigation show an effort to safeguard the privacy of patient data in the UK, the greater concern is that confidentiality and informed consent were disregarded in the first place.

A representative from a medical negligence law firm in the UK speaks to the significance of the Royal Free Trust’s actions in its partnership with DeepMind Health. While apps like Streams hold great promise for staving off serious medical problems and the harm patients can suffer through delayed diagnosis or untimely treatment, patient privacy should and must remain at the forefront of healthcare progress. Innovation cannot come at the cost of eroding privacy rights, which are fundamental protections for patients as consumers of healthcare.

It is clear that DeepMind Health and the Royal Free Trust made a glaring misstep in transferring the identifiable data of more than a million patients, however well intentioned the data sharing was. Looking ahead, technology firms and healthcare providers alike will need to take greater precautions in how information is gathered, stored, structured, and shared for research and technology development. Measures such as inviting public comment before any movement of data, or rendering patient records unidentifiable, can protect both the individuals whose records are involved and the entities handling them.
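The second of those measures, rendering records unidentifiable, can be sketched in outline. Below is a minimal, illustrative example of keyed pseudonymisation in Python; the field names and key handling are hypothetical, and genuine de-identification of health data must go much further, since quasi-identifiers such as dates, postcodes, and rare diagnoses can still single a patient out.

```python
import hashlib
import hmac

# Illustrative only: field names and key handling are hypothetical. Genuine
# de-identification must also address quasi-identifiers (dates, postcodes,
# rare diagnoses) that can still single a patient out.

SECRET_KEY = b"held-by-the-data-controller-not-the-recipient"
DIRECT_IDENTIFIERS = {"name", "address", "date_of_birth", "nhs_number"}

def pseudonymize(identifier: str) -> str:
    """Replace an identifier with a stable, non-reversible token (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

def prepare_for_sharing(record: dict) -> dict:
    """Drop direct identifiers and replace the NHS number with a token,
    so records can still be linked without naming the patient."""
    shared = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    shared["patient_token"] = pseudonymize(record["nhs_number"])
    return shared
```

It is worth noting that tokenised data of this kind can still count as personal data under UK law while a re-identification key exists, which is why regulators generally treat pseudonymisation as a safeguard rather than full anonymisation.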

Melissa Thompson

Melissa is a mother of 2, lives in Utah, and writes for a multitude of sites. She is currently the EIC of HarcourtHealth.com and writes about health, wellness, and business topics.