It is a fundamental principle of UK and EU data protection law that any processing of personal data must be fair and lawful. As health data benefits from enhanced protections afforded to particularly sensitive categories of data, what counts as 'fair and lawful' requires careful balancing of a number of factors. These include a high standard of transparency, openness, and acting within the expectations of individuals.
These requirements have been thrown into sharper relief by the huge impact of the pandemic on the way the entire health sector looks at data. The focus is now, more than ever, on the potential for artificial intelligence and computer modelling to serve public health goals by leveraging big data and drawing links between datasets.
For the private sector, this offers a real opportunity to generate commercial value from understanding and profiling individual behaviour, using new data linkages for research and development, and gathering unique insights using large populations. For the public sector, it provides the alluring potential to develop more digitised and joined-up services for the benefit of public health, and to inform new policy measures based on expansive data, rather than a narrow focus on individual care.
We are at a fascinating nexus of technology and health data, with "developers" and "innovators" now being mentioned in the same breath as "service users" and "practitioners". In this context, it's easy to see sweeping uses of digital health data as 'fair game' for the purposes of public health, technology advancement and innovation.
In the middle of all this, the NHS is sitting on an unparalleled health dataset that has enormous potential value for generating commercial and public innovation and insights.
So how do we maintain fair and lawful data processing and observe the rights of individuals in such an environment?
Ensuring fairness is, at its core, a matter of following EU and UK legal principles that have been present in data protection law for decades.
The UK GDPR refines and updates the concept to be more relevant to the digital age, but the core strand remains: "personal data shall be processed lawfully, fairly and in a transparent manner in relation to individuals" (Article 5(1)).
In a nutshell, for processing of health data to meet this principle it should only be handled in ways that people would reasonably expect and, crucially, not used in ways that have unjustified adverse effects.
This means processing transparently, within people's reasonable expectations, and without unjustified adverse effects. While these requirements may seem familiar and well-established by now, historically practices have often fallen short.
Several major pre-pandemic issues affected public trust in the fairness and lawfulness of the processing of health data.
Care.data
Readers might recall the ignoble launch of care.data, an NHS England initiative launched in 2013 that aimed to centralise patient health and social care data. There was a failure of upfront consultation and transparency with both GPs and patients, and revelations of questionable onward sharing of health data with insurance companies and consultancy groups. The plans were thrown into an escalating series of crises until the entire project eventually stalled.
Questions were also asked around the efficacy of an 'opt-out' process, and the ability to protect and maintain anonymity of patient data and the possibilities of re-identification. More recently, similar issues cropped up around the contested implementation and the stalling of the National Data Opt-Out (care.data's successor), and indeed it is hard to see such concerns abating any time soon as datasets grow ever more prevalent and analysis techniques become more and more sophisticated.
DeepMind
More recently, there were widely reported failures of fairness in the 2016 engagement by an NHS Trust of the Google DeepMind service. This involved the Trust passing medical information to DeepMind for the development of a clinician support app. However, this sharing was carried out without proper consultation with, or notification to, the individuals concerned. The sense was that patients would not reasonably expect their data to be used in this way, and the issue highlighted public concerns around large-scale access to, and use of, private health data by technology companies.
After the Trust was sanctioned by the ICO, the clinician-support app was eventually scrapped. The concerns highlighted by the Trust's use of DeepMind persist today, with privacy specialists closely scrutinising the NHS COVID contracts with AI/tech firms such as Faculty and Palantir.
When the pandemic hit, the public broadly backed the idea of urgent and necessary processing by health authorities where there was a direct public health benefit. This period saw the creation of the COVID-19 National Data Store, a government database of health information that could be used to track hospitalisations and the availability of critical care beds and ventilators.
The dataset required the initial processing of patient-identifiable clinical information before anonymisation and upload, but clear health-focused rationales and legal bases were developed and communicated, with comment from the regulator and the publication of third-party contracts. On that basis, the initiative went ahead, albeit under a great deal of scrutiny, with an understanding that this was a fair way of processing data in a difficult context.
While the ICO took a pragmatic approach during the pandemic, that did not mean it was prepared to overlook non-compliance. In March 2022, the ICO issued a reprimand to the Scottish government and NHS National Services Scotland relating to GDPR failings in relation to sensitive health data used by the NHS Scotland COVID Status app. For a brief period, the app sought consent from users despite the fact that the processing was not predicated on consent. The ICO said this breached the fairness principle by suggesting that users had a greater level of control over their data than was the case.
With the impressive progress of technology and the urgent necessities of the pandemic response, it may be all too easy to forget the lessons of the past and to push for health services to collect and share data as widely and as freely as possible.
Indeed, the UK government is looking ahead and consulting on digital reforms "to unleash the unlimited potential of data in health and care" and to create a new "duty to share" to make data sharing the norm across healthcare services, rather than operating in protective silos (with the accompanying inefficiencies). In March 2022, the Secretary of State for Health and Social Care, Sajid Javid, made a speech setting out an agenda for technological innovations in the UK's healthcare system. These include expanding the rollout of electronic patient records in NHS trusts up to 90% by 2023, and more widespread adoption of the NHS app, up to 75% of adults by March 2024. The government is expected to publish a digital health plan later this year. Among other things, this will cover the use of NHS data to drive innovation.
However, as the urgency subsides and the health system regains the time and resources to devote to patient privacy, we are already seeing patients and GPs call for fairness and lawfulness to return to the forefront when it comes to health data.
The General Practice Data for Planning and Research programme (akin to care.data, involving the extraction of GP patient data for research and planning) has been criticised to the point where the initiative has now stalled. Familiar concerns have been raised regarding the possibilities of data being sold to commercial parties, risk of re-identification, and the fact that patients were not being given enough time to become aware of an opt-out option.
Now that the pressing needs of the pandemic seem to be subsiding, it appears there is still some way to go before public confidence is restored.
In broad terms, data is certainly not processed lawfully, and is unlikely to be processed fairly, if it is processed in breach of any legal requirements. While we're not proposing to run through the entire set of requirements here, there are key practical steps controllers of health data can take to help ensure that the processing of health data meets the UK GDPR requirement to be fair and lawful.
The UK government has proposed measures for added clarity in data protection legislation – "to ensure that our laws keep pace with the development of cutting-edge data-driven technologies" (Data: a new direction). Clearly, following the laws and principles that establish protections for personal data will be the ultimate ingredient in making sure processing is lawful – but it remains to be seen whether the UK can make changes while preserving what we (and our EU neighbours) understand to be fair.