
Barriers


Our research question was: “What are the barriers and enablers for patients and the public participating in sharing personal healthcare data?”

This section of our report looks at the barriers, which we have broken down into:

  • public understanding
  • public engagement
  • choice and control
  • demographic differences

Public understanding

Many studies have made the case for education and awareness-raising among the general public to address their anxieties about the use of healthcare data.

In general, public understanding is poor – Ipsos Mori, for example, have made the point that:

“When it comes to patient awareness and understanding of health data in the UK, the public in the report often knew little about some key areas:

  • Not aware of the current range of uses of health data, beyond in their own care
  • Underestimate the amount of data currently collected and used in healthcare
  • Do not understand why the NHS would need/want to allow commercial access to data, do not know how the commercial sector contributes to healthcare currently
  • Little understanding of the status quo when it comes to safeguards (some that participants want are already in place)
  • Confusion about the specifics of data and data science in general (such as the difference between anonymised versus identifiable data, or definition of aggregate data).”

This is not new. As long ago as 2014, local Healthwatch were commenting on public reactions to the Care.data initiative, with statements such as: “The debate showed that public awareness of the ‘pros and cons’ of the scheme is limited. This is of obvious concern, as the scheme relies on people having sufficient information to allow them to exercise their choice to opt out of sharing their data, or indeed to agree to the default opt-in position. Many people do not feel sufficiently informed to exercise this choice…”. The report went on to say that “NHS England has much more work to do in terms of clarifying and explaining the complexities of care.data — both in principle and practice”.

There is a legal angle on this: “The law requires ‘fair processing’ – patients must be informed of the uses of their data but sometimes they are not. There is a lack of awareness about how patient data is used, or by whom, and that patients can opt out. People who might otherwise be willing to share information may be less willing to do so if they are unable to either give permission or be informed and able to opt out” 33.

Equally important is the commercial angle. In 2017, Dame Fiona Caldicott said “One of the things that worries members of the public is what use their data might be put to that involves making a profit for somebody other than the health service” 34.

Caldicott went on to state “We have quite a lot of education to do, not least with the professions that look after patients and with the public themselves, in explaining the benefits of this and giving reassurance that it is not going to be profit for companies they do not feel comfortable having access to their data, and making absolutely clear that this is safeguarded through anonymisation and that it comes back into the national or public good” 35.

Such conclusions have been echoed by the King’s Fund and others: “People generally have relatively little knowledge about how the NHS and commercial organisations use data for health research, which may be responsible for mistrust in some cases. Transparent public dialogue is needed about how data is currently used; what the opportunities are for the future; and how risks can be mitigated” 36.

Ethical considerations of data sharing were also addressed via a paper which stated “… there was generally found to be low public awareness of current research practices and in particular, of current governance or ethics processes. As such, in a number of studies it was reported that public acceptance increased after participants were informed about existing safeguards and governance mechanisms” 37.

Technical issues also come into play with regard to public understanding. For example, “The way that personally identifiable data could be translated into depersonalised and aggregate data was not understood… Some struggled to understand how aggregated datasets could give any useful learning about individuals” 38.
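The distinction that participants struggled with can be illustrated with a toy sketch. All records, field names and values below are invented for illustration only; this is not a description of any NHS process, just a minimal example of how pseudonymisation and aggregation differ from identifiable data:

```python
import hashlib
from collections import Counter

# Invented toy records standing in for patient data (illustrative only).
records = [
    {"nhs_number": "000 000 0001", "age": 34, "condition": "asthma"},
    {"nhs_number": "000 000 0002", "age": 67, "condition": "asthma"},
    {"nhs_number": "000 000 0003", "age": 71, "condition": "diabetes"},
]

def pseudonymise(record, salt="local-secret"):
    """Replace the direct identifier with a salted hash. The record is no
    longer directly identifiable, but the same patient always maps to the
    same token, so records can still be linked across datasets."""
    token = hashlib.sha256((salt + record["nhs_number"]).encode()).hexdigest()[:12]
    return {"token": token, "age": record["age"], "condition": record["condition"]}

def aggregate(records):
    """Collapse individual rows into counts per (age band, condition).
    No row in the output refers to an individual, which is why aggregate
    data can support service planning but not learning about individuals."""
    counts = Counter()
    for r in records:
        band = "65+" if r["age"] >= 65 else "under 65"
        counts[(band, r["condition"])] += 1
    return dict(counts)

pseudo = [pseudonymise(r) for r in records]   # depersonalised but linkable
summary = aggregate(records)                  # no individual-level rows at all
print(summary)
```

The sketch makes the point in the quoted study concrete: the pseudonymised rows can still be linked back together (and, with enough auxiliary data, potentially re-identified), whereas the aggregated counts contain no individual rows at all.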

33 Citizens’ Juries and Jefferson Centre, 2018. Use of Free-text Health Data. A report of a citizens’ jury designed to explore when and how free-text data in patient records should be used.

34+35 House of Lords, 2017. Select Committee on Artificial Intelligence, Evidence session no. 14, Questions 128-142.

36 Castle-Clarke, S., 2018. What will new technology mean for the NHS and its patients? Four big technological trends. The Health Foundation, the Institute for Fiscal Studies, The King’s Fund and the Nuffield Trust.

37 Aitken, M. et al., 2016. Public responses to the sharing and linkage of health data for research purposes: a systematic review and thematic synthesis of qualitative studies. BMC Medical Ethics.

38 Castell, S. et al., 2018. Future data-driven technologies and the implications for use of patient data. Dialogue with public, patients and healthcare professionals. Ipsos Mori.


Public engagement

The studies above point to a need for better public education and dialogue. Clearly this depends on public engagement – but past experience has revealed some limitations with existing practice.

Some barriers arise from poor planning. At the time of Care.data, one local Healthwatch reported that “Unfortunately the pathfinder programme has been hampered by delays and lack of materials (such as the letter and leaflets) to use in the engagement of local groups. This has undermined the engagement’s credibility and effectiveness thus far. These delays… have led to real difficulty for us locally as we have had to change engagement plans, cancel public meetings and contact groups and stakeholders to change information that we originally gave them” 39.

Another found that “A key theme raised at the outset was the question of public awareness of care.data and, specifically, the information leaflet and the video which were used by NHS England to inform the public… The leaflet sent out by NHS England was criticised for being sent out alongside junk mail”. It added that “NHS England acknowledged that these issues had been raised frequently at public listening events, and that NHS England is aware it has not informed the public as much as it should have done” 40.

Six years later, the Ada Lovelace Institute identified similar issues: “As trials of the UK contact tracing app and muted success of other apps around the world have shown, failing to engage with the public can lead to vital gaps in understanding of what determines the successful roll-out of a data-driven health tool. To help address these gaps, deeper engagement with informed publics is needed”.

Inappropriate language can also get in the way. One paper stated that “The current language landscape around the use of patient data in care, treatment and research is difficult, complex and confusing. And current attempts to come up with alternatives have fallen short. This acts as a significant barrier to having open discussions with the public about the use of data in ways that can build both understanding and trust” 41.

Another stated that “Understanding is made more difficult by the complexity of the subject, and the unfamiliarity of the language. For example, participants struggled to understand the difference between anonymous and pseudonymous data” 42.

One local Healthwatch, looking at efforts to collect data from GP records, found that “Several groups felt that the introductory page was unwieldy, containing too much information which made it unappealing to read or difficult to digest. In terms of presentation, one person fed back that their first impression was that the form currently “does not look like one that you can trust”” 43.

Any form of public engagement nowadays needs to take account of the influence of misinformation. A study looking at why people chose not to use the NHS COVID-19 app found that “Reasons included:

  • the (false) perception that the NHS COVID-19 app was run by a distrusted private company (Serco) rather than by the NHS
  • lack of trust in government competence and public health response overall
  • concern about being monitored
  • phones’ vulnerability to hacking could be increased when their Bluetooth was switched on”.

Finally, it should perhaps go without saying that public engagement should be inclusive. There was very little in the literature about barriers and enablers for specific groups within society in respect of engagement on healthcare data – however, Healthwatch England have made a useful point about public engagement and carers: “Carers are often ‘forgotten’ because they are caring for someone and have also neglected the need to ‘opt-out’ if necessary” 44.

39 Healthwatch Hampshire, 2015. care.data West Hampshire Pathfinder Engagement Report.

40 Healthwatch Essex 2014. Care.data: the debate. Summary report.

41 Good Business, 2017. Patient Data. Finding the best set of words to use.

42 Britain Thinks, 2015. Secondary Uses of Healthcare Data Public Workshop Debrief.

43 Healthwatch Lambeth and NHS Lambeth Clinical Commissioning Group, 2016. Lambeth DataNet: Individual Patient Registration Profile Community Consultation.

44 Healthwatch England, 2015. Written evidence on Public Attitudes on Consent and Data Security for the National Data Guardian Review Team.


Choice and control

It may come as no surprise that “Perceived autonomy, or individual control over how data is used, was found to be a key factor shaping public responses”. Indeed, “Members of the public value having control over their own data. Participants explicitly referred to control over their own data in terms of individual or human rights. There was an evident link between levels of trust (in research organisations or data controllers) and desired level of individual control” 45.

A key driver for concerns over choice and control is the question of data privacy. This has been recognised by the Department for Health and Social Care: “The primary reason for not downloading the NHS COVID-19 app were concerns around privacy and not wanting to use the app”. Their report recognised that “building trust in data privacy was the biggest hurdle in NHS COVID-19 app adoption” 46.

Worries about privacy have been noted elsewhere: “members of the public value opt-in consent and can perceive research without consent as an invasion of privacy” 47. Such concerns influenced reactions to the Care.data scheme: “The chosen opt-out system was seen by some members of the public as unethical. The default system, which means that data will be extracted unless a patient explicitly rejects, caused much concern. The opt out system relies on everyone knowing about the scheme, being able to make an informed decision and being given the choice to opt-out if they so wish” 48.

For some people, privacy concerns are directly related to their own health conditions. For example, “When it comes to the use of social care services, mental health status and sexual health status, a low number of respondents reported being happy to share this information with health professionals and NHS services (39%, 38% and 34% respectively)” 49. Furthermore, “Minors & vulnerable people have special considerations for permission and consent” 50, and “Sharing sensitive data (mental health, sexual health, sexuality, religion) holds greater concern than for other types of data” 51.

Third parties’ motives for wanting access to data are another cause for concern. A Healthwatch survey on sharing of medical records found that “The most common area of concern with respect to record sharing was access by third parties”.

Commercial companies were a common focus of anxieties about third party use of data: “The fear of data being misused by companies for their own gain was very apparent in all groups. Concerns were mainly around insurance companies obtaining health information which may affect their premiums or cover, or companies using the information for targeted advertising” 52. Insurance companies were also mentioned in a Healthwatch study: “Fear that personal data will be shared with private companies (especially insurance companies)” 53.

Concerns about profiteering are also a factor: “Public trust is undermined when a data access partner is seen to profit excessively from realising the potential from NHS patients’ and NHS operation data, and/or the expected social value outcomes do not emerge from the agreement” 54.

Alongside data privacy, questions of data security can affect people’s perceptions of how much control they have over their own data: “85% of people in one survey knew about the Wannacry hacking scandal and 53% said their confidence in the ability of the NHS to handle data was negatively affected” 55.

Concerns about data security are as much about human error as about the strength of IT systems: “… concerns related to the fallibility of IT systems to protect against breaches as well as to human error. Media reports of “laptops left on trains” or misplaced data were widely called upon to illustrate this latter point” 56.

The point about “media reports” ties in with an observation that “The media typically only analyse the process of using health data when something goes wrong, such as a data breach” 57. And a citizens’ jury exercise found that “Some jurors mentioned that news stories heard in the press about privacy and confidentiality breaches could act as a deterrent to wider public support for data access” 58.

Concerns over choice and control could perhaps be mitigated by effective regulation. But that might depend on the extent to which regulation is seen as keeping pace with technological developments. One study notes that the “need for [regulatory] clarity is becoming ever more pertinent with the speed of technological developments, and consumer healthcare apps using AI... Following complaints about Babylon’s symptom checker the Care Quality Commission (CQC) referred them to the Medicine and Healthcare products Regulatory Agency (MHRA). However, five public bodies were responsible for assessing the impact and safety of Babylon’s products (CQC, MHRA, NHS Digital, NHS England, and Hammersmith and Fulham CCG), demonstrating the lack of clarity that currently exists in the oversight of technology products such as chatbots. The report states that “This situation does not provide the general public the clarity that is required” 59.

45 Aitken, M. et al., 2016. Public responses to the sharing and linkage of health data for research purposes: a systematic review and thematic synthesis of qualitative studies. BMC Medical Ethics.

46 Department for Health and Social Care, 2021. NHS COVID-19 app: early adopter evaluation report NHS Test and Trace programme.

47 Hill, E. et al., 2013. “Let’s get the best quality research we can”: public awareness and acceptance of consent to use existing data in health research: a systematic review and qualitative study. BMC Medical Research Methodology.

48 Healthwatch Essex 2014. Care.data: the debate. Summary report.

49+50 Healthwatch England, 2015. Written evidence on Public Attitudes on Consent and Data Security for the National Data Guardian Review Team.

51 Understanding Patient Data, 2018. Public attitudes to patient data use. A summary of existing research.

52 Hill, E. et al., 2013. “Let’s get the best quality research we can”: public awareness and acceptance of consent to use existing data in health research: a systematic review and qualitative study. BMC Medical Research Methodology.

53 Healthwatch England, 2015. Written evidence on Public Attitudes on Consent and Data Security for the National Data Guardian Review Team.

54 Hopkins, H. et al. 2020. Foundations of fairness: views on uses of NHS patients’ data and NHS operational data. A mixed methods public engagement programme with integrated Citizens’ Juries. Hopkins Van Mil.

55 Understanding Patient Data, 2018. Public attitudes to patient data use. A summary of existing research.

56 Aitken, M. et al., 2016. Public responses to the sharing and linkage of health data for research purposes: a systematic review and thematic synthesis of qualitative studies. BMC Medical Ethics.

57 Understanding Patient Data, 2021. Analysis of UK reporting on health data.

58 Hopkins, H. et al., 2020. Foundations of fairness: views on uses of NHS patients’ data and NHS operational data. A mixed methods public engagement programme with integrated Citizens’ Juries. Hopkins Van Mil.

59 All Party Parliamentary Group on Heart and Circulatory Diseases, 2019. Putting patients at the heart of artificial intelligence.


Demographic differences

We should not assume that all groups within society have the same attitudes towards sharing healthcare data. Evidence suggests that there may be some differences, as follows.

There may be some differences between women and men in attitudes towards data sharing. One study found that “From the quantitative literature, males and those who were older seemed more likely to consent to a review of their medical records”. However, it adds the caveat that “this was not confirmed in a meta-analysis of 17 international studies” 60.

An Ipsos Mori study asked 2,000 people how important it was that the NHS treats a patient’s medical records as confidential, and found that “Women are more likely than men to say this is very important (89% vs 85%)”.

There might also be some age differences in data sharing acceptance, although conclusions are again not unanimous. The Ipsos Mori report cited above states that “people aged 65 and over are less likely to say the NHS’s treatment of patients medical records as confidential is important (95% vs 98% overall). Those aged 35-44 are the most likely of any age group to say it is very important (91% vs 87% overall)”. Furthermore, “Older people were generally more willing to have their records shared between the professionals involved in their care. In focus groups with over 65s there were comments from older people about the difficulty of remembering all the relevant details of their health history”.

Conversely, evaluation of early adopters of the NHS COVID-19 app found that “16-24 year olds were significantly more likely than older age groups to be confident that their data would be handled securely”. It has also been stated that “Younger people are much more familiar with the concept of big data – and the technology that generates it – than older people. This means that they were much better able to imagine the benefits that secondary uses of healthcare data might bring” 61.

Yet another study asked respondents whether they would support their health data being accessed by commercial organisations if they are undertaking health research. It found that “There was not a strong level of support across any of the age groups”, but then added that “older age groups were much more resistant” 62.

A third factor is social status. In this respect, one study found that “ABC1s were more likely than C2DEs to view the use of health data as having a potential benefit to society, in the fields of research, disease prevention, planning of services, crime prevention and so on”. A further observation was of “C2DEs feeling more powerless to deal with consequences, e.g. arguing their case if their identity were stolen”. According to this report, “Any linking resulting in the individual being targeted with specific messages prompts discomfort and resistance. The expectation is that blame and desired behaviour change will be implicit… The lower socio-economic classes can feel particularly defensive” 63.

60 Hill, E. et al., 2013. “Let’s get the best quality research we can”: public awareness and acceptance of consent to use existing data in health research: a systematic review and qualitative study. BMC Medical Research Methodology.

61 Britain Thinks, 2015. Secondary Uses of Healthcare Data Public Workshop Debrief.

62 Castle-Clarke, S., 2018. What will new technology mean for the NHS and its patients? Four big technological trends. The Health Foundation, the Institute for Fiscal Studies, The King’s Fund and the Nuffield Trust.

63 Wellcome Trust, 2013. Summary Report of Qualitative Research into Public Attitudes to Personal Data and Linking Personal Data.


Last edited: 23 May 2023 2:01 pm