Public sector personal data sharing: framework and principles

This report considers the frameworks and practices through which public sector organisations provide private organisations with access to personal data.


4. Issues and Barriers to Personal Data Sharing

As the last two of these pathways are yet to be implemented at scale, there is limited public evaluation of these processes. Given that the proposed pathways for public-to-private sector sharing would broadly follow the model currently used for the re-use of data by academic research, a short summary of the issues identified in this sector is provided below.

Technical, Legal and Economic Barriers

While personal data held by the public sector may have the potential to generate valuable insights, there remain technical barriers to data sharing. For example, data held by different public bodies is often not harmonised, and the scope for analysis can be limited by the quality of the underlying data, such as when it lacks metadata and common standards (van Panhuis 2014), while different public bodies may have different security requirements (UK Centre for Data Ethics and Innovation 2020).

Alongside this, there is often considerable confusion and uncertainty over the legal frameworks, both from the general public and from those involved in the negotiation of data sharing agreements, especially where actions are governed by multiple areas of legislation (Big Data Value Association 2019, NHS Digital 2006, Savirimuthu 2021).

In Finland, the government has addressed some of these technical and legal challenges through Findata, which manages the technical needs and data context. Findata told us, for instance, that they worked with data controllers to identify local, national and cross-agency data standards, which they then worked to harmonise.

A final concern is economic barriers, should data sharing require payment to the data controller or other agencies. Our Findata interviewee expressed concern that costs set by perceived value and demand could limit access for certain stakeholders, such as the public sector, third sector and universities.

Organisational Transparency and Cultural Barriers

The re-use of personal data will usually require approval from an ethics council, public benefit panel or similar internal assessment structure. In some cases, these assessment structures may not be required to provide feedback on the application, and the reasoning behind a decision may not be made available to those requesting the data, as described by Adams & Allen (2014) regarding requests made in Australia:

Government data custodians are ultimately responsible for the confidentiality and security of the health information datasets they hold and for making the final decision on whether or not to release such information. Like Human Research Ethics Committee's (HRECs), government data custodians consider the risks and benefits associated with disclosing the information but, unlike HRECs, they are not required to provide reasons for rejecting a proposal. These decisions go largely unscrutinised by any independent authority and unchallenged by researchers. (p. 958)

This situation may be further complicated by different bodies requiring different approval processes (Cavallaro 2020) or having varying willingness to share data, even when it is legal to do so. This hesitancy can be due to organisational constraints, such as the cost or resources needed to enable sharing, or wider cultural factors, such as concerns over the impact on reputation and potential damage to the public authority (Laurie & Stevens 2016) and the different attitudes to risk held by the public and private sectors (Mikhaylov et al 2018). The Population Research Network (PRN, Australia), for instance, cited the issue of conservative data custodians and controllers who, if in doubt, say "no" to releasing data. PRN said it is common for custodians and controllers to feel very unsure whether they should or could release data, and that there are widespread assumptions that private sector organisations are not allowed to access public data at all.

To address such issues of transparency, Findata created a live website that lists all applicants, the purpose of each application, the decision (approved or rejected), any costs to access, the time limit of approved permits, and the outputs and outcomes of each project. Our Findata interviewee also recommended early, strategic communication with the stakeholders who would use the service, to set out its purpose and value to research and innovation up front. Our interviewee from the Population Research Network (Australia) also recommended creating concrete policies and acts within government that give political accountability for the transparency and availability of data in the public interest.

Public Trust

Organisational reluctance to share data can also be linked to wider issues of public trust. Evidence from surveys continues to suggest that much of the public around the world remains concerned about the use of their personal data by public authorities (The Paypers 2020), and about private sector access to data (Scottish Government 2013, Ipsos MORI Social Research Institute 2016, Street 2021, Biddle 2018). In the UK, the DeepMind and Royal Free London NHS Foundation Trust controversy[8] and the failed care.data project[9] serve as reminders of both the risks of data sharing and the difficulties of successfully communicating and maintaining public trust in sharing personal data (Godlee 2016).

A notable exception to this can be seen in results from Finland, where public trust in others and in government is high. In 2019, data from the OECD found that 64% of the Finnish population reported trusting the government, noticeably higher than the OECD average of 45% (OECD 2021). While levels of trust varied by institution, trust in the civil service and national government was high. Other surveys have also previously shown high trust in public social and health services (Sitra 2016), and that a higher proportion of people than in other European countries would be willing to share information about their health if it were used for scientific research (Vanska & Halenius 2019). While public trust in data sharing remains hard to measure, these statistics give some insight into why Finland has been able to implement the additional legislation highlighted in Section 3.

High levels of public trust should not, on their own, be used as justification for data sharing, however, as highlighted by comments made in the ethics advisory report for the National Data Analytics Solution, which stated:

"people's 'comfort' with something should not be considered to be weighty evidence as far as the ethics of data analytics is concerned or, indeed, in the light of a recognition that the legal rights of individuals (most of whom are likely to be innocent) are involved." (The Alan Turing Institute 2017, p. 15).

At a technical level, confidence and privacy can be maintained through appropriate privacy enhancing technologies, such as federated learning, which lets an authority train machine learning models without needing to store individuals' data in a central location. According to the UK government's Centre for Data Ethics and Innovation, the lack of public trust could also be overcome in part by giving citizens more agency to control the data held about them and by making clear when it is appropriate and in the public interest to share data, especially when it is shared without consent. A CDEI report on 'Addressing Trust in Public Sector Data Use' calls for 'clear principles for determining what constitutes public interest in the sharing and use of data' and 'clear rules on protecting privacy even when sharing is deemed in the public interest' (UK Centre for Data Ethics and Innovation 2020, n.p.).
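
To make the idea of federated learning more concrete, the sketch below shows federated averaging in a toy setting: each data holder trains a model on its own records locally, and only the resulting model weights are pooled. This is a minimal illustration using a hypothetical linear model and simulated data, not a description of any system discussed in this report; real deployments would use established frameworks and additional safeguards such as secure aggregation.

```python
# Illustrative sketch of federated averaging (FedAvg). All names, the linear
# model and the simulated data are hypothetical; they do not describe any
# system referenced in this report.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """Train a linear model on one data holder's records; the raw data never leaves its site."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

def federated_average(global_weights, local_datasets):
    """Each participant trains locally; only model weights are shared and averaged."""
    updates = [local_update(global_weights, X, y) for X, y in local_datasets]
    sizes = np.array([len(y) for _, y in local_datasets], dtype=float)
    return np.average(updates, axis=0, weights=sizes)  # weight by dataset size

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_w = np.array([2.0, -1.0])
    # Simulate three bodies, each holding its own records locally.
    datasets = []
    for _ in range(3):
        X = rng.normal(size=(100, 2))
        y = X @ true_w + rng.normal(scale=0.1, size=100)
        datasets.append((X, y))
    w = np.zeros(2)
    for _ in range(20):  # repeated rounds of local training plus averaging
        w = federated_average(w, datasets)
    print("learned weights:", w)  # approaches true_w without pooling the raw data
```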

One non-standard approach that can address public trust is consent-based models of data sharing (Shahaab & Khan 2020). However, we found no evidence that this approach is currently used to facilitate the sharing of data with private organisations. Another example of consent-based sharing is the use of personal data stores and other personal data service companies, which enable individuals to manage their own personal data in secure ways.[10] At present these intermediaries offer specialised services and are not a standard approach for sharing personal data held by the public sector.
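
For illustration, the sketch below shows one way a consent-based release check might work: before an extract is shared, records are filtered against a register of consents held by the individuals themselves. The class and field names are hypothetical and the design is a simplified assumption, not a description of any existing personal data store or intermediary.

```python
# Hypothetical consent-based release check; names and fields are illustrative only.
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    subject_id: str
    purpose: str          # e.g. "health-research"
    recipient_type: str   # e.g. "university", "private-company"
    granted: bool

class ConsentRegister:
    def __init__(self):
        self._records = []

    def record(self, consent):
        self._records.append(consent)

    def permits(self, subject_id, purpose, recipient_type):
        """Sharing is only permitted if an explicit, matching consent exists."""
        return any(
            c.granted
            and c.subject_id == subject_id
            and c.purpose == purpose
            and c.recipient_type == recipient_type
            for c in self._records
        )

def filter_for_release(rows, register, purpose, recipient_type):
    """Keep only rows whose subjects have consented to this purpose and recipient."""
    return [r for r in rows if register.permits(r["subject_id"], purpose, recipient_type)]

# Example: only the consenting subject's row is released to a private recipient.
register = ConsentRegister()
register.record(ConsentRecord("p-001", "health-research", "private-company", True))
rows = [{"subject_id": "p-001", "value": 42}, {"subject_id": "p-002", "value": 7}]
print(filter_for_release(rows, register, "health-research", "private-company"))
```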

From the literature, we find that existing pathways for data sharing with researchers – and by implication, other parties – can be improved by creating more awareness among agencies about the different legislative requirements placed on their data, streamlining and increasing public transparency around the decisions behind accepting or rejecting data sharing agreements, and exploring consent-based models of data sharing that address issues of public trust.

A large part of the interviews also discussed the need for legitimacy and trust across data sharing processes. This ranges from trust in government itself, to trust in the technical process, to trust that outcomes will be useful, valuable, safe and worth the risks. Our interviews showed that public trust is a matter of national context. In Finland, Findata came about in large part because of the government's long history of using data for the public good and being able to clearly define its value (our interviewee noted that this track record goes back to the cancer registry in 1954, which had clear public value). Nearly all interviewees (GovTech Polska, Mydex, FinTech Scotland and Findata) said data sharing processes and models should be legitimised by the public and stakeholders through a consultation at a minimum, or through more advanced methods of civic engagement such as co-design. Both GovTech Polska and Findata, for instance, carried out public consultations on data sharing infrastructures and strategies; our Findata interviewee favoured meaningful dialogue and conversation with the public over more performative or transactional consultations.

Many of our interviewees (Population Research Network (Australia), GovTech Polska and FinTech Scotland) also expressed a desire to share best ethical practices across national borders, pointing out that this would help create greater harmonisation of standards and an effective community of practice.

Contact

Email: sophie.Ilson@gov.scot