Public sector personal data sharing: framework and principles

This report considers frameworks and practices of providing access to personal data by public sector organisations to private organisations.


In this report, we look in detail at frameworks and practices for providing private organisations with access to personal data held by public sector organisations. Currently this practice is extremely rare, as it involves considerable legal and ethical risks, including damage to public trust in the public sector. Data protection laws, such as the GDPR, serve to balance knowledge exchange and innovation with the protection of personal data, and it is important that one is not abandoned in pursuit of the other.

This report focuses on data held within the Scottish public sector and therefore on the pathways for data sharing that are permissible under the GDPR. The GDPR requires that either 1) consent be obtained from the data subject to allow the processing of their data for new secondary purposes, or 2) the data controller find another legal basis through which to justify the reuse of the data. In this report, we outline two broad pathways adopted for facilitating the sharing of personal data by the public sector with the private sector: first, and most common, data sharing agreements; and second, additional legislation governing data sharing. We also discuss a third pathway being developed for AI applications.

Across the UK, Europe and other countries around the world, the most common pathway for sharing personal data involves identifying a public interest and subsequently drawing up a data sharing agreement. This pathway is currently used predominantly to facilitate the sharing of personal data held by the public sector with accredited research organisations. For instance, in the UK, the Office for National Statistics Secure Research Service provides accredited researchers with access to anonymised unpublished data, and in Scotland, access to health data is managed through the NHS Scotland Data Safe Haven. At present, there are very few examples of personal data held in the public sector being shared with the private sector, and where this has been done, or is being proposed, the pathway used to enable it is the same as that currently used for giving access to researchers (some examples of these public-private agreements are given in section 4b).

The second pathway, additional legislation, is often needed to further facilitate or restrict the sharing of personal data, given the legal requirements outlined in the GDPR and the DPA 2018. The Serious Crime Act 2007, for instance, permits the disclosure of personal data and sensitive data by public authorities to specified anti-fraud organisations for the prevention of fraud. The Commissioners for Revenue and Customs Act 2005 restricts the sharing of data held by HMRC with other parties. Outside the UK, Finland's Act on the Secondary Use of Health and Social Data provides a separate legal framework for reusing health and social data, amending the pathway for sharing personal data held by the public sector. A separate permit authority, Findata, will be set up to provide a centralised system for issuing data requests and permits, rather than requiring sharing agreements with each data controller (as is the case in the UK). This Finnish framework is still at a relatively early stage, with permits so far issued only for the secondary use of healthcare data.

Finally, the application of AI is creating new demands for larger-scale data sharing, and as such is creating demand for an alternative, third pathway for data sharing. Much of the legislation governing the use and application of AI is still in the early stages. Where AI technology has been used for the analysis of big data in healthcare, data has either been accessed through an individual data sharing agreement (as in DeepMind's access to eye scan data) or on the basis of consent obtained from the individual whose data is being accessed. New draft legislation from the EU may provide scope for an alternative model. The proposed AI Act includes provisions for the development of AI regulatory sandboxes, setting out terms for the re-use of personal data within a sandbox. The AI Act is still under discussion by European member states; nonetheless, it represents a potential separate pathway for the re-use of personal data in AI applications.

While personal data held by the public sector may have the potential to generate great insights, there remain technical barriers to data sharing, for example where data are not harmonised across agencies, and legal barriers, especially where actions are governed by multiple areas of legislation (as is the case with health data). Much of the public around the world remains concerned about the use of their personal data by public authorities and about private sector access to those data. Existing pathways for data sharing with researchers – and, by implication, with other parties – can be improved by: creating shared data standards and protocols across agencies; demonstrating public value and involving the public in the design of infrastructure and data sharing models; marketing the value of data sharing to immediate stakeholders and users; developing a central resource that facilitates data sharing and makes these procedures transparent; and sharing ethical standards and best practices internationally.


