Legal frameworks and ethical standards workstream report: Final Report

Final Report of the Legal frameworks and ethical standards workstream of the Independent advisory group on emerging technologies in policing.


2. Legal Frameworks

2.1 In this part of the report we consider existing legal frameworks which relate to and intersect with policing and technology: in particular, equalities law, human rights, data protection and policing legislation. We look at the impact that police use of new technologies may have on individuals' rights, and we consider whether legislation provides appropriate and sufficient safeguards against risks to, and impacts on, the rights of individuals and communities. We also look at the impacts on rights observed in other jurisdictions where new policing technologies have emerged. We consider whether existing legislation is fit for purpose, especially as regards future developments, and whether there are any legislative gaps which need to be filled.

Relevant legal frameworks in Scotland

2.2 There are various legal frameworks relevant to policing and technology in Scotland. Here we cover some of these frameworks, which are also addressed by the research commissioned for the IAG. Others are not covered in this report: for example, policing legislation such as the Police and Fire Reform (Scotland) Act 2012.

Human rights

2.3 As part of the UK, which is a signatory to the European Convention on Human Rights (ECHR), Scottish public entities bear the primary duty to promote, protect and fulfil the human rights enumerated in that document. States have a positive obligation to protect against discrimination and promote equality. The Scottish Government should place human rights at the core of how new digital technologies are used in the criminal justice system. The ECHR has some domestic effect in the UK via the Human Rights Act 1998: section 6 of the Act makes it unlawful for a public authority to act in a way which is incompatible with a Convention right (an 'act' includes a failure to act).

2.4 Key ECHR rights engaged by the use of new technologies include, but are not limited to, the following:

  • Article 2: Right to life
  • Article 3: Freedom from torture and inhuman or degrading treatment
  • Article 4: Freedom from slavery and forced labour
  • Article 5: Right to liberty and security
  • Article 6: Right to a fair trial
  • Article 7: No punishment without law
  • Article 8: Respect for your private and family life, home and correspondence
  • Article 9: Freedom of thought, belief and religion
  • Article 10: Freedom of expression
  • Article 11: Freedom of assembly and association
  • Article 14: Protection from discrimination in respect of these rights and freedoms
  • Protocol 1, Article 3: Right to participate in free elections

2.5 Various pieces of domestic legislation are also relevant to implementing these rights, including the protections against discrimination and the Public Sector Equality Duty in the Equality Act 2010.

2.6 The impact, and the type of right affected, depend on how new technologies are designed; the purpose and context in which they are used; and the safeguards and oversight systems in place. There are clear human rights obligations that apply in this area, derived from the Human Rights Act 1998 and international human rights law, together with data protection duties and the non-discrimination duties that derive from the Equality Act 2010. There is an emerging body of human rights jurisprudence on the development and use of digital technologies and on the need for these to be undertaken within a human rights framework; this means considering cross-cutting human rights principles such as transparency, non-discrimination, accountability and respect for human dignity. It is also crucial that the private sector meets its due diligence obligations to ensure the protection of human rights. Human rights frameworks guard against the risks of misuse and mishandling, as well as providing an effective remedy.

2.7 The UK Government introduced a bill (the 'Bill of Rights Bill') to the UK Parliament in mid-2022, which aimed to repeal and replace the Human Rights Act 1998. Under the Bill, the UK would have remained a party to the European Convention on Human Rights, the rights would still have had effect in domestic law, and public authorities would still have been under a duty to act in a way compatible with them. However, various reforms would have made it more challenging for claimants to bring cases and would have altered how courts interpret legislation and Convention rights. Overall, it seems that the Bill of Rights Bill would have weakened human rights protections in the UK. The Scottish Government's policy position as of early 2022 was to oppose the UK Government's proposed reforms, on the aforementioned grounds and also given the potential impact on the devolution settlement, since under the Scotland Act 1998 compatibility with Convention rights is a condition of the Scottish Parliament passing legislation. In September 2022, the new UK Government under Liz Truss withdrew the Bill of Rights Bill.

Data protection

2.8 One key area of legislation is privacy and data protection law. The UK is a signatory to Council of Europe Convention No. 108 for the Protection of Individuals with regard to Automatic Processing of Personal Data, an international treaty on data protection. The UK also has domestic data protection legislation in the form of the Data Protection Act 2018, which implements the most recent reforms to EU law in this area, including the General Data Protection Regulation and the Law Enforcement Directive, and which at the time of writing remains in force. This means that UK law currently reflects EU standards in data protection (the retained regime is known as the 'UK GDPR').

2.9 Law enforcement authorities in Scotland are subject to UK data protection law which incorporates the UK GDPR and the Data Protection Act 2018 (DPA 2018).

2.10 Which data protection regime applies depends upon the purpose of the processing and the nature of the body that is carrying out the processing. Part 3 of the Data Protection Act 2018 applies specifically to competent authorities (or their processors) processing for criminal law enforcement purposes. The legislation defines a competent authority as:

  • a person specified in Schedule 7 of the DPA 2018; or
  • any other person if, and to the extent that, they have statutory functions to exercise public authority or public powers for the law enforcement purposes.

2.11 The Chief Constable of the Police Service of Scotland, procurators fiscal and the Crown Agent are specified in Schedule 7. Other Scottish policing bodies are competent authorities by virtue of their statutory functions.

2.12 Law enforcement purposes are defined under section 31 of the DPA 2018 as:

  • 'The prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, including the safeguarding against and the prevention of threats to public security.'

2.13 Part 3, Chapter 2 of the DPA 2018 sets out the main responsibilities for competent authorities processing personal data for law enforcement purposes.

All processing of personal data for law enforcement purposes must comply with the six data protection principles; personal data must be:

  • processed lawfully and fairly (first principle);
  • collected for specified, explicit and legitimate purposes, and not processed in a manner that is incompatible with the purpose for which it was originally collected (second principle);
  • adequate, relevant and not excessive in relation to the purpose for which it is processed (third principle);
  • accurate and, where necessary, kept up to date, with every reasonable step taken to ensure that personal data that is inaccurate, having regard to the law enforcement purpose for which it is processed, is erased or rectified without delay (fourth principle);
  • kept for no longer than is necessary for the purpose for which it is processed (fifth principle);
  • processed in a manner that ensures appropriate security of the personal data, using appropriate technical or organisational measures ('appropriate security' here includes protection against unauthorised or unlawful processing and against accidental loss, destruction or damage) (sixth principle).
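By way of illustration only, the sketch below shows what enforcing the fifth principle (storage limitation) can look like in technical terms. It is a minimal Python sketch under stated assumptions: the purposes and retention periods are invented examples, not statutory schedules, and a real system would derive them from approved retention policies.

```python
# Hypothetical sketch: records carry the purpose they were collected for, and
# a scheduled job deletes anything held beyond the retention period set for
# that purpose. Purposes and periods below are invented, illustrative values.
from datetime import datetime, timedelta, timezone

RETENTION_PERIODS = {
    "investigation": timedelta(days=365 * 3),
    "custody_image": timedelta(days=365),
}

def is_due_for_deletion(collected_at: datetime, purpose: str,
                        now: datetime | None = None) -> bool:
    """True when the record has been kept longer than necessary for its purpose."""
    now = now or datetime.now(timezone.utc)
    return now - collected_at > RETENTION_PERIODS[purpose]

# Example: a custody image collected in 2019 is well past its retention period.
record_time = datetime(2019, 1, 1, tzinfo=timezone.utc)
print(is_due_for_deletion(record_time, "custody_image"))  # True
```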

2.14 Where competent authorities (see the Information Commissioner's Office list of competent authorities) are processing personal data for general purposes (e.g. safeguarding), the UK GDPR applies. Data protection law is regulated by the Information Commissioner's Office (ICO).

2.15 In June 2022, the UK Government published its response to the 'Data: a new direction' consultation, which it had launched in September 2021, setting out its intention to reform data protection law post-Brexit. In July 2022 the UK Government introduced the Data Protection and Digital Information Bill to the UK Parliament, containing various proposed reforms to the current data protection legislative framework. Of note from the policing and law enforcement perspective were the plans to remove: the requirement on police and law enforcement to log a justification for accessing specific data records; the requirement that individuals must be informed when they have been subject to automated decision-making; and the Biometrics and Surveillance Camera Commissioners and the Surveillance Camera Code in England and Wales. The Bill would also have extended by two months the period that law enforcement agencies have to respond to access requests. However, the Bill's second reading in the House of Commons, scheduled for September 2022, was withdrawn and, at the time of writing, has not been rescheduled, so the Bill's progress is unclear.

Sensitive data

2.16 'Sensitive processing' is defined in section 35(8) of the DPA 2018 (see also the Information Commissioner's Office clarification of sensitive processing) as:

(a) the processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs or trade union membership;

(b) the processing of genetic data, or of biometric data, for the purpose of uniquely identifying an individual;

(c) the processing of data concerning health;

(d) the processing of data concerning an individual's sex life or sexual orientation.

2.17 When undertaking 'sensitive processing', in order to comply with the first principle (lawful and fair) the processing must be:

  • based on the consent of the data subject; or
  • strictly necessary for the law enforcement purpose and based on a Schedule 8 DPA 2018 condition.

2.18 In practice it is very difficult to obtain valid consent for processing personal data in a law enforcement context because of the high standards required for valid consent under data protection law. This means that in most instances competent authorities processing sensitive data must be able to demonstrate that the processing is strictly necessary and be able to satisfy one of the conditions in Schedule 8 of the DPA 2018. Strictly necessary in this context means that the processing has to relate to a pressing social need and that it cannot reasonably be achieved through less intrusive means. Competent authorities will also need to ensure that there is an appropriate policy document in place. The conditions for sensitive processing in Schedule 8 of the Act are described on the ICO website.

Automated decision making

2.19 A data subject has the right not to be subject to a decision that is:

  • based solely on automated processing; and
  • one which produces an adverse legal effect for, or significantly affects, the individual;

2.20 unless that decision is required or authorised by law.

2.21 Section 50 of the DPA 2018 sets out the legal obligations placed on competent authorities when using automated decision making. These include making sure that individuals are able to:

  • obtain human intervention;
  • express their point of view; and
  • obtain an explanation of the decision and challenge it (the obligation is to inform the data subject in writing that they have been subject to a decision based solely on automated processing).
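By way of illustration only, the following sketch shows one way a controller's system might record a solely automated decision so that these section 50 safeguards can be evidenced. The class, field names and values are hypothetical assumptions for the purposes of this report, not drawn from any real policing or Police Scotland system.

```python
# Hypothetical record structure supporting the section 50 safeguards: written
# notification, human intervention, the data subject's point of view, and an
# explanation that can be provided and challenged. Illustrative names only.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AutomatedDecisionRecord:
    subject_ref: str                     # pseudonymised reference to the data subject
    decision: str                        # outcome of the solely automated processing
    explanation: str                     # explanation to be provided on request
    made_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    notified_in_writing: bool = False    # subject told the decision was automated
    human_review_requested: bool = False
    human_review_outcome: str | None = None   # completed when a person intervenes
    subject_representations: list[str] = field(default_factory=list)

    def request_human_review(self, representations: str) -> None:
        """Record the data subject's point of view and trigger human intervention."""
        self.human_review_requested = True
        self.subject_representations.append(representations)

# Usage: create the record, notify the subject in writing, then log a challenge.
record = AutomatedDecisionRecord(
    subject_ref="REF-0001",
    decision="application refused",
    explanation="score below threshold on criteria X and Y",
)
record.notified_in_writing = True
record.request_human_review("The address data used was out of date.")
```

A record of this kind would let a controller demonstrate, for each decision, whether written notification was given and how any request for human intervention was handled.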

2.22 As mentioned earlier, the Data Protection and Digital Information Bill proposes to remove the requirement that an individual be informed when automated decision-making is used in relation to them in the policing context.

Data protection impact assessment

2.23 Controllers must carry out a data protection impact assessment (DPIA; see the ICO definition) before they process personal data where the processing is likely to result in a high risk to the rights and freedoms of individuals.

2.24 Processing that is likely to result in a high risk includes (but is not limited to): systematic and extensive processing activities, including profiling, where decisions have legal effects, or similarly significant effects, on individuals; large-scale processing of special categories of data or of personal data relating to criminal convictions or offences; and the use of new technologies (for example, surveillance systems).

2.25 Controllers must take into account the nature, scope, context and purposes of the processing when deciding whether or not it is likely to result in a high risk to individuals' rights and freedoms.

2.26 It is good practice to carry out a DPIA for all new processing. Undertaking a DPIA can help controllers identify the most effective way to comply with their data protection obligations and meet individuals' expectations of privacy. An effective DPIA allows controllers to identify and fix problems at an early stage, reducing the costs and reputational damage which might otherwise occur.

2.27 When undertaking a DPIA, the ICO recommends that controllers refer to its Overview of Data Protection Harms and the ICO's Taxonomy to help them identify possible harms that may arise from plans that are being considered.

Data protection by design and default

2.28 Under the UK GDPR and Part 3 of the Data Protection Act 2018 (section 57), controllers have a general obligation to implement appropriate technical and organisational measures to show that they have considered and integrated the principles of data protection into their processing activities.

2.29 If a controller is processing personal data for law enforcement purposes, it must implement these measures by default to ensure that it only processes personal data for a specified and necessary purpose.

2.30 It must also ensure that by default it has put safeguards in place to prevent personal data being made available to an indefinite number of people without an individual's intervention. The ICO has published guidance on privacy by design and default within the Guide to the UK GDPR.
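Purely as an illustration of what 'by design and default' can mean in technical terms, the following minimal sketch shows a record type that pseudonymises a direct identifier on creation and stores only purpose-limited fields. The names and key handling are hypothetical assumptions for this report, not ICO-mandated measures; appropriate measures in practice would be identified through a DPIA.

```python
# Hypothetical sketch of data protection by design and default: the raw
# identifier is never stored with the record, and the record type cannot
# accumulate surplus attributes beyond those needed for the stated purpose.
import hmac
import hashlib

PSEUDONYM_KEY = b"replace-with-securely-managed-key"  # assumption: key held separately

def pseudonymise(identifier: str) -> str:
    """Keyed hash so the raw identifier is not stored alongside the record."""
    return hmac.new(PSEUDONYM_KEY, identifier.encode(), hashlib.sha256).hexdigest()

class MinimalRecord:
    """Stores only purpose-limited fields; __slots__ blocks surplus attributes."""
    __slots__ = ("subject_ref", "purpose")

    def __init__(self, name: str, purpose: str):
        self.subject_ref = pseudonymise(name)  # pseudonymised by default
        self.purpose = purpose                 # specified purpose kept with the data

record = MinimalRecord("Jane Example", "investigation ref 123")
print(record.subject_ref[:16], record.purpose)
```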

Artificial intelligence guidance

2.31 The ICO has guidance on AI and data protection, which it recommends controllers take into account both when formulating plans to process personal information involving AI and when carrying out such processing. The guidance sets out best practice for data protection-compliant AI, as well as how the ICO interprets data protection law as it applies to AI systems that process personal data. It contains advice on how to interpret relevant law as it applies to AI and recommendations on good practice for organisational and technical measures to mitigate the risks to individuals that AI may cause or exacerbate.

2.32 Where a controller is both using AI and undertaking data analytics, the ICO recommends that it consult the ICO's Toolkit for organisations considering using data analytics. The toolkit is most helpful at the beginning of a data analytics project lifecycle and will help controllers recognise some of the central risks to the rights and freedoms of individuals created by the use of data analytics.

Equality Act 2010

2.33 The Equality Act 2010 (EA 2010) protects individuals from discrimination and supports progress on equality. The Equality and Human Rights Commission has published guidance on the EA 2010.

Non-discrimination

2.34 The EA 2010 provides protection from discrimination, victimisation and harassment because of a protected characteristic. There are nine protected characteristics – age, disability, gender reassignment, marriage and civil partnership, pregnancy and maternity, race, religion or belief, sex and sexual orientation. The EA 2010 prohibits:

  • Direct discrimination
  • Indirect discrimination
  • Discrimination arising from disability
  • Failure to make reasonable adjustments for disabled people
  • Harassment
  • Victimisation

The public sector equality duty (PSED)

2.35 The PSED is made up of the general duty and specific duties. The general duty requires public bodies to have due regard to the need to eliminate discrimination, advance equality of opportunity and foster good relations between different groups when they are carrying out their activities (see section 149 of the EA 2010). The broad purpose of the general duty is to integrate equality considerations into the day-to-day business of public bodies. Failing to consider equality can lead to unintentional unlawful discrimination, greater inequality and worse outcomes for particular groups of people in our communities. For these reasons, the general duty requires public bodies to consider how they can positively contribute to the advancement of equality and good relations. It requires equality considerations to be built into the design of policies and practices and the delivery of services, and for these to be kept under review. The Scotland-specific equality duties contained in the Equality Act 2010 (Specific Duties) (Scotland) Regulations 2012 (as amended) help listed public bodies meet the general duty. The Equality and Human Rights Commission has also issued guidance on the PSED for Scottish public bodies.

2.36 Scottish Ministers, Police Scotland and the Scottish Police Authority have legal obligations under the PSED as service providers and employers. Of particular relevance when considering the adoption and application of new technologies is the specific duty requirement to assess the equality impact of proposed and revised policies and practices (see regulation 5 of the Equality Act 2010 (Specific Duties) (Scotland) Regulations 2012 as amended). There is guidance for public bodies on how to carry out an equality impact assessment.

2.37 At the earliest stage of the development of a proposed policy or the revision of an existing policy, public bodies should:

  • Identify if, and how, the duty applies
  • Collect equality evidence
  • Assess the potential impact by considering whether the equality evidence indicates potential differential impact on each protected characteristic group or provides an opportunity to improve equality in an area, by asking:
    • Does the proposed policy eliminate discrimination?
    • Does the proposed policy contribute to advancing equality of opportunity?
    • Does the proposed policy affect good relations?
  • Take account of the results of the assessment in developing the proposal
  • Ensure decision makers have due regard to the results of the assessment when making the final decision about the policy and its implementation
  • Document decisions and how due regard formed part of that decision
  • Publish results of the assessment
  • Monitor the actual impact of the policy

2.38 Also of relevance when considering the adoption of new technology is the specific duty requirement to consider the use of equality award criteria and conditions in relation to public procurement. The EHRC has procurement guidance for Scottish public authorities in order to assist with compliance with this duty.

2.39 On a practical level, Police Scotland need to make sure they have the systems and processes in place to:

  • gather and use the equality data of employees in order to meet the requirements of the Equality Act 2010 (Specific Duties) (Scotland) Regulations 2012 (as amended)
  • collate and use the equality data of service users as a means of demonstrating compliance with section 149 of the EA 2010

Biometrics

2.40 Biometric data is personal data obtained through specific processing relating to the physical, physiological or behavioural characteristics of a person. Biometric data processed "for the purpose of uniquely identifying a natural person" is sensitive data under the DPA 2018.

2.41 There are specific legal regimes relating to data protection for biometric data in the criminal justice system in Scotland. The Scottish Biometrics Commissioner Act 2020 defines biometric data and established an independent public body to promote and support the legal, ethical and effective acquisition, retention, use and destruction of biometric data for criminal justice and policing purposes in Scotland; the Act applies to Police Scotland, the Scottish Police Authority (SPA) and the Police Investigations and Review Commissioner (PIRC). The use of biometrics is supplemented by other legal frameworks, including:

  • Equality Act 2010 (Specific Duties) (Scotland) Regulations 2012
  • Part 2 of the Criminal Procedure (Scotland) Act 1995
  • Section 56 of the Criminal Justice (Scotland) Act 2003
  • Chapter 4 of Part 4 of the Age of Criminal Responsibility (Scotland) Act 2019
  • Police and SPA Codes of Practice

2.42 It is important to note however that the definitions of biometric data under data protection law and under the Scottish Biometrics Commissioner Act 2020 (SBC Act) are slightly different. The definition under the SBC Act is broader and includes photographs.

2.43 The SBC has a statutory duty to prepare, and from time to time revise, a code of practice on the acquisition, retention, use and destruction of biometric data for criminal justice and police purposes. The Code of Practice will apply to Scottish legislation which permits the capture of biometric data in Scotland by Police Scotland, the SPA or PIRC without consent, except where that data is collected under legislation reserved to the UK Parliament, or where it already falls within the independent oversight of another commissioner. In May 2022, the draft Code of Practice was laid before the Scottish Parliament for comments, as required by the SBC Act. The Scottish Government will prepare draft legislation on the Code of Practice by the end of 2022. The Code of Practice provides a high-level summary of the 12 guiding principles and ethical considerations that apply when acquiring, retaining, using or destroying biometric data for criminal justice and policing purposes in Scotland. Police Scotland, the SPA and PIRC must adhere to the following 12 general guiding principles and ethical considerations:

  • Lawful Authority and Legal Basis
  • Necessity
  • Proportionality
  • Enhance public safety and public good
  • Ethical behaviour
  • Respect for the human-rights of individuals and groups
  • Justice and Accountability
  • Encourage scientific and technological advancement
  • Protection of children, young people, and vulnerable adults
  • Promoting privacy enhancing technology
  • Promote Equality
  • Retention periods authorised by law

Impacts

Human rights impacts

2.44 Human rights impacts depend on the type of technology used. The range of technologies employed also means that it is insufficient to assess the human rights impact of discrete technologies in isolation; they must also be examined in context and in relation to the overall impact their use has on a particular sector, such as policing. Human rights impacts are also explored by the research commissioned by the IAG.

2.45 New technologies are used in innovative ways to help police prevent or resolve crime. However, their use raises human rights concerns. The research highlights a number of examples, including algorithms, facial recognition software and predictive policing. At present there is no requirement for an independent quality check attached to these technologies.

2.46 In some cases it may be impossible to know the full impact of police use of technology on human rights, and the harm may be difficult to quantify, particularly as it may continue into the future. For example, when biometric data is collected it is not transparent what will later be done with the personal data (whether it is deleted, shared or sold). Consideration should also be given to the impacts and threats of technology use at multiple levels: some technologies and their applications may have greater impacts at the individual, community or society-wide level.

2.47 In practice, governments often rely on private contractors to design and develop new technologies for use in a public context. This is also true of the police. Private actors should comply with all applicable laws and respect human rights. We have a collective responsibility to give direction to these technologies so that we maximise their benefits and curtail unintended consequences and malicious uses.

2.48 Discrimination can result from the design and development of digital technologies. AI and machine learning systems are often dependent on historic data, which may be incomplete or contain bias. The result is a biased technology, and such discrimination may then be reproduced and amplified when the technology is used by the police, as the sketch below illustrates.
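The following simulation uses entirely synthetic data: the groups, rates and allocation rule are hypothetical assumptions constructed for this report, not figures describing any real force or population. It shows the feedback mechanism in miniature: offences are only recorded where police look, so a model scored on recorded offences inherits historic over-policing, and an allocation rule that concentrates resources on the highest-scoring group amplifies the skew round on round.

```python
# Purely illustrative simulation with synthetic data (no real policing data):
# two groups offend at the same underlying rate, but group B was historically
# stopped twice as often. Recorded offences therefore reflect where police
# looked, not differences in offending; a model trained on those records
# reproduces the skew, and concentrating resources on the "highest risk"
# group (as top-ranked hotspot targeting does) amplifies it.
import random

random.seed(42)

GROUPS = ("A", "B")
TRUE_OFFENDING_RATE = 0.05            # identical for both groups
stop_rate = {"A": 0.10, "B": 0.20}    # historic over-stopping of group B
POPULATION = 10_000
STOP_BUDGET = 0.30                    # total stop capacity per round
CONCENTRATION = 2.0                   # >1 mimics top-ranked hotspot targeting

def recorded_offences(stop_rate):
    """Offences are only recorded when someone is stopped."""
    counts = {}
    for g in GROUPS:
        stops = sum(random.random() < stop_rate[g] for _ in range(POPULATION))
        counts[g] = sum(random.random() < TRUE_OFFENDING_RATE for _ in range(stops))
    return counts

for round_no in range(1, 4):
    counts = recorded_offences(stop_rate)
    weights = {g: counts[g] ** CONCENTRATION for g in GROUPS}
    total = sum(weights.values())
    stop_rate = {g: STOP_BUDGET * weights[g] / total for g in GROUPS}
    print(f"round {round_no}: recorded={counts}, "
          f"next stop rates A={stop_rate['A']:.3f} B={stop_rate['B']:.3f}")
```

Despite identical underlying offending, group B's recorded offences grow relative to group A's each round, and the model directs ever more stops towards it.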

2.49 The regulation and governance of the design and development of new technologies is therefore critical to create the conditions for innovation and to ensure that these technologies, particularly AI, are used to advance, rather than put at risk, equality and human rights. Understanding the multi-level impacts of new technologies, as mentioned, is key, as some of the risks may occur at a more societal level, especially risks to freedom of association or assembly, where a chilling effect may be produced.

2.50 Indeed, risks to democratic freedoms (impacting Articles 9 – 11 of the ECHR) can arise from the widespread use of surveillance tools and AI-enabled technologies by police. There is an increased use of digital surveillance tools in the context of peaceful assembly and freedom of expression under the auspices of national security or public order. This type of interference with our democratic freedoms should only be permitted if it is lawful, proportionate and necessary, on a targeted basis where reasonable suspicion can be demonstrated. The proportionality principle requires that any surveillance measure used should be the least invasive option. UK surveillance laws, including the Investigatory Powers regimes applicable to certain bulk surveillance practices, must respect these principles. Surveillance practices, bulk data collection and facial recognition technologies employed at large events therefore raise human rights (proportionality) concerns, as well as being potentially discriminatory. This was confirmed by the ECtHR in the Big Brother Watch v. the UK and Centrum för Rättvisa v. Sweden cases regarding bulk surveillance. The issues around facial recognition have been considered in the UK context by the ICO in its Opinion on the use of live facial recognition technology in public places. The Bridges v South Wales Police case sheds further light on the issue (discussed below).

2.51 The principles of equality and non-discrimination are central to human rights law. As discussed, discrimination can be reinforced by AI. It is important that police do not use broad profiles that reflect unexamined generalisations and/or stigmatisation. For example, the use of live facial recognition technology poses a risk not only to the enjoyment of the right to peaceful assembly but also reinforces discrimination. Those particularly at risk of discrimination by this technology include people of African descent and other minorities, women and persons with disabilities. There is ample literature on algorithmic error rates in facial recognition technologies, which lead to minority individuals being wrongly flagged and, in some cases, detained as a result; the illustration below shows the kind of disaggregated error analysis on which that literature rests.
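The numbers below are made up for illustration; they describe no real system or evaluation. The point they demonstrate is arithmetical: the disparities reported in the literature are usually expressed as a false match rate (FMR) broken down by demographic group, and an aggregate accuracy figure can conceal a large per-group disparity.

```python
# Illustrative arithmetic only, with hypothetical numbers: a false match rate
# (FMR) is the share of people NOT on a watchlist whom the system wrongly
# flags. An overall FMR can look acceptable while hiding a group disparity.
def false_match_rate(false_matches: int, non_matched_comparisons: int) -> float:
    return false_matches / non_matched_comparisons

# Hypothetical evaluation: 10,000 comparisons per group of people who are not
# on the watchlist; the system nonetheless flags some of them.
results = {
    "group_1": {"false_matches": 80, "comparisons": 10_000},
    "group_2": {"false_matches": 10, "comparisons": 10_000},
}

overall = false_match_rate(
    sum(r["false_matches"] for r in results.values()),
    sum(r["comparisons"] for r in results.values()),
)
print(f"overall FMR: {overall:.2%}")  # 0.45% looks acceptable in aggregate...
for group, r in results.items():
    fmr = false_match_rate(r["false_matches"], r["comparisons"])
    print(f"{group} FMR: {fmr:.2%}")  # ...but conceals an 8x group disparity
```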

2.52 Currently in Scotland there is a moratorium on police use of live facial recognition technology, in contrast with the situation in England and Wales, where live facial recognition technology has been deployed by police forces in public places, often in controversial ways and settings. Indeed, a specific use of public-space facial recognition surveillance by the police in England and Wales was deemed unlawful in the Bridges v South Wales Police case in 2020, although the judgment declared only that particular use of live facial recognition unlawful, rather than all police use of the technology. However, non-live facial recognition is used by Police Scotland, and it may still exhibit discriminatory biases. Furthermore, live facial recognition has been used by other public bodies in Scotland, such as schools, in controversial ways.

2.53 For biometric data, steps are being taken to clarify the legal frameworks governing its use. The SBC draft Code of Practice, once approved by the Scottish Parliament, will be the first of its kind: Scotland will become the first UK nation to have a statutory Code of Practice on the acquisition, retention, use and destruction of biometric data for criminal justice and police purposes.

Data protection impacts – A view from Police Scotland

Since 2018, Police Scotland (PS) has worked to make the use of DPIAs systemic for all new or updated processing, which includes but is not limited to the introduction or use of new technologies. Over 140 DPIAs have been approved since 2018, with more in progress. To underline the value and importance of the DPIA framework:

  • PS project 'stage gates' include DPIAs as a mandatory requirement.
  • PPAs (Pre Project Assessments) are sent for consultation to a range of business areas, including CDO. This allows for early and indicative comments to be provided that will guide PS on the high-level challenges that may be faced by a project.
  • Engagement and collaboration with Strategy and Innovation, including regular consultation at the CDO IA team's DPIA meeting. This allows potential technologies to be discussed ahead of any formal documentary submission and consideration given to early steps to be taken.

Police Scotland consider that their impact assessment procedures are aligned in ways which ensure ethical and legally compliant outcomes: the questions in, and outcomes from, a Police Scotland DPIA and EQHRIA (Equality and Human Rights Impact Assessment) are in concert with each other. For example, a DPIA is unlikely to be approved if an EQHRIA identifies negative impacts of a technology on a particular section of society. In such cases, the DPIA and EQHRIA frameworks are used to inform and guide the design of processes and processing.

'Adjustment' of technology by police to ensure compliance with legislation can result in a range of different actions. The approach taken by PS is to use the framework of questions in a DPIA to guide the design, build and implementation/use of technologies and processing; information specialists, subject matter experts (SMEs) and operational leads therefore work collaboratively to discuss risks and identify solutions. In practical terms this means that a DPIA may go through many draft versions and identify risks for which a project must determine and implement an action plan, so that each risk is mitigated to the extent that the DPIA can be formally agreed by a Strategic Information Asset Owner (SIAO).

To give the IAG - and wider public - assurance that this process works in practice, Police Scotland gave us a list of examples of changes made to technologies during the proposal/design/development phase in a DPIA framework to ensure compliance:

  • The re-design of technical capabilities (common throughout ICT projects)
  • PS did not purchase/implement all technological capabilities (cyber kiosks)
  • PS agreed not to use technology for a particular purpose (Telematics, LBM) or only after certain authorisations are given and provided privacy notices to that effect
  • Contractual specifications (no processing outside of the UK is permitted)
  • Additional security controls or configurations
  • Testing and proof of concept (POC) on dummy data

2.54 Police Scotland may wish to consider undertaking Children's Rights and Wellbeing Impact Assessments (CRWIAs) alongside DPIAs and EQHRIAs to inform and guide the design of processes and processing, as a way of further embedding rights and enhancing Police Scotland's human rights approach.

Equality impacts

2.55 Police forces may use technology to identify where specific crimes may occur, assess crime solvability, and predict who may commit crimes. These technologies are based on predictive analytics, which leverages data, algorithms and other technologies (e.g. facial recognition technology) to monitor and assess individuals, communities and/or specific locations. This type of policing is particularly challenging as it can target particular protected characteristic groups over others, including racial groups, younger people, disabled people, religious groups and women. Both the EHRC and the SHRC raised concerns in 2020 about potential discrimination caused by predictive policing.

2.56 Police Scotland must harness their EQHRIA process to help identify potential discrimination and opportunities to promote equality when designing, commissioning or using new technologies. From an equality perspective, this means considering the possible impacts a technology can have on people with protected characteristics, including any relevant inequalities they experience, barriers they face or specific needs they have.

Summary of section

2.57 Police use of technology operates within and spans many spheres of legislation. Frameworks relating to technology-enabled policing in Scotland may implicate human rights under the Human Rights Act 1998, or raise issues of privacy and data protection under the Data Protection Act 2018 (in particular Part 3 of the Act, which applies to authorities processing data for law enforcement purposes, and the data protection principles more generally). Furthermore, there are statutory equality duties enshrined in the Equality Act 2010 which apply when considering the impact of technology on Scotland's diverse communities. Mechanisms to address and manage the associated legal issues and impacts of technology range from legislative guidance, to toolkits produced by organisations, to a range of impact assessments.

2.58 Technological capabilities and the associated legal frameworks are both constantly evolving. Sometimes the two evolve in tandem, as illustrated by the Scottish Biometrics Commissioner Act 2020; at other times there may be friction or tension between the developments. What is clear is that neither technology nor legislation exists in a static state. This creates a recurring, iterative need to reconsider and evaluate the impacts of technology on individuals, communities and society.

Contact

Email: ryan.paterson@gov.scot
