Artificial Intelligence (AI) in school education - guidelines and guardrails: child rights and wellbeing impact assessment

Child rights and wellbeing impact assessment (CRWIA) for the introduction of guidelines and guardrails on the use of Artificial Intelligence (AI) in school education.


Child Rights and Wellbeing Impact Assessment Template

1. Brief Summary

Type of proposal:

Decision of a strategic nature relating to the rights and wellbeing of children

Name the proposal, and describe its overall aims and intended purpose.

The Guidelines and Guardrails for the use of AI in School Education [‘the guidelines and guardrails’] comprise: core principles (guardrails); key definitions relating to Artificial Intelligence (AI); Frequently Asked Questions (FAQs) covering a broad range of questions on the use of AI in education; a checklist; and examples of acceptable and unacceptable uses of AI within a school setting for both teachers and learners, such as using AI to support teachers in generating differentiated learning resources and supporting pupils in generating practice questions and quizzes to aid subject revision. Finally, this resource also contains links to relevant, quality-assured resources which provide further information on AI.

The guidelines and guardrails aim to support those delivering education in schools, including teachers, school leaders and education authorities in considering if and when the use of AI is appropriate in schools, and how best to use AI ethically to support practice. This will also be of interest to parents and carers, as well as children and young people themselves.

The guidelines and guardrails do not represent statutory guidance, but instead provide overarching principles relating to the ethical use of AI in education. Under the Education (Scotland) Act 1980, the delivery of school education is the responsibility of local authorities and, alongside schools, they can make decisions on how education is delivered to meet their specific local context. We are aware that there is currently varying use of AI in schools across local authorities. The guidelines and guardrails are therefore designed to assist both local authorities and those working in schools in making decisions in relation to the use of AI in their own school contexts.

Start date of proposal’s development: June 2025

Start date of CRWIA process: November 2025

2. With reference given to the requirements of the UNCRC (Incorporation) (Scotland) Act 2024 which aspects of the proposal are relevant to/impact upon children’s rights?

The guidelines and guardrails aim to support those involved in the delivery of school education in considering when the use of AI is appropriate and how best to safely and ethically use AI to support the delivery of education in schools. The guidelines and guardrails are intended to support the safe and ethical use of AI in Scottish schools and therefore to strengthen children’s rights.

Articles with a positive impact: 2, 3, 5, 12, 13, 16, 17, 19, 23, 28, 29, 36

In the context of the guidelines and guardrails, the articles relate to: ensuring the safety and privacy of children and young people are safeguarded when schools consider the use of AI in the delivery of education; ensuring that the use of AI in schools recognises and addresses equity and fairness considerations in respect of children and young people; and ensuring that young people have the opportunity to learn about, understand, and use AI within their entitlement to develop skills for learning, life and work as part of the 3-18 Curriculum.

Article 2 (non-discrimination)

The ‘core principles of AI in education’ section of the guidelines and guardrails clearly states that the use of AI must be underpinned by equity and fairness and provides further detail of what this means in relation to education. Schools must recognise and address the fact that children and young people’s access to digital learning tools, including AI, is variable and may be limited by a range of factors. The guidelines and guardrails are also clear that where AI is used to support a human-centred approach to teaching and learning, it must do so in a way that values the diversity of children and young people and enhances different ways of learning and differentiated pedagogical approaches. Additionally, they clearly state that where AI is used in school learning, including homework, schools should aim to ensure equitable outcomes for children and young people, including ensuring that all children and young people have the means to participate equally.

Article 3 (the best interests of the child)

The guidelines and guardrails will help ensure that where AI is used in schools, it is used with the best interests of the child in mind. The guidelines and guardrails emphasise that the best interests of the child must be a primary consideration in decisions about whether and how to implement AI in school education. The guardrails section explicitly states that AI use must ensure the safety and privacy of children and young people by prioritising children’s rights, ethical standards and data protection. Alongside protecting against the negative impacts of the use of AI in education, the guidelines and guardrails also aim to support teachers to use AI in a way that better supports the delivery of high quality learning and teaching, including by providing examples of effective AI uses in the classroom. The document therefore has a positive impact on Article 3.

Article 5 (parental guidance)

The guidelines and guardrails explicitly state that parents and carers should be made aware of how AI is being deployed within their children’s schools. The guardrails section of the document also states that schools and local authorities’ digital technology policies should be updated to reflect the ethical use of AI and this should be carried out in consultation with a range of stakeholders, including parents and carers. The exemplification section of the document also highlights that those deploying AI in schools should consider whether information about the tools being used is available to parents and carers. The guidelines and guardrails therefore promote the sharing of information between schools and parents and carers on the use of AI in their children’s education.

Article 12 (respect for children’s views)

The importance of ensuring that children and young people’s views in relation to the use of AI are heard and respected is embedded throughout the guidelines and guardrails. This includes explicitly stating that children and young people’s views should be sought and considered on the use of AI in their learning, and that the views of children and young people should be considered when local authorities and schools update their digital technology policies. Information on the use of AI in education should be available for children and young people in age-appropriate language to support a full understanding.

Article 13 (freedom of expression)

The guidelines and guardrails state that children and young people should have the opportunity to learn about, understand and use AI as part of their school curriculum, thereby strengthening their right to freedom of expression. AI presents significant opportunities for users to access large quantities of information more readily. The guardrails seek to ensure that children and young people are equipped with the knowledge, skills and opportunities to learn about and use AI in a safe and informed manner, and that they are supported to interrogate and scrutinise the information that AI generates.

Article 16 (privacy)

The first principle listed in the guidelines and guardrails states that the use of AI within schools must ensure the safety and privacy of children and young people, and there is repeated emphasis throughout the document on the importance of data protection and privacy. The guidelines and guardrails explicitly state that AI tools must comply with data protection law and that all AI users should be supported to understand how to protect and safeguard pupils’ sensitive and personal information.

Article 17 (access to information from the media)

The guidelines and guardrails emphasise that children and young people should have the opportunity to learn about, understand and use AI in appropriate contexts. With the growing integration of AI within electronic media, this will help ensure that children and young people are equipped with the tools to identify and guard against false information in their news sources. Further, the guidelines and guardrails seek to ensure that AI is used ethically and safely within schools, therefore reducing the risk of children and young people being exposed to harmful media generated by AI.

Article 19 (violence, abuse and neglect)

By promoting the ethical use of AI in schools by teachers and local authorities, and by including a strong focus on safeguarding children’s privacy and data protection, the guidelines and guardrails seek to further protect children and young people, particularly with respect to exposure to violent images and exploitation.

Article 23 (disabled children)

The guardrails section emphasises that the use of AI must be underpinned by equity and fairness. Where deployed effectively, some AI enabled technologies have the potential to help overcome barriers to learning for some children and young people, including those with disabilities. However, we also know there are inherent risks and biases built within AI systems. These guidelines and guardrails will help support ethical and safe use of AI in schools and should in turn encourage safe use of these tools.

Article 28 (education) and Article 29 (right to education and purposes of education)

The guidelines and guardrails explicitly state that the use of AI in schools must support the aims of the 3-18 curriculum and that within the curriculum offering, children and young people should have opportunities to learn about, understand and use AI in appropriate contexts. The guidelines and guardrails support children and young people to be prepared for a world in which AI is embedded within society. By ensuring that children and young people are supported to develop a basic understanding of how AI works, they will in turn be able to better understand the impacts, risks and opportunities that the use of AI in their education presents.

Article 36 (other forms of exploitation)

The guardrails emphasise the importance of protecting the privacy and data of children and young people when AI is used in education settings by teachers and others involved in the delivery of school education. The guidelines and guardrails also explicitly state that children and young people should have the opportunity to learn about, understand and use AI in appropriate contexts, helping to increase awareness of the risks that AI poses. By doing so, implementation of this guidance should have a positive impact on the protection of children and young people from exploitation.

Articles with a neutral impact: 1; 6; 7; 8; 9; 10; 11; 14; 15; 18; 20; 21; 22; 24; 25; 26; 27; 30; 31; 32; 33; 34; 35; 37; 38; 39; 40; 41; 42

Articles with a negative impact: We do not anticipate that the guidelines and guardrails have the potential to negatively impact any of the UNCRC requirements.

3. Please provide a summary of the evidence gathered which will be used to inform your decision-making and the content of the proposal

Evidence from:

Existing research

  • Children’s Parliament, AI Alliance and The Alan Turing Institute, ‘Exploring Children’s Rights and AI’ [2024]. This research engaged with 87 children aged between 7 and 11 years old from four schools across Scotland (Edinburgh, Glasgow, Shetland and Stirling).
  • ADES and Staff College, ‘Learning Beyond Boundaries’ [2024]. This research involved over 200 young people, including pupils in S1/S2 (ages 11 - 14) and S5/S6 (ages 15 – 18).
  • AHRC Bridging Responsible AI Divides (BRAID) programme, Towards Embedding Responsible AI in the School System [2024]. The zine was produced following workshops with 22 young people comprising 16-17 year olds from a secondary school in Edinburgh; 17-18 year olds from a school in Norfolk and 13-16 year olds from an Additional Support Needs (ASN) school in Edinburgh.
  • Atabey A; Sylwander K and Livingstone S, ‘A child rights audit of GenAI in EdTech: Learning from five UK case studies’ [2025]. This research applies a child rights approach to children’s learning to evaluate five Generative AI tools used in education within the UK.

Feedback directly from children and young people

  • Views shared by children and young people during engagement huddles, which were commissioned by the Scottish Government and delivered by the Scottish Youth Parliament, Young Scot, Children’s Parliament and Children in Scotland on the education reform programme and current issues facing schools. A total of 70 children and young people aged between 7 and 25 participated in the third instance of these huddles, taking place in Autumn 2025 with discussions focused on qualifications and assessment. During these huddles, a wide range of areas were discussed, which included AI.

Feedback from stakeholders

  • Feedback received from the Children and Young People’s Commissioner Scotland and Children’s Parliament on a draft version of the guidelines and guardrails which was received by Scottish Government officials in November 2025.
  • Feedback received from a range of other stakeholders to a draft version of the guidelines and guardrails, including from SQA, the General Teaching Council of Scotland, academics at the University of Edinburgh, the Scottish AI Alliance, Connect, the Association of Directors of Education Scotland (ADES), COSLA and trade unions.

4. Further to the evidence described at ‘3’ have you identified any 'gaps' in evidence which may prevent determination of impact? If yes, please provide an explanation of how they will be addressed

There are no gaps in the evidence at the current time which would prevent determination of impact. However, given the subject matter, evidence in relation to AI is growing, including evidence of children and young people’s views on the place of AI in education. As evidence continues to emerge, we will seek to update the guidelines and guardrails and the CRWIA.

5. Analysis of Evidence

The evidence demonstrates that there is a need for the guidelines and guardrails in order to safeguard and promote children’s rights.

The evidence also demonstrates that children and young people hold a broad range of views on the use of AI in education but a common theme emerging is the need to ensure that data and privacy are respected.

The Exploring Children’s Rights and AI report reflects that the children involved (aged between 7 and 11) expressed optimism about the potential of AI to support education and teacher workload but strongly emphasised the irreplaceable role of human interaction for wellbeing and pastoral care. They raised concerns about AI replacing teachers and the technology failing to meet diverse learning needs, especially for neurodivergent children. They were also concerned about AI providing inappropriate information. Data privacy was raised as a key issue, with children expressing that they were uncomfortable with extensive data collection and wanted control over consent, particularly against sharing with external parties. They expressed that they valued learning about AI alongside their rights within the curriculum, viewing it as essential to keeping them safe, helping them to make informed choices, and grasping future career opportunities. Children were clear that it is the responsibility of the Government to safeguard against risks. The children involved in the study stressed the need to prevent bias in AI systems. Overall, while children saw benefits in AI as a classroom tool, they cautioned against over-reliance and highlighted inclusion, safety, and ethical considerations as priorities. These findings support the creation of guidelines which, in particular, help to protect children and young people’s data and privacy, ensure that they have an opportunity to learn about AI within the curriculum offer, and enable AI to be used within education in a safe and ethical way.

The Learning Beyond Boundaries report found that the children and young people involved (aged 11-18) wanted schools to provide clear, foundational education on AI, covering its uses, risks, and ethical implications, to help them navigate its growing role in daily life. Pupils expressed interest in AI tools for homework support and suggested integrating AI into lessons to tackle real-world issues like climate change. They saw value in AI being used to assist teachers with administrative tasks to free up their time for meaningful engagement with learners. The report calls for national AI guidelines to ensure consistent and responsible implementation across education.

The ‘Towards Embedding Responsible AI in the School System’ research found that the young people involved (aged 13–18) called for responsible AI integration in education, emphasising transparency, choice, and rights protection. They expressed frustration with AI systems mimicking human friendliness and creating extra workload, and voiced strong concerns about the environmental impact of generative AI, intellectual property misuse, and threats to creativity and future jobs. Participants stressed the need for clear information on AI’s benefits and risks, the ability to opt out without educational disadvantage, and safeguards for privacy and safety. Overall, they advocated for an approach that prioritises ethical use, informed consent, and sustainability in AI deployment within schools. The research findings therefore support the introduction of guidelines that promote the ethical and safe use of AI in education and the inclusion of a principle within the guidance which states that AI must support and enhance, rather than replace, human-centred teaching and learning in schools.

‘A child rights audit of GenAI in EdTech: Learning from five UK case studies’ looked at five GenAI tools currently used in education in the UK and applied a child rights approach to examine how they uphold key rights under the UNCRC. Overall, the study found that while each GenAI tool offers the potential to facilitate learning, they also present risks. The report states that the main risks relate to “opaque data practices, poor transparency, commercial exploitation… including from age-inappropriate adult websites…” and that, across all tools, children’s perspectives were largely excluded from their design, governance and evaluation. The risks highlighted by this research in relation to the use of GenAI in education tools reinforce the need for guidance on the ethical and safe use of AI tools in Scottish schools.

During the recent engagement between Scottish Government and children and young people (aged 7 – 25) on the education reform programme, children and young people acknowledged the potential of AI to enhance learning and make lessons more engaging. However, attendees were clear that AI cannot replace teachers and raised concerns about the use of AI in education, which included bias in AI systems; data privacy; accuracy of AI-generated information; risks linked to the use of AI in assessment and the loss of in-person learning skills. Participants highlighted the need for teacher training on AI use and regulation to mitigate negative impacts of the technology. The children and young people also stressed the importance of teaching children how to use AI responsibly and highlighted the need for a blend of digital and offline learning tools for young people with additional support needs. Overall, the children and young people supported the use of AI to bolster their education and placed a strong emphasis on ethics, safety and inclusion with respect to AI in education.

As part of the drafting process for the guardrails, organisations representing children and young people were consulted. They broadly welcomed the draft guidance, but suggested improvements to the text. In their response, Children’s Parliament emphasised the need for teacher training on AI, the importance of children having a say in classroom use of AI, more explicit reference to the environmental impacts of generative AI, which are known to be a key concern for children and young people, and greater emphasis on the incorporation of the UNCRC into Scots Law throughout the guidance. The Children and Young People’s Commissioner Scotland called for clearer references to children’s rights within the guidance and raised concerns about risks presented by AI tools, including chatbots, and about the guidance’s suggestion of their wider classroom use. They stressed the need for teachers to understand emerging risks linked to AI and recommended explicit mention within the guidance of AI companies’ impact on human rights. Both organisations were broadly supportive of the introduction of guidance that encourages critical engagement with AI and helps children develop skills for safe and informed use.

Wider stakeholders within the education system were also consulted on an earlier draft of the guidelines and guardrails. Stakeholder organisations provided broadly positive feedback on the guidance and welcomed its creation, particularly in the context of many schools already embedding AI within their education delivery. Individual stakeholders provided a number of suggested amendments across a range of areas, with some suggested amendments and comments relating directly to children and young people. These comments largely reinforced the main themes raised by children and young people’s organisations, particularly around being more explicit about children and young people’s right to have a say in the use of AI within their own learning, and around highlighting the in-built risks that generative AI can pose to children and young people even when they use the tools responsibly.

6. What changes (if any) have been made to the proposal as a result of this assessment?

While the guidelines and guardrails address many of the general concerns about the use of AI in education raised by children and young people, as a result of the feedback received, we have made a number of further revisions to the document prior to publication:

  • In response to feedback from children and young people’s representative organisations to the guidelines and guardrails, we have strengthened within the document the mention of, and links to, Scotland’s incorporation of the UNCRC into our domestic law, making clear that decisions in relation to the use of AI in schools must respect children and young people’s rights.
  • We have included an FAQ entitled ‘What if a child objects to using AI in their learning?’. The answer to this question makes clear that the views of children and young people must be listened to and dealt with in line with individual school policies.
  • In response to the concerns raised by children and young people in published reports regarding the environmental impact of AI, we have included explicit reference to this within the guidance in the FAQ section. Similarly, in this section we have strengthened reference to the importance placed by children and young people on human interaction and the role of teachers in their learning.
  • In light of feedback received from the Children and Young People’s Commissioner regarding issues concerning chatbots, and in response to emerging evidence of issues with chatbots, we have removed references to this form of technology from the guidelines and guardrails. We have also included explicit reference to teachers being supported to understand the emerging risks of AI.
  • In response to views expressed by children and young people in the available research, and to feedback from children and young people’s organisations on earlier drafts of the guidance, we have made it explicitly clear throughout the document that children and young people should be consulted on the use of AI in their learning and that information on digital technology policies should be made available in child-friendly language. The FAQ also states that in instances where children and young people raise concerns about the use of AI within their own learning, these should be taken seriously and dealt with in line with individual school policies.

Contact

Email: Russell.Cockburn@gov.scot
