National discussion on education: consultation analysis

This report outlines the key findings from the ‘listening phase’ of the National Discussion on Scottish Education which took place between September and December 2022.

Appendix A: Methodology

The Lines Between was commissioned to provide a robust, independent analysis of responses to the National Discussion. The large number and varied nature of responses required a clear and systematic process for collating, coding and analysing all responses. We detail the steps in this process below.


The analysis in this report is drawn from multiple sources.

The National Discussion page on the Scottish Government's Citizen Space consultation portal received 5,388 responses[7]. Participants entered these directly into the portal, so most responses aligned with the ten core consultation questions. Participants could also upload documents to support their submission; 133 did so. Half of the supporting documents repeated the information entered into the survey, so no further action was taken on them. The remaining material either did not align with the consultation questions or contained notes from group discussions; it was processed using the analysis methods developed for non-standard or group responses, described below.

Another 196 responses were received via a specific Citizen Space portal for group discussions with children and young people, parents and families, teaching staff and various stakeholders. These were facilitated by schools, organisations and the Scottish Government's National Discussion team. Just over three-quarters of these responses (156 of 196) were entered directly into Citizen Space and aligned with the questions in the group discussion guides. The remaining 40 groups uploaded a document with their responses, and analysts added this data into the coding framework; where comments did not align with the questions, analysts used their judgement on where best to add the data.

A further 87 non-standard responses were emailed to the Scottish Government. Most of these came from organisations and were submitted as PDFs, Word documents, PowerPoint presentations and images.

The social media analysis involved a review of over 1,300 original tweets with the hashtag #TalkScottishEducation.

Numbers of participants

Throughout this report we have referenced the 232 responses we know were produced following group discussions. This figure comprises the 196 responses entered into the group discussion portal, eight provided via the National Discussion portal and 28 provided as non-standard responses.

When entering the findings from their group discussion, participants were asked to indicate how many people were in their group. Across these 232 responses, 6,884 people were noted as taking part. Along with the 5,439 participants who did not leave a number, at least 12,323 people were represented by a response to the National Discussion.

In addition, approximately 26,000 people participated in National Discussion Live Assemblies co-ordinated by Education Scotland. Therefore, a total of at least 38,323 people were reached in the National Discussion.
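The participant totals above follow from simple addition. A minimal sketch, using the figures stated in this report (the variable names are ours, purely for illustration):

```python
# Illustrative check of the participant totals reported above.
# All figures are taken from the report; variable names are ours.

group_participants = 6_884           # people noted across the 232 group responses
other_participants = 5_439           # respondents who did not leave a number
live_assembly_participants = 26_000  # Education Scotland Live Assemblies

represented_by_responses = group_participants + other_participants
total_reached = represented_by_responses + live_assembly_participants

print(represented_by_responses)  # 12323
print(total_reached)             # 38323
```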

Data cleaning

Data checks were carried out within and across both Citizen Space databases and non-standard responses, primarily to identify any duplicate responses. Standard checks were carried out by participant name, email address and organisation name.

However, the nature of the National Discussion meant a person could make multiple submissions – for example, a head teacher might submit their response and that of their parent council, or a school could have submitted multiple responses from different classes. All potential duplicates of this kind were reviewed but kept in the data.

As a result of the data checks, ten records were deleted; three contained no data, and seven were duplicate entries.

A further 52 participants, primarily individuals, responded to the consultation at least twice using the same email address but with different answers. We cannot be certain that the same person submitted each response; for example, a parent may have allowed their child to respond using their email address. All participant comments have been analysed, but each set of duplicates was counted as one participant.
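The counting rule above (analyse every comment, but count responses sharing an email address as one participant) can be sketched as a simple grouping step. The data structure below is hypothetical, not the actual Citizen Space database:

```python
# Sketch of the de-duplication rule described above: responses sharing an
# email address are all analysed, but counted as a single participant.
# The records and field names are illustrative assumptions.
from collections import defaultdict

responses = [
    {"email": "a@example.com", "answer": "More outdoor learning"},
    {"email": "a@example.com", "answer": "Smaller class sizes"},
    {"email": "b@example.com", "answer": "Better digital access"},
]

by_email = defaultdict(list)
for response in responses:
    by_email[response["email"]].append(response)

comments_analysed = sum(len(group) for group in by_email.values())
participants_counted = len(by_email)

print(comments_analysed)     # 3 -- every comment is analysed
print(participants_counted)  # 2 -- duplicates count as one participant
```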

Three coordinated responses were identified. This is where participants use the same or similar wording promoted or made available by another participant, typically an organisation. These constitute valid responses, and all have been kept in the data. Each set had seven participants with a specific interest in:

  • Catholic education
  • education for deaf young people
  • global citizenship education

While responses came from various individuals and organisations, it is unclear how many people are represented by each response. For example, a class from one school may have discussed the questions and submitted one response. In contrast, learners from another school may have shared their individual responses as part of their lesson. For analysis purposes, each submission was treated as a separate response. All views are included in this report regardless of whether a large or small number of respondents raised them.

Developing a coding framework

The team created a customised coding framework for each consultation question. This was developed in three stages.

Firstly, we reviewed a sample of responses in a team workshop and identified emerging themes. This ensured all team members began the coding process with a shared understanding of the emerging themes, developed consistent approaches to data categorisation, and contributed to the development of the framework.

To create an analysis sample, every National Discussion Citizen Space response was allocated a number between one and twenty based on the order in which it was submitted. This produced twenty sub-samples of approximately 270 responses each from the total of 5,388, each of which broadly matched the overall respondent profile and contained around 250 responses from individuals and 20 from organisations. A draft coding framework was created for each question based on an initial read of a sample of responses. To develop the coding framework for Q1, 271 comments from sub-sample one were reviewed; for Q2, 271 comments from sub-sample two were reviewed, and so on.

The draft coding framework was then tested in a pilot coding exercise. This involved coding a different sub-sample per question to test the framework's validity. For Q1, 270 comments from sub-sample eleven were coded; for Q2, 270 comments from sub-sample twelve were coded, and so on. The methodology used in the pilot exercise allowed us to:

  • consider one answer from every respondent across the development of the coding framework and the pilot coding
  • confirm a high degree of consistency in comments across the two sub-samples used for each question
  • identify themes which appear across all questions, and apply these consistently across the coding framework at all questions, along with the question-specific themes which emerged
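The allocation scheme described above can be sketched as follows. This is our illustrative reading of the method (allocating responses to twenty sub-samples in rotation by submission order), not the analysts' actual implementation:

```python
# Sketch of the sub-sampling scheme: each response is allocated a number
# from 1 to 20 based on submission order, producing twenty sub-samples of
# roughly 270 responses each. The rotation rule is our assumption.

TOTAL_RESPONSES = 5_388
NUM_SUBSAMPLES = 20

def subsample_number(submission_index: int) -> int:
    """Allocate a 1-based submission index to a sub-sample (1-20)."""
    return (submission_index - 1) % NUM_SUBSAMPLES + 1

sizes = [0] * (NUM_SUBSAMPLES + 1)  # index 0 unused
for i in range(1, TOTAL_RESPONSES + 1):
    sizes[subsample_number(i)] += 1

print(sizes[1:])  # twenty sub-sample sizes of roughly 270 each
```

Under this scheme, sub-sample k could be used to draft the framework for question k, and sub-sample k+10 to pilot it, so every respondent contributes one answer across the two stages.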

Finally, new codes were created through an iterative coding process if additional themes emerged as the entire data set was processed.

Reflecting the high level of engagement in the National Discussion and the broad nature of the questions, many themes emerged within and across responses. Codes for cross-cutting themes were developed and included in the framework across all questions. This ensured that all data on these themes was captured consistently.

The final framework included c.80 common codes which spanned all questions, and a further c.10-20 specific codes for each question.


Every response was coded against the coding framework for the relevant question. If a comment covered multiple themes, it was coded against all pertinent themes in the framework.

The group discussion guide contained the ten questions used in the National Discussion plus four additional questions. Analysis of a sample of group discussions identified that the themes strongly aligned with those identified in the main sample, so responses to the ten core questions were coded with the framework used for the main sample of 5,388 responses. Bespoke code frames were created for the four additional group discussion questions.

Non-standard responses

Some responses to the National Discussion, and attachments to Citizen Space responses, were provided in a range of formats. As well as PDF and Word documents, participants submitted PowerPoint presentations, Jamboards, Padlet posts and photographs of mind maps, drawings and posters created by children and young people.

Where the information in these non-standard responses aligned with specific questions, analysts added the data to the coding database and coded against the relevant themes.

Approximately 50 responses did not directly address any consultation questions. To ensure this content was captured in the analysis process, it was coded in a separate database against the c.80 themes evident across the National Discussion.

Social media

Analysts used the Tweetbinder tool to download a database of 7,745 tweets which included the hashtag #TalkScottishEducation in the six months up to 13 January 2023.

Of these, 6,140 were retweets, leaving 1,605 original tweets for analysis. Within this, 249 were original tweets from Professor Campbell, Professor Harris, or from Scottish Government accounts; they were typically comments to thank or encourage participants and are not included in the analysis. A sample of 1,356 relevant original tweets remained for analysis.

Analysts reviewed all 1,356 original tweets to determine if the comments pertained to the questions in the National Discussion. Almost three-quarters did not contain relevant information; they mainly encouraged participation in the National Discussion or thanked people for participating. The remaining quarter had information relevant to the analysis. These tweets were coded in a separate database, using the cross-cutting themes framework developed to analyse responses to the National Discussion.
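The tweet-filtering figures above reduce step by step from the downloaded database to the analysed sample. A worked check, using only the numbers stated in this section:

```python
# Reproducing the tweet-filtering arithmetic reported above.
total_tweets = 7_745
retweets = 6_140
project_account_tweets = 249  # Professors Campbell/Harris and Scottish Government

original_tweets = total_tweets - retweets
relevant_sample = original_tweets - project_account_tweets

print(original_tweets)  # 1605
print(relevant_sample)  # 1356
```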


A public consultation of this kind means anyone can express their views; individuals and organisations interested in the topic are more likely to respond than others. This self-selection means the views of participants do not necessarily represent the views of the entire population.

Reflecting the large number of participants in the National Discussion, it is impossible to detail every response in this report; some participants shared lengthy submissions reflecting their specific area of interest or expertise. Full responses to the consultation, where permission for publication was granted, can be found on the Scottish Government's website.

Where appropriate, quotes from a range of participants are included to illustrate key points and provide useful examples, insights and contextual information[8].

Qualitative analysis

The main purpose of consultation analysis is not to identify how many people held a particular view, but to understand the full range of views expressed. This means an insightful view expressed by a very small number of participants is not given less weight than more general comments shared by a majority.

This report lists the themes identified in responses from most to least commonly mentioned. Qualitative analysis of open-ended questions does not permit the quantification of results. However, to assist the reader in interpreting the findings, a framework is used to convey the most to least commonly identified themes in responses to each question:

  • 'many participants': a prevalent theme mentioned by more than one in five participants
  • 'several participants': a recurring theme raised by between one in 10 and one in five participants
  • 'some participants': a theme mentioned by between one in 20 and one in 10 participants
  • 'a few / a small number': a less commonly mentioned theme, raised by fewer than one in 20 participants
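The framework above amounts to mapping a theme's share of participants to a qualitative label. A minimal sketch; the exact boundary handling (which band a value falling exactly on a threshold belongs to) is our assumption, as the report does not specify it:

```python
# Sketch of the prevalence framework: a theme's proportion of participants
# maps to the label used in the report. Threshold boundaries are assumed.

def prevalence_label(proportion: float) -> str:
    if proportion > 1 / 5:
        return "many participants"
    if proportion >= 1 / 10:
        return "several participants"
    if proportion >= 1 / 20:
        return "some participants"
    return "a few / a small number"

print(prevalence_label(0.25))  # many participants
print(prevalence_label(0.15))  # several participants
print(prevalence_label(0.07))  # some participants
print(prevalence_label(0.02))  # a few / a small number
```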

Sub-group analysis

Responses to each question were reviewed to identify if themes were more or less prevalent between groups of participants. The primary variable in this analysis was the classification participants selected in the consultation survey, i.e. learner, parent, or teacher/education practitioner/school support staff. Differences between these groups are summarised in Chapter 6 and detailed for each question in Appendices C to L (see supporting documents).

The data was also analysed for any divergence in views between individual and organisational responses and between protected characteristics including sex, ethnic background, disability or long-term condition, sexual orientation and gender reassignment, and religion.

Equalities considerations

Chapter 5 presents comments drawn from across the National Discussion about the impact of reform on those with protected characteristics and other equalities issues.


