Behaviour in Scottish schools: research report 2023

This report is the fifth (2023) wave of the Behaviour in Scottish Schools Research, first undertaken in 2006.

Chapter 3 – Methodology

The research comprised a quantitative survey of headteachers, teachers and support staff and a programme of qualitative research with school staff and local authority representatives. The survey provided data on the frequency of different behaviours in schools and allowed changes over time to be tracked. The qualitative research explored staff experiences in depth to add context and aid understanding of the survey findings. It was also able to explore areas of relationships and behaviour in schools, and the impact of these, which were not captured by the survey.

Quantitative survey of headteachers, teachers and support staff

Questionnaire development

The questionnaire was largely based on the version used in the previous survey in 2016, retaining the key measures of behaviour that have been in place since 2006 as well as the questions introduced in 2016. Two new questions were added to assess staff perceptions of the impact of the COVID-19 pandemic on pupil behaviour within the classroom and around the school. A small number of other questions and response categories were updated.


A pilot was undertaken to test new and amended questions and to assess the ease of the process, including how schools randomly selected and invited staff to participate and how accessible the survey was to staff. The pilot took place between 10 and 20 January 2023. Support staff, teachers and headteachers from two primary schools and one secondary school took part. Feedback and questionnaire data suggested that the new questions were well received and captured what they aimed to measure. Adaptations were made to the survey layout on the web to improve accessibility, and some of the questions, such as the list of school subjects that staff teach, were updated in response to feedback. Survey information letters were also updated to advise that the survey could be completed on a smartphone but would take longer to complete that way. The final version of the online script and the paper version of the support staff questionnaire are provided in Annexes A and B.

Some changes were also made to the process of administering the survey in response to feedback. This included enabling the Key Contact in each school (the staff member in charge of inviting staff to take part) to email invitation letters to staff rather than having to hand them out. Feedback suggested that making this change would greatly increase response, especially in larger schools where it would be time consuming to hand these out to all the selected staff in person. This change required some adaptation to associated processes related to how staff accessed the survey.

Survey mode

As in 2016, the survey was conducted online with respondents having the option to complete the questionnaire on a device (PC, laptop, tablet or smartphone[29]) at school, at home or elsewhere. Sampled staff were provided with a web link to access the survey. Once the survey had been started, participants were given an access code that could be used to re-enter their questionnaire should they get interrupted.

Based on previous waves of the survey and feedback from the pilot, it was considered necessary to provide a paper version of the questionnaire as an option for support staff who did not have easy and confidential access to a school computer within their normal working day. Given the clear advantages of web completion, including ease and speed, higher quality data[30], cost savings and environmental benefits, online participation was encouraged for support staff where possible (and where privacy could be maintained). However, to maximise response, it was considered important to allow support staff the option to complete the survey online or on paper, depending on which was most convenient.

To further encourage response, especially at a time of industrial action among schoolteachers and staff, efforts were made to publicise the study and encourage participation through members of SAGRABIS (Scottish Advisory Group on Relationships and Behaviour in Schools), including COSLA (Convention of Scottish Local Authorities), ADES (the Association of Directors of Education in Scotland) and the main teaching unions.

Sampling and recruitment

All publicly funded, mainstream schools in Scotland were included in the sampling frame[31]. To achieve the required number of secondary school staff participating, all eligible secondary schools were sampled and invited to participate, resulting in an issued sample of 330 schools.

A total of 508 primary schools (out of a total of 2000) were sampled and invited to participate. A stratified random sampling approach was used to ensure that the selected schools were representative. Stratification was by size of school, urban/rural category and the proportion of the school roll living in the 20% most deprived areas of Scotland[32].
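The stratified selection described above can be sketched in a few lines. The frame, stratum values and proportional-allocation rule below are illustrative assumptions for the purpose of the sketch, not the study's actual sampling specification:

```python
import random

# Illustrative sampling frame: each school carries the three stratifiers
# described above (size band, urban/rural category, deprivation band).
# The category labels here are invented.
schools = [
    {"id": i,
     "size": random.choice(["small", "medium", "large"]),
     "urban_rural": random.choice(["urban", "rural"]),
     "simd_band": random.choice(["low", "mid", "high"])}
    for i in range(2000)
]

def stratified_sample(frame, n_total, keys):
    """Randomly sample about n_total units, allocated proportionally to strata."""
    strata = {}
    for school in frame:
        strata.setdefault(tuple(school[k] for k in keys), []).append(school)
    sample = []
    for members in strata.values():
        # Proportional allocation, rounded to the nearest whole school
        n_stratum = round(n_total * len(members) / len(frame))
        sample.extend(random.sample(members, min(n_stratum, len(members))))
    return sample

selected = stratified_sample(schools, 508, ["size", "urban_rural", "simd_band"])
print(len(selected))  # close to 508 (per-stratum rounding shifts it slightly)
```

Sampling without replacement within each stratum guarantees no school is selected twice, while proportional allocation keeps the sample's stratum profile close to the frame's.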

The headteacher was invited to participate in all sampled schools. Teachers and support staff were sampled in proportion to the number of teachers in the school[33].

Headteachers were sent an advance letter informing them about the survey and encouraging them to take part. They were asked to complete a brief online form to confirm if they were willing to participate and to provide contact details of the member of staff in their school they wished to nominate as a key contact for the study. Any that did not complete the form were contacted by ScotCen telephone interviewers to confirm whether they were willing to participate and to obtain details of their key contact. A relatively small number of schools (n=11) opted out of participating in the survey at this stage. The key contact was then sent full instructions on how to randomly select the appropriate number of teachers and support staff, together with electronic versions of survey invitation letters and (where required) paper versions of the questionnaires for support staff. A copy of the key contact instructions is provided in Annex C and an example of an invitation letter (the version for teachers) in Annex D.

Survey fieldwork was carried out between 27 February and 6 April 2023, with paper questionnaires accepted up until 12 April 2023.

Reminder calls and emails were made to survey key contacts in schools during the fieldwork period, and support was offered by ScotCen to any schools needing guidance on the steps required to take part. The first reminder call was to ascertain whether key contacts had received the survey pack and to remind them of the key tasks involved and the fieldwork period. The second call was made a few weeks into fieldwork to contact schools where fewer than 10% of sampled staff had completed the survey by that point.

Many of the questions asked about staff experiences over the last full teaching week, though a sub-set of questions asked about the number of incidents of serious disruptive behaviour against them in the last 12 months. The experiences of individuals will, to some extent, vary from week to week (e.g. in some weeks they may experience more positive behaviours than in others). However, the large sample size means that these variations should offset one another: those who experienced more positive behaviours than usual in the last teaching week are balanced by those who experienced fewer than usual. So, while the reports from some respondents will be ‘atypical’ for them as individuals, the overall picture of behaviour in schools across Scotland will be accurate. There may be some seasonal fluctuation in behaviours (e.g. relating to the weather, whether it is towards the beginning or end of a term, or the timing of exams). Fieldwork for this wave started slightly later than in previous waves[34], closer to pre-exam time and the Easter break. The fieldwork period also coincided with a period of industrial action by school staff, including several days of national and regional teacher strikes resulting in school closures, and was extended to help account for the resulting disruption. Both the industrial action and the change to the fieldwork period may have had some impact on the reported experiences of some staff.

Response rates

The response rates are shown in Table 3.1 below. The overall response rate in 2023 was 43%, down from 48% in 2016. It had been anticipated that the recent COVID-19 pandemic and industrial action among school staff would affect response. However, this trend also reflects a wider decline in response rates on almost all major social surveys in the UK and internationally over the last 10-15 years[35]. The response rate in 2016 had also fallen since 2012, which may have been partly due to the switch to online completion, competing demands among school staff and reduced capacity (including the loss of some posts) at the local authority level[36]. Response rates rose notably between 2009 and 2012, which may have been due to improved pre-survey publicity; the efforts of local contacts to encourage schools in their area to take part (particularly from Positive Behaviour Team link officers); the introduction of telephone calls to headteachers at the recruitment stage; and the introduction of key contacts in schools.

Differences in response rates between teachers and support staff have remained fairly consistent with the proportions achieved in 2016, taking into account the 5 percentage point reduction in overall response. The response rate among headteachers has fallen more substantially from the 2016 rate. This may be due, at least in part, to the fieldwork having taken place closer to pre-exam time and the Easter break, and to the impact of industrial action among school staff immediately before and during the beginning of fieldwork.

Table 3.1: Response among primary and secondary school staff in 2023 and previous waves
Staff category | 2023 selected sample | 2023 achieved sample[37] | 2023 response rate | 2016 response rate | 2012 response rate | 2009 response rate
Primary headteachers | 508 | 223 | 44% | 58% | 73% | 57%
Primary teachers | 1514 | 669 | 44% | 47% | 69% | 43%
Primary support staff | 1029 | 452 | 44% | 47% | 69% | 45%
Secondary headteachers | 330 | 134 | 41% | 53% | 70% | 65%
Secondary teachers | 3906 | 1689 | 43% | 46% | 61% | 43%
Secondary support staff | 1442 | 587 | 41% | 47% | 60% | 52%
Total | 8729 | 3754 | 43% | 48% | 64% | 47%
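As a quick arithmetic check, the overall 2023 response rate follows directly from the selected and achieved totals in Table 3.1:

```python
# Selected and achieved sample sizes by staff category, from Table 3.1
selected = {"primary_ht": 508, "primary_teachers": 1514, "primary_support": 1029,
            "secondary_ht": 330, "secondary_teachers": 3906, "secondary_support": 1442}
achieved = {"primary_ht": 223, "primary_teachers": 669, "primary_support": 452,
            "secondary_ht": 134, "secondary_teachers": 1689, "secondary_support": 587}

total_selected = sum(selected.values())   # 8729
total_achieved = sum(achieved.values())   # 3754
overall_rate = total_achieved / total_selected
print(f"{overall_rate:.0%}")  # 43%
```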

Overall, 525 schools took part in the survey out of the 838 schools invited.

All teacher and headteacher completions were carried out online. More support staff than expected completed the survey on paper, with around half (51%) of support staff completions carried out this way (Table 3.2). The paper completion rate is considerably higher than in 2016[38]. Whilst support staff were encouraged to participate in the 2023 survey online where possible, it was considered important to give them the option to complete the survey on paper should they prefer this for convenience, privacy or other reasons. There were some differences in the wording of the support staff materials between the two modes (the paper questionnaire is provided in Annex B).

Table 3.2 Number of support staff completions by questionnaire mode
Support staff survey completion mode | No. | % of total completions
Web | 510 | 49%
Paper | 529 | 51%
Total | 1039 | 100%


The survey data was weighted to control for the effects of sampling and to ensure the achieved sample more closely matched the population of schools and staff. The weighting method consisted of two stages: development of a pseudo-selection weight and calibration of the weight to population estimates of:

  • Staff role (head teacher, teacher, or support staff)
  • School type (primary or secondary)
  • Sex (male or female, head teachers and teachers only)
  • Working status (full-time or part-time, teachers only)
  • Contract status (temporary or permanent, teachers only)

The survey weighting has brought the weighted data close to population estimates, thus the survey data presented in this report is representative at a national level. Further detail of the weighting is provided in Annex G.
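The two-stage weighting described above can be illustrated with a minimal iterative raking (calibration) sketch. The respondents, margins and starting weights below are invented for illustration and bear no relation to the actual population estimates detailed in Annex G:

```python
# Minimal iterative raking sketch: adjust starting (pseudo-selection)
# weights until the weighted margins match population totals.
# All figures here are invented for illustration.
respondents = [
    {"role": "teacher", "school_type": "primary",   "weight": 1.0},
    {"role": "teacher", "school_type": "secondary", "weight": 1.2},
    {"role": "support", "school_type": "primary",   "weight": 0.8},
    {"role": "support", "school_type": "secondary", "weight": 1.0},
]
# Invented population totals for each margin (each margin sums to 100)
targets = {
    "role": {"teacher": 60.0, "support": 40.0},
    "school_type": {"primary": 45.0, "secondary": 55.0},
}

def rake(rows, targets, iterations=50):
    """Repeatedly scale weights so each margin matches its target totals."""
    for _ in range(iterations):
        for variable, goals in targets.items():
            for category, target_total in goals.items():
                current = sum(r["weight"] for r in rows if r[variable] == category)
                factor = target_total / current
                for r in rows:
                    if r[variable] == category:
                        r["weight"] *= factor
    return rows

rake(respondents, targets)
teacher_total = sum(r["weight"] for r in respondents if r["role"] == "teacher")
print(round(teacher_total, 2))  # 60.0 once the iterations converge
```

Adjusting one margin disturbs the others slightly, which is why the procedure iterates until the weighted totals settle on all targets simultaneously.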


Where differences between 2016 and 2023 and between sub-groups are reported, they are statistically significant at the 5% level. Any tables or figures showing sub-group differences only present variables where the difference was statistically significant. Statistically significant changes are referred to throughout the report as being ‘higher’ or ‘lower’ or an ‘increase’, ‘decrease’ or ‘decline’. The exception is Chapter 7, which uses the terms ‘significant’ or ‘statistically significant’ in relation to predictors of pupil behaviour because it presents the findings of a regression analysis, which differs from the other chapters.

Weighted datasets for previous waves were unavailable to the research team. As such, similar to 2016, a guide was developed to help determine whether changes in survey estimates between 2016 and 2023 were statistically significant (see ‘Notes to tables’ in the supplementary tables for Chapters 4 and 5). Calculations of statistical significance included an estimated design factor associated with the 2016 data. In reality, this design factor is likely to be an over-estimate for some variables and an under-estimate for others. Reported statistical significance of changes since the 2016 survey should therefore be treated as indicative only. Differences between 2016 and 2023 that are close to statistical significance are also noted throughout the report. In the absence of weighted datasets for each previous wave of the survey, the analysis of longer-term trends since 2006 has focused on patterns of change over time rather than on statistically significant changes. Figures on the longer-term trends were taken, where available, from previously published reports[39], [40], [41], [42]. Where 2006 figures were not available, some were taken from the 2016 longer-term trends charts and may therefore differ from the original figures by 1-2 percentage points. Given that the gap between the 2016 and 2023 waves of the survey was twice the length of the gap between most previous waves, a gap is shown in the x-axis of the charts in the longer-term trends section of this report.
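The kind of between-wave comparison described above, with standard errors inflated by an estimated design factor, can be sketched as follows. The sample sizes, percentages and design factor value are illustrative assumptions, not figures from the study:

```python
from math import sqrt

def z_between_waves(p1, n1, p2, n2, deft=1.2):
    """Two-proportion z statistic with standard errors inflated by an
    assumed design factor (deft); 1.2 is illustrative, not the study's value."""
    se = sqrt(deft**2 * p1 * (1 - p1) / n1 + deft**2 * p2 * (1 - p2) / n2)
    return (p2 - p1) / se

# Illustrative comparison: a 2016 estimate of 60% against a 2023 estimate of 55%
z = z_between_waves(0.60, 1500, 0.55, 1400, deft=1.2)
print(abs(z) > 1.96)  # significant at the 5% level if True
```

Because the design factor multiplies both standard errors, a change must be larger (here, by a factor of 1.2) to reach significance than it would under simple random sampling, which is why an over- or under-estimated design factor makes the reported significance indicative rather than exact.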

Figures presented within the report on responses to individual questions, or within sub-groups, may sometimes add up to 99% or 101% due to rounding. The 2023 figures including decimal places are presented in the supplementary tables. The 2016 figures in the supplementary tables do not present decimal places as these are taken directly from the 2016 tables which were rounded to 0 decimal places.

For some of the changes since 2016 discussed in the Impact of Behaviour and Approaches Used in Schools chapters, ‘Don’t know’ responses were included for both 2016 and 2023 to allow comparison, as they had been included in the 2016 tables. As ‘Don’t know’ would typically be excluded from these types of survey question, the 2023 findings presented at the beginning of these chapters exclude it, so some of these percentages may differ slightly from the 2023 figures presented in the changes-over-time sub-sections that follow. The supplementary tables for these two chapters (Chapters 8 and 9) show both the figures excluding the ‘Don’t know’ category (for 2023 only) and the figures including it (comparing 2016 and 2023).

Qualitative research with headteachers, teachers, support staff and local authority representatives

A programme of qualitative research was conducted between February and July 2023 to add context and detail to the survey findings and explore new and emerging issues in depth. The qualitative research comprised interviews with headteachers and teachers and focus groups with classroom-based support staff at 14 schools (6 primary schools, 8 secondary schools), and interviews with 30 local authority education representatives. Qualitative research with parents and pupils was not conducted as part of this study.

Interviews with local authority representatives

Interviews were conducted with representatives of 30 out of the 32 local authorities in Scotland between February and July 2023. The Director of Education for each local authority identified the most appropriate senior member of staff: someone with both strategic responsibility for, and a good overview of, policy and practice on behaviour in schools. As a result, LA representatives working in a number of different roles were interviewed, including those working in and leading inclusion services, quality improvement services and educational psychology. All interviews took place online and were conducted by a member of the ScotCen team.

Interviews lasted around one hour and were structured around a topic guide (Annex E). Interviews were audio-recorded, with the participants’ consent, and transcribed.

Fieldwork in schools

ScotCen staff undertook visits to 14 schools (8 secondary and 6 primary schools) during the summer term (April – June) of 2023. Visits to 13 schools were conducted in person by members of the research team and one was conducted online.

Schools were sampled from those that had taken part in the quantitative survey and where headteachers had given their consent to be contacted about further research. Sampling was conducted to ensure the inclusion of a range of schools varying on a number of factors including school size, deprivation (based on SIMD quintiles of catchment area), frequency of types of behaviour as reported in the survey, rurality, local authority, proportion of pupils with additional support needs, proportion of pupils for whom English is an additional language and proportion of pupils from a Black and minority ethnic background.

Selected schools were recruited by email and phone call. The timing of the fieldwork in the run up to the summer holidays meant that staff were very busy and recruitment was challenging. Headteachers in participating schools were asked to circulate details of the research to their staff, arrange a quiet, private space for the fieldwork to take place and schedule time slots for those staff to meet with a member of the ScotCen research team. In each school, the research team conducted an interview with the headteacher, interviews with 3-4 teachers and a focus group with classroom-based support staff (range 2-6 support staff; mean 3.75 per group). Verbal consent was collected from all participants at the time of the interview or focus group. The number of participants in the school fieldwork is shown in Table 3.3.

Table 3.3 Number of participants in the qualitative school fieldwork
Staff type | Primary | Secondary
Headteachers | 6 | 8
Depute headteachers | - | 1
Teachers | 15 | 31
Support staff | 19 | 29
Total (both school types) | 109

A flexible approach was offered to support schools to participate in the research. Some schools experienced challenges in releasing staff within the school day under the structure set out above. Therefore, the following exceptions were made:

  • In one school, individual interviews were conducted with support staff
  • In one school, a focus group was conducted with teachers
  • In one school, the headteacher and depute headteacher were interviewed together
  • One school was unable to release staff to take part, so only the headteacher was interviewed

The majority of fieldwork was conducted in-person on the school premises. Online interviewing was used on a small number of occasions to allow staff with other commitments on the day of the school visit to take part, for example, in instances of staff absence.

Interviews and focus groups were timed to fit around school periods, lasted between 45 minutes and an hour, and were structured around topic guides (Annex E). All were audio-recorded, with the participants’ consent, and transcribed.

All participating schools were offered a £100 donation to school funds as a thank you for participating in the research.

Analysis and interpretation

To manage the qualitative data systematically, NatCen’s Framework approach[43] was used in NVivo 12 and Microsoft Excel. A coding frame was developed to code the data into a number of categories. Within each category, a matrix was created in which each row represented a participant and each column a key theme. All available qualitative data was then summarised within the matrix. Once the raw data for each strand had been coded, it was analysed using a mixed deductive/inductive approach to thematic analysis.
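The Framework matrix described above, with a row per participant and a column per key theme, can be sketched as a simple data structure. The theme names, participant labels and cell content here are invented for illustration:

```python
# Framework-style matrix sketch: rows are participants, columns are themes,
# cells hold summaries of the coded data. All labels and content are invented.
themes = ["classroom behaviour", "around the school", "approaches used"]
participants = ["HT-01", "T-01", "T-02"]

matrix = {p: {t: "" for t in themes} for p in participants}
matrix["HT-01"]["classroom behaviour"] = "Reports low-level disruption has increased"

# Reading down a column gathers every participant's summary for one theme,
# which supports cross-case thematic analysis.
column = [matrix[p]["classroom behaviour"] for p in participants]
print(len(column))  # 3: one summary slot per participant
```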

Limitations of methodology

As noted earlier in this chapter, it is possible that the timings of the survey fieldwork for this wave (starting slightly later than in previous waves and closer to the pre-exam time and Easter break) might have had some impact on reported experiences of pupil relationships and behaviour. The coincidence of the survey fieldwork with a period of industrial action by school staff (including several days of national and regional teacher strikes resulting in school closures) may also have had some impact on the reported experiences of some staff.

There were limitations in relation to the contact strategy with schools immediately prior to and during the fieldwork period that could potentially be improved upon in future waves. For a number of schools, it was not possible to reach the nominated key contact via the contact details provided. In these cases, the survey materials were sent to the headteacher directly and email reminders were sent to the general school administrative team. Some schools reported later in fieldwork that the key contact had not received the pack on time, which had some impact on response rates. For future waves, consideration should be given to additional approaches for communicating with schools in which headteachers had opted to take part but subsequently did not respond to contact.

In this wave of the survey, as one measure to maximise overall response, key contacts in the schools could either hand out paper copies of the survey invitation or email it to the selected staff. The email invitations did not contain an individual-level access code for the survey, due to privacy issues and the need to simplify the process. Instead, a question was included in the survey to record which school each respondent worked in and their staff type, so that this could be compared with the number of expected survey completions from each school to inform weighting. In taking this approach there is a small risk that some respondents chose the incorrect school from the drop-down list, and a possibility of some over-representation if an individual responded more than once or if more staff were invited to take part than the survey instructions for that school specified. The potential impact of this on survey response has been reviewed and assessed as minimal.

There are limitations regarding the survey findings on staff experiences of abuse from pupils directed towards them and other staff due to protected characteristics. As staff demographics relating to protected characteristics other than gender[44] were not captured by the survey, it is not possible to ascertain whether the study accurately reflects the experiences of these demographic groups. Whilst we would expect proportionate representation of these groups within the sample given the approach taken to sampling[45], it is not possible to check the overall representation of each demographic group against the staff population within schools in Scotland. It is possible, though unlikely, that particular demographic groups might have been less likely to participate in the study for various reasons. It should also be noted that the prevalence of experiences reported by staff in the survey may be lower than the actual prevalence of such behaviour. Reporting of experiences may be influenced by a range of factors, for example desensitisation where certain types of behaviour are experienced often enough to be normalised, or are sometimes forgotten. Further research with school staff from different demographic groups, including those with protected characteristics, would be beneficial to explore the experiences of these staff.

As noted within Analysis, data from 2016 and previous waves were taken, where available, from previously published reports, due to previous weighted datasets not being available. This has limited the analysis of longer-term trends to reporting on the overall pattern rather than on statistical significance, and there are some caveats (as noted in Analysis) to the approach for interpreting the significance of differences observed since 2016. There are several longer-term trends charts in Chapter 5 for which it has not been possible to find exact figures for earlier waves directly from published reports; these have had to be taken from the 2016 charts, which do not contain data labels, and some figures may therefore differ from the original data by 1-2 percentage points. This is likely to have minimal impact on the patterns reported across the time series.

As noted in the introduction, a key strength of the BISS survey is the continuity of the time series since 2006. However, maintaining the time series through using the same questions, some of which were developed prior to 2006 or 2009, means that the terminology used in some questions is now out of date. Whilst this was noted by some staff in the survey pilot and in the qualitative research, this is unlikely to have impacted on the quality of the survey data. It could be beneficial to review whether any adaptations can be made for future waves whilst maintaining the key time series data. This is considered further in Recommendations for future iterations of the BISSR study in Chapter 11 Discussions and Conclusions.

It is possible that schools who engaged with the qualitative research were more likely to be those which were not experiencing staff shortages or time challenges, or which were coping well despite these challenges. However, it is important to note that many of the participants in the qualitative research described similar on-going challenges within their schools.

As befits a case study approach, the views of those in the 14 schools that took part in the qualitative phase are not representative of all schools across Scotland. However, the schools were purposively sampled from the 153 schools that took part in the survey element and whose headteachers gave their consent to be contacted about the qualitative research.

Due to the project budget, there was no scope to include the views of pupils and parents in this iteration of the BISS study. Doing so should be considered for future waves, as set out in the recommendations within the Discussion of this report.


