Overall, 264 of the 382 invited schools participated in the survey, equating to 1,182 of the 1,756 invited classes and 25,304 of a possible 31,147 pupils. All pupils who completed the survey were included in the analysis of results. However, due to differences in sampling, the schools completing the RCS boost were not included in the assessment of the overall response rate, discussed in more detail below. The response rate is therefore based on a total sample of 16,911 pupils.
In the sample included in the response rate, 235 of the 351 invited schools took part in the survey, giving a school response rate of 67%. The class response rate was 61% (822 classes out of the original sample of 1,345). Overall, 16,911 pupils completed the questionnaire, exceeding the target sample of 16,000. Based on the class response forms sent out to participating schools, this equated to a pupil response rate of 87%.
Prior to 2002, the survey was conducted across the whole of the UK, not just in Scotland. In previous years the response rate was calculated as the product of the school response rate and the pupil response rate; this changed in 2002, when the Scottish survey was separated from the English and Welsh survey. From that point on, the overall response rate was calculated as the product of the class response rate and the pupil response rate, with the exception of 2006. The overall response rate in 2015 was 53% (Table 1 and Figure 1).
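The calculation described above can be sketched as follows, using the 2015 figures reported in this section (a minimal illustration; the function and variable names are our own, not part of the survey's published methodology):

```python
# Sketch of the SALSUS response-rate calculation using the reported
# 2015 figures. Names here are illustrative, not official terminology.

def response_rate(achieved, invited):
    """Simple proportion of achieved over invited, as a whole percent."""
    return round(100 * achieved / invited)

school_rate = response_rate(235, 351)   # schools taking part: 67%
class_rate = response_rate(822, 1345)   # classes taking part: 61%
pupil_rate = 87                         # reported, from class response forms

# Since 2002 (except 2006), the overall rate is the product of the
# class response rate and the pupil response rate.
overall_rate = round(class_rate * pupil_rate / 100)

print(school_rate, class_rate, overall_rate)  # 67 61 53
```

Multiplying the 61% class rate by the 87% pupil rate reproduces the reported overall response rate of 53%.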
Figure 1: Response rates for SALSUS and predecessors: 2002-2015
Source: SALSUS 2002-2015
N.B. The response rate in 2006 was calculated differently from other years of SALSUS and is therefore not comparable. For this reason, it has been excluded from the response rate time chart.
Table 1: Response rates for SALSUS and predecessors: 1982-2015
| Survey year | School response rate | Class response rate | Pupil response rate | Overall response rate |
The overall response rate has dropped since 2013. This was due to a lower-than-expected response rate among schools completing the survey online. The response rate for the paper sample was broadly in line with that for 2013.
Most surveys are subject to possible bias due to non-response. Within this survey, non-response could occur at several levels: school and class non-response; pupil non-response; and item (question) non-response. The impact of non-response bias can be addressed through the use of weights, which is discussed in further detail later.
School and Class Non-Response
The extent to which school non-response leads to bias in the survey results will depend on the extent to which this leads to a systematic under-representation of schools with particular features, where those features are linked with the variables the survey measures. For example, smoking prevalence can be higher at schools with a high proportion of pupils living in areas of greater deprivation.
The overall school response rate was 67%. Table 2 compares the sample profile with pupil census information to allow assessment of possible non-response bias. This shows that the sample was representative in terms of school denomination and whether the school was independent.
However, there was some under-representation of S4 pupils (46% of the sample, compared with 51% of the population) and over-representation of 13-year-olds (54% of the sample, compared with 49% of the population).
Pupils in the sample were also more likely to be in rural areas than the population profile would suggest (22% of the sample, compared with 18% of the population), which could be indicative of bias.
There did not seem to be any other obvious differences between the schools that participated and those that did not (e.g. size of school). However, it is not possible to examine or quantify all potential sources of non-response bias. For example, schools that place a higher priority on substance use education may be more likely to take part, and may place that priority precisely because substance use is more of a problem among their pupils. In that case, the survey results would be biased by over-representing pupils who use substances. Alternatively, if the education is effective, the results would be biased by under-representing pupils who use substances.
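The weighting mentioned earlier corrects profile imbalances of the kind described above. As a sketch only (this is a generic post-stratification-style calculation, not the survey's actual weighting scheme), a group's weight can be taken as its population share divided by its sample share, using the percentages quoted in this section:

```python
# Illustrative post-stratification-style weights built from the
# sample/population comparisons quoted above. This is a sketch of the
# general technique, not SALSUS's actual weighting methodology.

profiles = {
    # group: (unweighted sample %, pupil census %)
    "S4 pupils": (46, 51),
    "13-year-olds": (54, 49),
    "rural pupils": (22, 18),
}

# Weight = population share / sample share, so under-represented
# groups are weighted up and over-represented groups weighted down.
weights = {group: pop / samp for group, (samp, pop) in profiles.items()}

for group, w in weights.items():
    print(f"{group}: {w:.2f}")
```

For example, under-represented S4 pupils receive a weight above 1 (51/46 ≈ 1.11), while over-represented rural pupils receive a weight below 1 (18/22 ≈ 0.82).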
Table 2: Comparison of sample profile with pupil census information
| 2015 Unweighted Sample % | 2015 Pupil Census % |
Pupil Non-Response
Pupil non-response within classes resulted from illness on the day of the survey, other absence (authorised or unauthorised) or refusal (by either the pupil or a parent). To maximise the response from pupils and limit any bias, teachers were asked to administer questionnaires to absent pupils at a later date. This led to a very high pupil response rate of 87%.
Item Non-Response
Item non-response occurs where respondents do not answer some questions. If item non-response is systematic in any way, i.e. if some groups of respondents are for some reason less likely than others to answer a particular question, there is the potential for bias in the results.
The level of item non-response in the survey was generally low: of the 90 questions in the survey, just seven had non-response of 10% or more (Table 3). Item non-response is therefore unlikely to have greatly affected the results.
The level of item non-response was considerably higher in the online mode than in the paper mode; this is discussed in detail in the Mode Effect Study Results report. However, the differing levels of item non-response did not have a statistically significant impact on the results of the key prevalence measures (and, consequently, did not affect the trends).
By far the highest level of item non-response was at the postcode question. It was high in both modes but, again, higher online than on paper. It was clear from the qualitative research with pupils, conducted during the online pilot, that there was considerable concern that providing a postcode (particularly in combination with month and year of birth) would make them identifiable. This may explain the higher non-response to the postcode question in the online mode. However, it is unclear why this would be more of a concern for those completing the survey online than on paper. One possibility is a sense that electronic data are more easily manipulated and matched with other electronic data. Although not identified in the pilot, another possibility is that pupils may be aware of warnings (from school and elsewhere) to be very careful about the types of personal data they submit online.
Table 3: Item non-response where proportions were equal to or greater than 10%
| Question | Item | Base | Non-response |
| Q46 - How much do you think your father/carer really knows about…? | How you spend your money | 16,911 | 10% |
| | Where you are after school | 16,911 | 10% |
| | Where you go at night | 16,911 | 10% |
| | What you do with your free time | 16,911 | 10% |
| Q73 - How old were you when you first…? | Got drunk | 16,911 | 19% |
| Q78 - Do you know the postcode for your home address? | | 16,911 | 43% |
| Q84 - To what extent do you agree or disagree with the following statements? 'My school provides me with enough advice and support about…' | Drinking alcohol | 16,911 | 10% |
| | Leading a healthy and active life | 16,911 | 10% |
| Q85 - Thinking about the future, how confident do you feel about…? | Saying no to doing something that you don't want to do | 16,911 | 10% |
| | Knowing where to go for information and support about substance related issues | 16,911 | 10% |
| | Avoiding getting into risky situations due to alcohol | 16,911 | 10% |
| | Avoiding getting into risky situations due to drugs | 16,911 | 11% |
| Q88 - In the past year, how many times did you skip or skive school? | | 16,911 | 10% |
| Q90 - Strengths and Difficulties Questionnaire | All items | 16,911 | 11-14% |