3 Did the mode affect the representativeness of the survey?
This chapter first looks at the response rates for the online and paper samples before going on to discuss the impact this had on the sample profiles.
In survey research, higher response rates tend to increase representativeness, i.e. the extent to which the achieved sample reflects the population of interest (in this case, 13 and 15 year olds across Scotland). That said, lower response rates are only problematic if those who respond to the survey differ systematically from those who do not, in ways that bias the results.
This is particularly important for the mode effect study as, in order to identify a mode effect, the online and paper samples needed to be comparable. If they were not, then it would not be possible to know whether any differences in the key measures were due to the mode or to a difference in the sample profiles.
Response rate - key findings
- The school, class and overall response rates were lower in the online sample than in the paper sample
- There were no differences in pupil response rate between the online and paper samples
- Differences in response rate between the online sample and the paper sample varied by local authority
School response rates
At the start of recruitment, 175 schools were invited to complete the survey online and 176 schools were invited to complete it on paper. Eight schools did not want to complete the survey online but were willing to do so on paper, so they were moved to the paper sample, and seven schools from the paper sample were swapped into the online sample to replace them. The final online sample therefore contained 174 schools and the final paper sample contained 177 schools.
The school response rate was 72% for the paper sample (128 out of 177 schools) and 61% for the online sample (107 out of 174 schools). This shows a clear difference in the school response rate between the paper sample and the online sample. While the paper response rate is in line with that of the last wave of the survey in 2013, the online response rate is 11 percentage points lower (see Table 3.1). This suggests that the prospect of completing the survey online is less appealing to head teachers and there is potential for lower response rates in future waves if the survey moves online.
Table 3.1 Response rates broken down by mode and year
| | 2013 paper | 2015 paper | 2015 online | 2015 overall |
|---|---|---|---|---|
| School response rate | 71% | 72% | 61% | 67% |
| Class response rate | 68% | 68% | 53% | 61% |
| Pupil response rate | 90% | 88% | 87% | 87% |
| Overall response rate | 60% | 60% | 46% | 53% |
- The school response rate shows the proportion of schools in each sample that agreed to participate in the survey, and then administered the survey to at least one class. It does not include those who initially agreed but did not complete any surveys.
- The class response rate shows the proportion of the total number of sampled classes that participated in the survey. As class participation is dependent on the school completing the survey, it is largely driven by the school response rate. However, it also takes into account where some sampled classes in a participating school completed the survey and others did not.
- The pupil response rate is the proportion of pupils in participating classes who completed the survey.
- The overall response rate is the product of the class response rate and the pupil response rate. As the survey uses a class-based sample design, the class response rate, rather than the school response rate, is used.
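As a quick arithmetic check of these definitions, the 2015 paper figures from Table 3.1 can be reproduced in a few lines (a minimal sketch; the variable names are illustrative):

```python
# 2015 paper-sample rates from Table 3.1
class_rate = 0.68  # proportion of sampled classes that took part
pupil_rate = 0.88  # proportion of pupils in participating classes who responded

# Overall response rate = class rate x pupil rate, since a pupil only counts
# as a respondent if their class took part AND they completed the survey
overall_rate = class_rate * pupil_rate
print(f"Overall response rate: {overall_rate:.0%}")  # 60%, matching Table 3.1
```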
Schools completing the survey online were also less likely to complete the survey once they had agreed to take part than schools in the paper sample. Eighty-seven per cent of schools in the paper sample who originally agreed to participate went on to complete at least one class, compared with 76% of schools in the online sample. This suggests that it is more difficult for some schools to complete the survey online.
Some local authority areas had the same or similar response rates for both modes but others differed greatly (see Table 3.2). It should be noted however that, due to the small number of schools in some local authorities, the participation of even one school can influence response rates greatly. Perth & Kinross, Falkirk and City of Edinburgh all had particularly low online response rates when compared with their paper response rate. In contrast, Scottish Borders, East Lothian and Argyll & Bute had higher online response rates. There was also a higher online response rate in Glasgow. The relatively high number of pupils in Glasgow means that this may have had some impact on the sample profile.
Table 3.2 School response rates by local authority

| Local authority | Paper: sampled | Paper: completed | Paper: % completed | Online: sampled | Online: completed | Online: % completed |
|---|---|---|---|---|---|---|
| Argyll & Bute | 5 | 2 | 40% | 4 | 3 | 75% |
| Dumfries & Galloway | 8 | 3 | 38% | 8 | 5 | 63% |
| Edinburgh, City of | 17 | 13 | 76% | 17 | 6 | 35% |
| Perth & Kinross | 7 | 5 | 71% | 7 | 1 | 14% |
Class response rates
The class response rate is driven to a large extent by the school response rate, so it is inevitably lower for the online sample than for the paper sample (see Table 3.1). In total, 345 of the 649 invited classes completed the survey online and 477 of the 705 invited classes completed it on paper, meaning that 822 of the 1,354 sampled classes completed the survey.
The fieldwork monitoring figures suggest that, in addition to fewer schools participating in the online survey, those that did were less likely to follow up on missing classes. One of the main findings from the electronic pilot was that completing the survey in ICT suites required a greater amount of advanced planning than completing the paper survey. It could be that quickly following up on missing classes in response to a reminder from the survey contractor is less feasible when using an online methodology. Furthermore, some schools had prelims for the National 5 and 6 courses in January (the fieldwork contingency period) which meant that many of the ICT suites had already been booked.
Pupil response rates
A total of 7,125 out of a possible 8,231 pupils completed the survey online, and 9,786 out of a possible 11,170 pupils completed it on paper, giving 16,911 completions out of 19,401 pupils overall. There was no difference in pupil response rate by mode (see Table 3.1).
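These headline percentages follow directly from the raw counts quoted above; a quick check (variable names are illustrative):

```python
# Raw pupil counts reported in the text
online_completed, online_possible = 7_125, 8_231
paper_completed, paper_possible = 9_786, 11_170

online_rate = online_completed / online_possible  # ~86.6%, rounds to 87%
paper_rate = paper_completed / paper_possible     # ~87.6%, rounds to 88%
total_rate = (online_completed + paper_completed) / (online_possible + paper_possible)

print(f"Online {online_rate:.0%}, paper {paper_rate:.0%}, overall {total_rate:.0%}")
```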
The next section looks at the sample profile of those completing the survey online, compared both with those completing on paper and with the national profile.
While the overall response rate is lower for the online sample than the paper sample, this is not necessarily an issue. Lower response rates are mainly a problem if this introduces bias into the results by skewing the profile of respondents in ways that affect the survey measures.
Sample profile - key findings
- There were some differences between the online and paper samples in relation to the Scottish Index of Multiple Deprivation (SIMD), school sector, school denomination and rurality. However, these differences disappeared when school clustering was taken into account.
- While the online and paper samples differed from each other to some extent, both were reasonably in line with the national profile
- There was greater variation at a local level, with some areas more likely to complete the survey on paper and others more likely to complete the survey online
There were some differences between the online and paper samples (Table 3.3), but these differences were related to how pupils were clustered within schools. This was mainly due to the fact that schools in some local authorities were more likely to agree to the online survey and others to the paper survey (see Table 3.4). However, once the clustering of schools was taken into account, there were no substantial differences in pupil characteristics as a result of data collection method.
Table 3.3 Sample profile

| SIMD quintile | Scotland | 2015 paper | 2015 online | 2015 combined |
|---|---|---|---|---|
| 1 - most deprived | 20.1% | 17.8% | 25.3% | 21.0% |
| 5 - least deprived | 21.2% | 25.1% | 22.2% | 23.9% |
Chi-squared tests were conducted to explore any differences between the samples. There were no differences between the online and paper samples in terms of the age or gender of pupils, but there were statistically significant differences in relation to SIMD, school sector, denomination and rurality.
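The report does not show the underlying cross-tabulations, but a test of this kind can be sketched with scipy on hypothetical counts (the figures below are invented purely for illustration; the real analysis used the actual sample cross-tabulations):

```python
from scipy.stats import chi2_contingency

# Hypothetical pupil counts by SIMD quintile (columns: quintiles 1-5);
# these numbers are made up for illustration only
observed = [
    [1742, 1850, 1990, 2053, 2456],  # paper sample
    [1803, 1410, 1380, 1310, 1582],  # online sample
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.1f}, df = {dof}, p = {p_value:.4g}")
```

Note that an ordinary chi-squared test treats pupils as independent observations. Because whole schools were assigned to a mode, pupils are clustered, which is why the subsequent regression analysis also took clustering into account.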
A regression model was used to check whether gender, age, school sector, denomination, rurality and SIMD were significantly related to the probability of having completed the survey online or on paper, while taking into account that pupils were clustered within schools. The results show that these variables did not have a significant impact on the chance of having completed a questionnaire online or on paper.
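The report does not name the exact model or software used. One plausible sketch is a logistic regression of mode on pupil characteristics with school-clustered standard errors, shown below on synthetic data (all variable names and data here are illustrative, and the original analysis may instead have used a multilevel model):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n_schools, pupils = 40, 50

# Synthetic pupil-level data; mode is assigned at school level, so all
# pupils within a school share the same outcome value, as in the survey
df = pd.DataFrame({
    "school": np.repeat(np.arange(n_schools), pupils),
    "female": rng.integers(0, 2, n_schools * pupils),
    "year_s4": rng.integers(0, 2, n_schools * pupils),
})
df["online"] = np.repeat(rng.integers(0, 2, n_schools), pupils)

# Logistic regression of survey mode on pupil characteristics, with
# standard errors clustered by school to reflect the sample design
result = smf.logit("online ~ female + year_s4", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["school"]}, disp=False
)
print(result.summary())
```

A non-significant coefficient on a pupil characteristic here would indicate, as in the report, that the characteristic is unrelated to the probability of completing the survey online once clustering is accounted for.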
It is important to note that while the two samples differed on some characteristics, both were reasonably in line with the national profile. For some sample characteristics (rurality, denomination and year group), the online sample was more representative than the paper sample.
Table 3.4 shows the local authority profile of Scotland and of the combined, paper and online survey samples. For most local authorities, the likelihood of completing the survey on paper or online was similar. However, there were a number that showed more notable differences: City of Edinburgh, Perth & Kinross and, to a lesser extent, North Lanarkshire were all more likely to complete the survey on paper than online. Glasgow and, to a lesser extent, Fife were more likely to complete the survey online than on paper.
Table 3.4 Local authority profile

| Local authority | Scotland | 2015 combined | 2015 paper | 2015 online |
|---|---|---|---|---|
| Argyll & Bute | 1.9% | 1.5% | 1.6% | 1.4% |
| Dumfries & Galloway | 3.6% | 2.2% | 1.8% | 2.8% |
| Perth & Kinross | 3.4% | 4.6% | 7.6% | 0.7% |
Moving to online administration may result in a reduced response rate, but the analyses suggest that this would not affect the representativeness of the sample at a national level. A greater impact was, however, demonstrated at a local level. SALSUS has tended to be run every two years, providing local level results (with a larger sample size) every other wave (i.e. every four years). The local level differences have implications for how the survey is administered, depending on whether national or local level results are required.