Scottish Household Survey 2020: methodology and impact of change in mode

The methodology report for the Scottish Household Survey 2020 telephone survey, discussing the impact of the change in mode.


Chapter 7: Summary and conclusions


Since 1999, fieldwork for the Scottish Household Survey has been conducted annually, with interviews undertaken throughout the year. It has used random pre-selected face-to-face interviewing in people's homes. In early 2020, the Covid-19 pandemic hit and fieldwork was suspended. A revised push-to-telephone/video approach was developed, piloted and adopted for the remainder of the 2020 sample. This approach involved no interviewer travel, and surveys were conducted remotely either by telephone or by video.

The change in data collection method from the traditional face-to-face interviewing to the push-to-telephone/video approach has the potential to change the accuracy of the estimates and introduce discontinuity into the data series.

  • The revised design relied on approaching respondents in a different way from previously. Instead of interviewers visiting addresses and persuading people on the doorstep to take part, either a) people opted in via an online portal in response to advance letters, or b) interviewers sought agreement by telephone for the portion of the sample for which telephone numbers had been successfully matched to the sampled address. The change in mode of approach may have shaped the profile of people who agreed to take part.
  • Additionally, the mode by which interviews were undertaken also changed. All interviews pre-lockdown were conducted face-to-face in-home. With no interviewer travel allowed, interviews in the revised design were conducted either by telephone or by one-way video (so that the respondent could see the interviewer, but the interviewer could not see the respondent). The change in mode of interview may have shaped how people respond to questions.

The unadjusted[40] overall response rate achieved using the revised approach was 20%: 14% for the opt-in only sample and 37% for the telephone-matched sample. This compares with a response rate of 63% in 2019. There was considerably more variation in response rates across different types of area than under the face-to-face in-home approach. Response rates were particularly low among those in the most deprived areas.

After calibration weighting, for most measures where major changes would not be expected, the estimates were in line with those from 2019. However, there were a number of estimates where the level of change is less likely to reflect a plausible change over time. Among the household measures, these were tenure (with a sizeable increase in owner-occupation and a sizeable decrease in social rented housing) and length of time at their property (with an increase in the proportion who had lived at their address for over 15 years). Among the random adult measures, highest educational attainment and satisfaction with local health services showed large differences compared to 2019.
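Calibration weighting of the kind described above can be sketched as iterative proportional fitting ('raking'), in which respondents' weights are scaled so that the weighted sample matches known population margins. The sketch below is illustrative only: the cell counts and population targets are hypothetical, not actual SHS figures, and a production survey would use more margins and a convergence check.

```python
# Minimal sketch of calibration by raking (iterative proportional fitting).
# All counts and targets below are hypothetical, for illustration only.

def rake(cells, row_targets, col_targets, iters=50):
    """Scale a 2-D table of weighted counts so that its row and column
    sums match the target population margins."""
    w = [row[:] for row in cells]
    for _ in range(iters):
        # Scale each row so its sum matches the row target.
        for i, target in enumerate(row_targets):
            s = sum(w[i])
            w[i] = [v * target / s for v in w[i]]
        # Scale each column so its sum matches the column target.
        for j, target in enumerate(col_targets):
            s = sum(w[i][j] for i in range(len(w)))
            for i in range(len(w)):
                w[i][j] *= target / s
    return w

# Hypothetical respondent counts: rows = age band, columns = tenure.
sample = [[40.0, 10.0],   # under 45: owner-occupied, rented
          [35.0, 15.0]]   # 45 and over: owner-occupied, rented
weighted = rake(sample, row_targets=[55.0, 45.0], col_targets=[60.0, 40.0])
```

After raking, the weighted table reproduces the age-band and tenure margins simultaneously, which is how post-survey weighting compensates (partially) for differential response across groups.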

For a range of other measures – such as an increase in the proportion of people feeling lonely, in being able to rely on neighbours, in cultural attendance and in visits to the outdoors – there were notable changes in 2020. However, although we cannot discount that the change in approach had some impact on comparability, these changes were all plausible and may well be due to the impact of the pandemic.

Comparing the quality of estimates from the telephone-matched sample and the opt-in sample is difficult, due to different coverage (because of differences in the profile of addresses where we were able to obtain a matched telephone number). However, after corrective weighting, most estimates from the opt-in only sample were closer to the 2019 estimates than those from the telephone-matched sample. The sample with matched telephone numbers particularly under-represented younger householders, those in social rented and private rented housing, and those who had lived at their current address for a short period of time (this is likely to reflect patterns in landline usage in the telephone-matched sample). There is one notable exception: educational attainment, with the opt-in only sample over-representing those with degree-level qualifications.

Analysis of the impact of mode effects on who responds is also challenging, partly because of the various ways that mode can frame the interviewer-respondent relationship and because these effects are difficult to untangle from changes to the sample profile. Overall, 16% of household respondents undertook the SHS interview by one-way video link, and 84% by telephone. Younger householders, those working, and those in privately rented accommodation were more likely to undertake the interview by video.

On a variety of measures examined, there did not appear to be any differences by mode of interview. However, evidence of a mode effect was found in a number of estimates. This suggests that, despite efforts to minimise measurement error, the mode of interview is likely to have had some effect on some estimates.


All surveys are subject to various types of error and bias that cannot be fully addressed through weighting, such as non-response bias and differences in how questions are answered that are shaped by survey mode. Consistency of approach year on year helps to ensure that one year's results can confidently be compared to the next. In 2020, the pandemic forced the survey to change approach.

The analysis found that most estimates were consistent with previous findings, or showed changes that were plausible and could be attributed to the likely impact of the pandemic. It also found evidence of changes to a number of estimates of key measures that appear to be driven by the change in approach.

This means that it is not possible to determine the extent to which any differences between 2020 and previous years represent genuine changes in people's views and experiences, as opposed to being due to changes in how the survey was carried out. However, difficulty in making comparisons between the 2020 survey and previous years does not mean that the data from the 2020 SHS is poor quality, as mode effects do not necessarily imply changes in data quality.

More widely, the results provide evidence to feed into consideration of changing the approach for the survey in the future and adopting innovative methods.

The response rate for the revised approach was considerably lower than for the previous approach. Respondents were more likely to be older, living in less deprived areas, and in owner-occupation. Differences in response rates across different types of area were larger. While the data is weighted to try to mitigate these effects, an approach with a considerably lower and more variable response rate is likely to result in greater non-response bias and poorer quality estimates.

Face-to-face approaches are better at including 'harder to reach' respondents, such as those who are less affluent and less educated. At the heart of this is the role interviewers play in persuading people to take part in surveys, particularly reluctant respondents who are unlikely to take part in opt-in only surveys – those with lower literacy skills, those with busy and/or chaotic lifestyles, those who are wary of divulging information about themselves, those who are less civically engaged, and those who are less research-literate. Weighting will, at best, only partially mitigate this bias.

Moreover, these types of respondent are important not only for accuracy of survey estimates. They are also often the groups public policy initiatives are intended to reach, and of high interest to policy makers and survey analysts.

A 'knock-to-nudge' approach – where interviewers visit addresses to attempt to persuade people to take part face-to-face, but conduct the survey interview remotely – is likely to help ensure that more people from 'harder to reach' groups respond. This approach was not possible in 2020, because of the public health guidelines.

More generally, the findings reinforce some of the other reasons why face-to-face fieldwork has been considered the gold standard of survey methods. Compared with standard telephone and online surveys, there is very little coverage error, and this is likely to be stable over time. Interviewers can record deadwood. There is flexibility in where to target fieldwork effort, and the ability to direct resources in ways that increase precision and minimise bias. And during the interview, interviewers can act as a deterrent against respondents giving answers that require minimal effort.

With regard to mode effects and measurement error, there was some indication that interviewing by video provides more accurate estimates than interviewing by telephone – particularly for questions that rely on showcards, and those with a sizeable number of response categories. If a mixed-mode approach were employed, these impacts might be mitigated by additional questionnaire testing and development to minimise variation by mode. However, any such adaptations would also impact comparisons over time.

Additionally, the trade-off between non-response bias and measurement error in mode choice needs to be considered. Offering people a choice in how they take part may encourage participation overall, but may also lead to differences in measurement error: there is a tension between making sure that reluctant and busy respondents take part and ensuring that their responses are as accurate as possible. Video interviewing does appear to lead to more accurate estimates; however, so far only a small proportion of people have undertaken the survey via this mode. While remote modes could be combined with an in-home face-to-face approach (to provide a 'Covid-secure' approach for people who are uncomfortable undertaking the survey face-to-face), there are trade-offs to be made in how much flexibility is provided and how much the preferred mode is incentivised.

Any revised approach to the SHS needs to be robust over the long-term. A change of approach may introduce a break in the time-series, making it difficult to compare results over time. The likely impact on the representativeness of the sample, and the impact of mode(s) of interview on measurement error, should be considered as part of any potential move away from in-home interviewing to remote interviewing. And any cost savings should be weighed against the likely impact on the accuracy of estimates.


