Scottish Household Survey 2020: methodology and impact of change in mode

The methodology report for the Scottish Household Survey 2020 telephone survey, which discusses the impact of the change in mode.

Executive Summary

The Scottish Household Survey is an annual survey carried out since 1999. It collects data on a wide range of topics not available from any other source, and is at the heart of the Scottish Government's evidence-based approach to policy. The social survey uses face-to-face in-home interviewing. It is followed by the physical survey, a dwelling inspection carried out by a surveyor team. The physical survey provides national estimates of the energy efficiency and condition of the domestic housing stock, and of fuel poverty.

In March 2020, fieldwork was suspended in response to the Covid-19 pandemic. Only a small proportion of the 2020 survey had been completed. The approach was adapted and the remainder of the 2020 social survey fieldwork was carried out using remote interviewing. The dwelling inspection fieldwork remained suspended.

This report describes the adaptations to the methodology for the 2020 social survey, and explores the impact of the change in approach on the survey estimates.

Adapting the approach for the 2020 social survey fieldwork

Until the pandemic, all interviews were undertaken in-home, face-to-face. Householders were sent a letter and leaflet in advance of an interviewer calling. Interviewers were required to make multiple visits to secure an interview at a sampled address. A sizeable proportion of addresses where the first interviewer did not secure an interview were revisited by another interviewer. This approach helped ensure that the Scottish Household Survey achieved a consistently high response rate. No respondent incentives were used.

The interview averaged 60 minutes, with the first part answered by a householder and the second by a random adult in the household. A wide range of topics was covered, including the composition, characteristics, attitudes and behaviour of Scottish households and individuals.

The revised approach used the addresses that had not been worked when interviewing was suspended. Telephone matching was undertaken so that interviewers could seek agreement to interview at some addresses by telephone. This involved matching names and telephone numbers to addresses using publicly available sources, such as the electoral register and the telephone directory. Matching was successful for 23% of addresses.
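As an illustration only (the report does not publish the matching procedure, and the field names and normalisation rule below are hypothetical), telephone matching of this kind amounts to looking up sampled addresses in a directory built from public sources:

    # Hypothetical sketch of address-to-telephone matching against public
    # sources. Not the actual SHS matching pipeline.

    def normalise(address: str) -> str:
        """Crude normalisation so trivially different spellings still match."""
        return " ".join(address.upper().replace(",", "").split())

    def match_telephone_numbers(sampled_addresses, directory):
        """Return {address: phone} for sampled addresses found in `directory`,
        a mapping from normalised address to telephone number built from, for
        example, the electoral register and the telephone directory."""
        matched = {}
        for address in sampled_addresses:
            phone = directory.get(normalise(address))
            if phone is not None:
                matched[address] = phone
        return matched

    # Example: one of three sampled addresses matches (the survey achieved a
    # 23% match rate overall). The number shown is from a fictional range.
    directory = {"1 HIGH STREET EDINBURGH": "0131 496 0000"}
    sample = ["1 High Street, Edinburgh", "2 High St, Glasgow", "3 Main Road, Ayr"]
    print(match_telephone_numbers(sample, directory))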

With no interviewer travel allowed, consent for interview came either from respondents opting in on receipt of the advance materials, or in response to an approach by telephone. After the initial mail-out, addresses where a telephone number had been obtained were followed up with a call. For addresses where a telephone number could not be obtained, two reminders were sent after the initial mail-out: a postcard reminder followed by a final letter reminder.
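The resulting contact strategy branches only on whether a telephone number was matched; restated schematically (an illustrative summary, not operational fieldwork code):

    def contact_sequence(phone_matched: bool) -> list[str]:
        """Illustrative summary of the 2020 SHS contact strategy."""
        steps = ["advance letter and leaflet, inviting respondents to opt in"]
        if phone_matched:
            steps += ["follow-up telephone call from an interviewer"]
        else:
            steps += ["postcard reminder", "final letter reminder"]
        return steps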

Respondents were given a conditional incentive of £20 for completing the interview, to encourage participation.

All interviews were undertaken remotely, either by telephone or video link. Video link interviews used one-way Microsoft Teams, where the respondent could see the interviewer but where the interviewer could not see the respondent. Most interviews were conducted by telephone.

Fieldwork was undertaken by interviewers from the SHS face-to-face interviewer panel. Fieldwork for the pilot was undertaken in October 2020 and the main stage between January and April 2021.

Where possible, questions, response options and format were kept the same as the face-to-face survey. Some adaptations were necessary, especially to questions that relied on showcards.

The weighting strategy was updated to mitigate the impact of different patterns of non-response.
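The report does not set out the weighting algorithm in this summary. As a minimal sketch of the general technique, iterative proportional fitting ("raking") adjusts respondent weights until weighted sample shares match known population margins; the categories and control totals below are invented for illustration:

    import numpy as np

    def rake(categories, targets, n_iter=50):
        """Adjust unit weights until weighted category shares match targets.

        categories: {dimension: array of category codes, one per respondent}
        targets:    {dimension: {category: target population share}}
        """
        n = len(next(iter(categories.values())))
        w = np.ones(n)
        for _ in range(n_iter):
            for dim, codes in categories.items():
                total = w.sum()
                for cat, share in targets[dim].items():
                    mask = codes == cat
                    current = w[mask].sum() / total
                    if current > 0:
                        w[mask] *= share / current
        return w

    # A sample over-representing owner-occupiers is weighted back toward an
    # assumed population tenure split of 60/40 (figures invented).
    tenure = np.array(["owner"] * 80 + ["renter"] * 20)
    w = rake({"tenure": tenure}, {"tenure": {"owner": 0.60, "renter": 0.40}})
    print(round(w[tenure == "owner"].sum() / w.sum(), 2))  # 0.6

In practice the SHS weighting is considerably more involved than this single-margin example, but the principle of adjusting for differential non-response is the same.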

Impact on estimates

As the SHS used a broadly consistent approach from its inception until the pandemic, any biases or errors are likely to have been consistent across time. This means that year-on-year changes in results are likely to have reflected real changes.

Any change in approach means that, in addition to any real change, estimates may be affected by a) changes to the profile of the responding sample (non-response bias) and/or b) changes to how questions are asked and answered (measurement error).

Change to the profile of the responding sample

Overall, where response rates are lower, there is greater potential for non-response bias.

The unadjusted[1] overall response rate achieved using the revised approach was 20%: 14% for the opt-in only sample and 37% for the telephone-matched sample. This compares with a response rate of 63% in 2019. The revised push-to-telephone/video approach not only resulted in a lower overall response rate, but also in considerably more variation across different types of area than the face-to-face in-home approach. Response rates were particularly low in the most deprived areas.
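These figures are internally consistent: weighting the two sample-level rates by the 23%/77% split of matched and unmatched addresses reproduces the overall rate, allowing for rounding.

    # Back-of-envelope check using the rounded figures quoted above.
    matched_share = 0.23      # addresses successfully telephone matched
    rr_matched = 0.37         # response rate, telephone-matched sample
    rr_opt_in = 0.14          # response rate, opt-in only sample

    overall = matched_share * rr_matched + (1 - matched_share) * rr_opt_in
    print(f"{overall:.1%}")   # 19.3%, consistent with the reported 20%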

Despite this, for most survey measures where major changes would not be expected, the estimates for 2020 were in line with those from 2019. However, there were a number of estimates where it is less plausible that the change from 2019 reflected a real change over time.

Among the household measures these were tenure (with an increase in owner-occupiers and a decrease in social renters) and length of time at current address (with an increase in the proportion who had lived at their address for over 15 years).

Among the random adult measures, highest educational attainment and satisfaction with local health services showed large differences compared to 2019. The increase in respondents with a degree or professional qualification is likely to be due to a different pattern of non-response compared to previous years. The increase in satisfaction with local health services could be genuine, resulting from the increased appreciation for the NHS that we have seen during the pandemic. However, it could also be driven, at least in part, by the change in approach.

For a range of other measures – such as the proportion of people feeling lonely, being able to rely on neighbours, cultural attendance and visits to the outdoors – there were notable changes from 2019. These changes were all plausible and could be attributable to the impact of the pandemic, although we cannot discount that the change in approach had some impact on comparability.

The estimates from the telephone-matched sample were further from the 2019 figures than those from the opt-in sample, with younger, higher-income householders, those in social and private rented housing, and those who had lived at their current address for a short period of time under-represented. Despite the response rate for the opt-in sample being considerably lower than that for the telephone-matched sample, its estimates generally appear closer to those from the 2019 wave. The one notable exception is educational attainment, where the opt-in sample appears further from the 2019 estimates than the telephone-matched sample: it over-represented those with degree-level qualifications compared with 2019.

Change in relation to how questions are asked and answered

With no interviewer travel allowed, interviews had to be undertaken remotely, either by telephone or by video. Overall, 16% of household respondents undertook the SHS interview by one-way video link, and 84% by telephone. Younger householders, those working, and those in privately rented accommodation, were more likely to undertake the interview by video.

The impact of mode on measurement error – how people respond to questions and whether their measured responses were accurate – is complex and difficult to disentangle from response patterns.

Mode of interview also differed considerably by mode of approach: 22% of the opt-in only sample undertook the household interview by video, compared with only 8% of the telephone-matched sample.
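Taken with the yields implied by the response rates above, these proportions also suggest roughly how the responding sample split between the two routes. The calculation below is a rough inference from rounded published figures, not a reported statistic:

    # If 22% of opt-in respondents and 8% of telephone-matched respondents
    # used video, and 16% did so overall, the opt-in share p of respondents
    # satisfies 0.22*p + 0.08*(1 - p) = 0.16.
    p = (0.16 - 0.08) / (0.22 - 0.08)
    print(f"{p:.0%}")  # ~57% of respondents came via the opt-in route

    # Cross-check from the response-rate arithmetic: opt-in yield 0.77*0.14
    # vs telephone-matched yield 0.23*0.37 gives an opt-in share of ~56%.
    print(f"{(0.77 * 0.14) / (0.77 * 0.14 + 0.23 * 0.37):.0%}")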

The design of the questionnaire was not optimal for interviewing by telephone or video, as it relied on interviewer facilitation to maximise participant engagement. The main challenge was adapting questions that relied on showcards. In video interviews, interviewers could present showcards via screenshare. An alternative strategy was needed for telephone interviews. Where a question was factual (e.g. ethnicity and educational qualifications), interviewers were instructed to read the question, wait for the respondent to answer, and then select the corresponding code(s). For questions where the range of response options was not obvious from the question itself, the interviewer was directed to read out all the response codes along with the question.

As well as differences in the visual cues given through showcards, there are a number of ways in which the revised modes of approach may have differed from the in-home, face-to-face approach in terms of the relationship between interviewer and respondent. These include the level of trust built, the level of attention throughout the hour-long interview, the ability of interviewers to pick up visual cues that questions had been misinterpreted, and whether other people in the household were influencing the answers given.

For a variety of the measures examined, there did not appear to be any differences by mode of interview. However, evidence of a mode effect was found for a number of estimates, such as the following (a minimal sketch of one way such differences can be tested appears after the list):

  • Educational qualifications. Video interviews appeared to measure the full list of qualifications held better than other modes, probably due to differences in the visual cues given.
  • Components of income. Interviews conducted by video had less missing data compared to interviews conducted by telephone.
  • Cultural attendance, cultural engagement, and sports participation. Estimates for these measures were higher among those interviewed by video than among those interviewed by telephone. This appeared to be independent of any impact of the different sample profiles.
  • Use of agree/disagree scales in questions on council services. There were fewer neutral responses (neither agree nor disagree, and don't know) in telephone interviews than in video interviews. This is likely to be due to differences in how the response options were presented in the absence of showcards.
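The report does not specify the statistical tests used. As a minimal sketch of one way a mode difference can be assessed, a two-proportion z-test compares an estimate between video and telephone respondents; the counts below are invented, and in practice any such comparison would also need to control for the differing sample profiles noted above:

    from math import sqrt
    from statistics import NormalDist

    def two_proportion_z(x1, n1, x2, n2):
        """z statistic and two-sided p-value for H0: p1 == p2."""
        p1, p2 = x1 / n1, x2 / n2
        pooled = (x1 + x2) / (n1 + n2)
        se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        z = (p1 - p2) / se
        return z, 2 * (1 - NormalDist().cdf(abs(z)))

    # Hypothetical counts: cultural attendance reported by 60% of 400 video
    # respondents versus 50% of 2,000 telephone respondents.
    z, p = two_proportion_z(240, 400, 1000, 2000)
    print(f"z = {z:.2f}, p = {p:.4f}")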

Despite efforts to minimise measurement error, the analysis suggests that the mode of interview is likely to have had some effect on some estimates.

Conclusions

Most estimates were consistent with previous findings, or showed changes that were plausible and could be attributed to the impact of the pandemic. On the other hand, the analysis found evidence of changes to estimates for a number of key measures which appear to be driven by the change in approach.

This means that it is not possible to determine the extent to which any differences between 2020 and previous years represent genuine changes in people's views and experiences, as opposed to being due to changes in how the survey was carried out.

Difficulty in making comparisons between the 2020 survey and previous years does not mean that the data from the 2020 SHS is of poor quality. Mode effects do not necessarily imply changes in data quality, and examining results and breaking the analysis down by variables within the survey remains robust; the results simply cannot be compared with previous trend data.

All surveys are subject to different types of error and bias that cannot be fully addressed through weighting. Consistency of approach year on year helps to ensure that one year's results can confidently be compared to the next. In 2020, the pandemic forced the survey to change approach.

The results also provide evidence to feed into consideration of changing the approach for the survey in the future and adopting innovative methods. Any revised approach to the SHS needs to be robust over the long-term, as a change of approach may introduce an additional break in the time-series, making it difficult to compare results over time. The likely impact on the representativeness of the sample and the impact of mode(s) of interview on measurement error should be considered as part of any potential move away from in-home interviewing to remote interviewing. And any cost savings should be weighed against any likely impact on the accuracy of estimates.

Contact

Email: shs@gov.scot
