Understanding survey nonresponse behaviours: evidence and practical solutions
This report summarises key findings from research to extend understanding of the challenges posed by nonresponse and nonresponse bias in the Scottish Government's general population surveys, and identifies potential solutions.
Summary
Response rates to surveys across many parts of the world have been falling for decades. In Scotland, response to the three main general population, cross-sectional Scottish Government surveys (the Scottish Household Survey (SHS), Scottish Health Survey (SHeS) and Scottish Crime and Justice Survey (SCJS)) fell steadily from 2012 to 2019, driven by increasing numbers of people refusing to take part. Response rates in 2022 and 2023 were lower still than pre-pandemic levels.
This report summarises findings from research aiming to extend understanding of the challenges posed by nonresponse and proposes testable potential solutions to these challenges. It focuses on the three main Scottish Government general population surveys, but many of the findings are likely to be relevant to other surveys.
The research included five main components: a desk-based literature review; analysis of response rates to the three main general population Scottish Government surveys; qualitative research with professional survey stakeholders, including survey leads and contractors and survey interviewers; qualitative research with the general public; and a workshop with key survey stakeholders.
The research was informed by the COM-B framework, which identifies three key components underpinning behaviour: capability, opportunity and motivation. As such, findings focus on what ‘drives’ individual respondents to participate in surveys (or not), rather than on wider contextual factors that impact response rates (such as interviewer recruitment).
Barriers and solutions: Capability
- Levels of awareness of surveys are very difficult to measure, but awareness of Scottish Government surveys among members of the public who participated in this research was generally very low. This may contribute to a climate that makes potential respondents less receptive to taking part.
- Potential solution: Targeted advertising to raise awareness of Scottish Government surveys and statistics. This could include location-based advertising (GP surgeries, council buildings, on public transport) and via social media (potentially in different languages).
- Cost/difficulty: Variable depending on nature/level of any campaign.
- Additional reflections: In terms of potential impact, it is important to note that few of those reached through any targeted advertising would subsequently be invited to take part in a Scottish Government survey. However, this advertising could form part of a wider strategy to improve understanding of the nature and value of Scottish Government research and statistics.
- Interviews with the public highlighted the challenge of getting potential participants to open and read advance letters. Letters addressed to the householder may be seen as ‘spam’ and logos on envelopes attract mixed reactions. The SHS advance letter was seen as long and specific elements were felt to be off-putting.
- Potential solutions: Test alternative envelope designs (using different logos/combinations of logos). Review the content of letters in light of feedback in this report.
- Cost/difficulty: Low.
- Additional reflections: Given the letters have been reviewed regularly over time there may be limited scope for significant further improvement, and even with improvements they may not be read.
- Disabled people interviewed for this research raised accessibility issues and concerns that might reduce their likelihood of taking part, including issues with the accessibility of the advance letter; concerns about whether the survey questions would be accessible; and concerns about whether survey interviewers would adapt to their needs, including wearing masks and offering flexible appointments.
- Potential solutions: Reviewing advance letters and materials against accessibility standards (low cost/difficulty). Commissioning an accessible design organisation to review advance letters and materials (medium cost/difficulty). Highlighting accessibility options more clearly in letters and on the doorstep (low cost/difficulty). Additional disability awareness training for survey interviewers (medium cost/low difficulty).
- Additional reflections: Analysis of nonresponse patterns does not indicate that disabled people are systematically underrepresented in surveys, possibly because of the correlation between disability and age (older people are more likely to be disabled and to take part in surveys). However, it is ethically important to remove barriers, regardless of the impact on nonresponse. A more radical, higher cost, potential solution to disabled people’s concerns about whether survey interviewers would adapt to their needs would be to offer different modes of participation – this is discussed later.
Barriers and solutions: Opportunity
- ‘Lack of time’ dominates stated reasons for nonparticipation in research. Though this may sometimes mask underlying barriers, willingness to give up an extended period of time does appear to influence views on taking part in Scottish Government surveys, particularly (though not only) for younger people.
- Potential solutions: Test different options for shortening the surveys. For example: offering a ‘core’ face-to-face survey with option to continue to a full interview (either face-to-face or by another mode); allowing survey interviewers to offer a much shorter ‘core’ interview to avoid refusals; or offering completion by other modes from the outset.
- Cost/difficulty: High – would involve significant trade-offs in terms of scope, completeness, and potentially data quality.
- Additional reflections: Views among both stakeholders and the public on whether the Scottish Government surveys should be shortened were mixed, though there was a clear preference among some younger participants in particular for a much shorter survey, suggesting that it may be worth testing one of these options. However, this would need careful testing to understand the trade-offs. Given the debate over the degree to which time is an actual or a perceived barrier, one option might be to test the impact of offering a very short initial survey, in the hope that this is sufficiently engaging that respondents then agree to carry on and do a longer main interview.
- Additional potential solutions: Clearly communicate and stick to the expected maximum time the interviewer will stay for (low cost/likely low impact, though could lead to more partial completions). Offer a guaranteed ‘exemption’ from other Scottish Government surveys for a period of time after participation (likely low cost/impact, though the implications of the loss of sample frame for other Scottish Government research would need to be considered).
- Stakeholders and survey interviewers felt civic engagement had weakened over time and that this may be contributing to higher nonresponse. However, the evidence for this from the literature and general public qualitative research was quite limited (perhaps inevitably, since those who agree to participate in research are themselves likely to be more engaged). At the same time, there was evidence that social norms may be shaping participation if people think that Scottish Government surveys are not something ‘people like them’ take part in. There was a clear stereotype that survey participants tend to be older, more informed, and potentially more ‘used to’ being listened to.
- Potential solutions: Create content (either for use as part of a wider campaign around Scottish Government surveys and statistics, or for use by survey interviewers on the doorstep) challenging ‘social norms’ around a ‘typical respondent’.
- Cost/difficulty: Variable, depending on the level/type of content involved.
- Additional reflections: This could form part of a wider strategy to improve public perceptions and understanding of government surveys and statistics.
Barriers and solutions: Motivation
- Analysis of patterns of response on SHS and SHeS over time, and feedback from survey interviewers, indicate that highlighting, in the contact details section of the advance letter, the option to opt out before an interviewer calls increases office refusals.
- Potential solutions: Avoid/remove this wording from advance letters.
- Cost/difficulty: Low, though it is important to balance this against the requirement to convey the optional nature of participation.
- Additional reflections: It is possible that some of these potential respondents may have been persuaded to take part on the doorstep, but it is arguable that those who opt out ahead are less likely to take part in any case, so removing this wording may simply convert office refusals to doorstep refusals.
- The methodology literature highlights the importance of both confidence and experience in predicting how successful survey interviewers are in persuading potential respondents to take part. The Covid-19 pandemic was associated with increased challenges around interviewer recruitment and retention, leading to challenges in quickly equipping new survey interviewers with the skills and confidence required.
- Potential solutions: Revisit training, particularly for new survey interviewers, focusing on key elements identified in the literature as critical, including: how to engage respondents quickly; how to tailor approaches; how to identify whether behavioural ‘scripts’ have been triggered, and how to dispel them; how to maintain doorstep interactions; increasing confidence; and specific refusal conversion techniques.
- Cost/difficulty: Medium, depending on the level of additional training required and how this is delivered.
- Additional reflections: While it is important to revisit interviewer training regularly, the main fieldwork providers in Scotland already invest considerably in this, and training cannot completely mitigate challenges around loss of experienced survey interviewers.
- ‘Social aversion’ and shifting expectations around how information is provided in the ‘online age’ were both associated with discomfort with the idea of having a stranger in your home. This was a particular barrier for younger participants, who expressed uncertainty around how to behave when ‘hosting’ an interviewer. There was some evidence that this discomfort has increased both as a result of the Covid-19 pandemic and the shift to many more interactions taking place online, which led some to regard face-to-face, in-home interviews as an unusual or even ‘creepy’ request.
- Potential solutions:
- A lower cost, more straightforward response would be to place more emphasis in messaging (in advance letters, on the doorstep, and more broadly in any public messaging around Scottish Government surveys) on explaining why they are conducted face-to-face, and what the benefit of this is (both for the respondent and the quality of the data).
- A more radical (and higher cost) solution would be to offer greater flexibility around how people complete the surveys. This could include proactively offering video calls, which participants suggested retained some of the benefits of face-to-face without needing to have a stranger in your ‘physical space’, and/or offering online or paper self-complete option(s) (which again, participants suggested as an alternative).
- Offering a greater degree of flexibility about where interviews take place – for example, offering to speak to respondents in a garden, garage, local coffee shop, or community space – might also help offset concerns about ‘hosting’ interviewers.
- Additional reflections: Providing clearer messaging on the reasons for face-to-face data collection may help address some participants’ concerns. However, if people are unwilling to host strangers in their homes, this approach has clear limits as a long-term solution. Offering other interview modes may have a bigger impact, but could also potentially lead to higher refusals (as it is easier to cancel video appointments than face-to-face ones, for example). There would also be other potential trade-offs (for example, reduced data quality and completeness and loss of time series data if some questions cannot easily be transitioned to another mode), so this would need to be tested carefully. Conducting interviews in less private venues would also need to be considered carefully, given the potentially sensitive content of some sections of the Scottish Government surveys.
- Survey interviewers reported that both a lack of trust in government and negative views of other public institutions, like the NHS and the criminal justice system, were increasingly cited on the doorstep as reasons for people not wanting to take part in the Scottish Government surveys. Survey data confirms that trust in government does appear to have reached a particular low point in the years since the pandemic (see for example National Centre for Social Research, 2024b, and the Ipsos Veracity index). However, members of the general public interviewed for this research placed more emphasis on whether government would act on the findings, rather than a lack of faith in government per se, as the reason they might not take part.
- Potential solutions: Place greater emphasis in survey communications on the survey being for Official Statistics and/or the research arm of the Scottish Government.
- Cost/difficulty: Low, as it primarily involves change of wording and/or logos (and could be easily tested via relatively inexpensive experiments with different logos, for example).
- Additional reflections: Qualitative interviews with the general public indicated that addressing trust in government in general is less critical than engaging specifically with concerns about how governments use survey data.
- Survey interviewers reported that the most common question they field on the doorstep is “will it make any difference?”. The general public expressed both a desire to feel they are contributing to improving things – for themselves, their community, their area, or society in general – and considerable scepticism about whether government surveys actually have any positive impact in practice.
- Potential solutions: Providing a wider range of examples and evidence of the uses of survey data in survey materials that speak to the interests of different groups. Rather than focusing only on examples of how findings shape policies (which may themselves be contested), an alternative or additional option would be to frame their usefulness in terms of holding governments to account. Building in a clearer ‘feedback loop’, so that respondents are informed of the findings from the year in which they took part, might also boost the sense of value around taking part.
- Cost/difficulty: Low – providing additional examples to interviewers should be inexpensive. Sharing findings with respondents would have a cost attached, but assuming this was done primarily by email this would again be relatively low.
- Additional reflections: Given the emphasis within the general public groups on understanding the impacts of taking part, this is an important area to develop.
- Assuring respondents about confidentiality and explaining how privacy is safeguarded has been shown to significantly increase willingness to take part in surveys (Couper et al, 2008, Singer and Couper, 2017). However, providing reassurance about data protection is becoming increasingly challenging, due to greater awareness and alertness to scams and fraud. Disabled participants and those from Black backgrounds also expressed concern about their data being misused by government or others in ways that might damage their communities.
- Potential solutions: Equipping survey interviewers with quick, tangible reassurances around data protection and misuse of data.
- Cost/difficulty: Low – again, providing interviewers with additional information should be low cost.
- ‘Survey overload’, caused by the growing volume of survey and feedback requests from both public and commercial bodies, creates a more challenging landscape. As a result, people find it more difficult to identify which requests are genuinely worth their time and effort.
- Potential solutions: Test ways of more clearly distinguishing Scottish Government surveys from ‘coffee cup’ surveys (a term used to refer to the feedback requests people receive after many everyday commercial transactions), including potentially changing their description from ‘survey’ to ‘important government study’.
- Cost/difficulty: Variable. Changing the description from ‘survey’ to ‘study’ would be inexpensive. However, if this were accompanied by an awareness raising campaign around the value of the Scottish Government surveys, this would be higher cost.
- Additional reflections: While it is important to consider how to respond to this barrier, any actions will be limited in their ability to affect the broader survey landscape, which is beyond Scottish Government control.
- The proliferation of commercial surveys may mean more people now expect a financial incentive or reward for taking part in surveys. General public participants in this research expressed mixed views on incentives – some were adamant they would not be willing to take part without an incentive, others felt the impact on society would be a greater motivation. There is strong evidence that incentives (particularly monetary ones) can boost response rates among underrepresented groups (e.g. see Mack et al, 1998; Singer & Kulka, 2002; McGonagle and Freedman, 2017).
- Potential solutions: Test the impact of incentives on response rates across the Scottish Government surveys. This could include testing both differential incentives (where incentives are targeted at those known to be less likely to respond – for example, those in deprived urban areas) and/or discretionary incentives, where survey interviewers are able to offer incentives to avoid a refusal or convert a previous refusal.
- Cost/difficulty: Medium-high, depending on the level of incentives, and how they are offered (targeted vs. universal).
- Additional reflections: Although survey interviewers express reservations about incentives, there is strong evidence from the literature that they can make a difference (see references cited above).
Conclusions
This research has systematically mapped potential barriers to taking part in surveys, drawn together the evidence for each, and developed practical potential solutions. In doing so, it aims to extend and deepen understanding of nonresponse, while recognising that it remains a complex issue and that there is no ‘silver bullet’ solution.
The solutions discussed fall into four main categories: incremental improvements to survey design and processes; improvements aimed at supporting and strengthening the fieldforce; more radical changes to survey design and mode; and efforts to improve the wider survey environment.
In deciding which solutions, or combinations of solutions, to take forward, the Scottish Government should consider:
- which may have the most impact on underrepresented groups (e.g. young people and those from minority ethnic backgrounds)
- which changes are also an ethical imperative (particularly those relating to accessibility), and
- the opportunities to develop solutions that individually or collectively address multiple barriers.
In assessing the impact of any solutions being tested, it will be important to review the degree to which they have succeeded in addressing the specific barriers they were intended to target, as well as their direct impacts on nonresponse levels, cost, data quality and completeness.
Contact
Email: surveystrategy@gov.scot