Understanding survey nonresponse behaviours: evidence and practical solutions

This report summarises key findings from research to extend understanding of the challenges posed by nonresponse and nonresponse bias in the Scottish Government's general population surveys, and identifies potential solutions.


5. Conclusions

As discussed in the introduction to this report, survey methodologists have been concerned with combatting nonresponse for many decades, and interest has only intensified as response rates have fallen. However, discussion of the underlying drivers of nonresponse has often been hypothetical and speculative. Applying a behavioural lens to understanding nonresponse has enabled this report to systematically map potential barriers that prevent people from taking part in surveys. The report draws together data from multiple sources, including research with underrepresented groups themselves, to examine the evidence for each of the barriers identified. In doing so, it is hoped that it contributes to extending and deepening understanding of nonresponse, as well as providing practical, grounded suggestions for how the Scottish Government might reduce nonresponse in its general population surveys.

In considering which solutions to develop further, it is important to keep in mind that there is almost certainly no ‘silver bullet’ that might return nonresponse to the levels seen in previous decades. The impact of factors like changing patterns of home working and the proliferation of commercial surveys on nonresponse may be impossible to fully mitigate. It is also important to be aware that survey methodologists increasingly recommend considering other indicators of survey quality (such as demographic representativeness against benchmark figures) instead of, or alongside, response rates. It would be timely for the Scottish Government to reconsider how it reports on survey quality across the three general population surveys, particularly in the light of the recent Survey Futures statement on this topic (Maslovskaya et al, 2025).

However, none of this negates the importance of understanding the drivers of, implications of, and potential solutions to nonresponse – rather, it highlights the need for a well-evidenced and nuanced discussion, to which it is hoped this report contributes.

Choosing and testing potential solutions

The potential solutions discussed in this report fall into four overarching categories. Each of these addresses nonresponse from a different angle, and brings different challenges in terms of ease of implementation, as well as potential impact.

  • Incremental improvements to survey design and processes. Solutions that fall into this category include those relating to redesigning advance letters, testing of logos and envelope designs, and refining interviewer messaging on the doorstep. These are generally reasonably low cost and straightforward to implement, at least individually. Their individual impacts on nonresponse may be modest, particularly given the considerable effort that has already been invested by commissioners and contractors in delivering the three surveys to a high standard. However, the cumulative impact of multiple incremental improvements may nonetheless be more substantial. Whenever possible, experiments should be used to test the impacts of individual changes, to identify the optimum combination to roll out. There may be opportunities to combine efforts and test different interventions on each of the three surveys at the same time, to maximise learning.
  • Improvements aimed at supporting and strengthening the fieldforce. Given the critical role of survey interviewers in face-to-face fieldwork, effective training, particularly for new survey interviewers, is essential. Specific areas that should be addressed across the sector to help survey interviewers overcome the barriers identified in this report are outlined in previous chapters. Additionally, training should aim to boost survey interviewers' confidence and their belief in their ability to convince people to participate.
  • More radical changes to survey design and mode. While incremental improvements to survey materials and interviewer training may go some way to addressing specific respondent concerns, some of the societal shifts identified in this report may be more resistant to incremental solutions. More radical changes – such as substantially shortening the survey, breaking it into different parts, offering different modes of completion, or offering differential or discretionary incentives – carry much higher costs and risks. However, they could also have a more significant impact in terms of encouraging groups who are reluctant or unable to take part in the surveys in their current form to participate. The trade-offs and risks associated with changing and mixing modes have been discussed in detail in a previous Scottish Government report (Ormston et al, 2024). That report should be read alongside this one when considering which, if any, of these more radical changes to develop and test. It is particularly crucial that these more radical changes are only introduced following carefully designed and controlled experiments to test their impacts.
  • Efforts to improve the wider survey environment. A fourth group of potential solutions focuses less on persuading those sampled to take part and more on improving the ‘receptiveness’ of the public as a whole to Scottish Government surveys and statistics. Influencing the wider context in which surveys take place is arguably very difficult, and the impact on nonresponse uncertain. However, it may nonetheless be important to develop more consistent narratives and messaging around Scottish Government surveys and statistics to combat public distrust or indifference.

In determining which solutions to take forward, cost, complexity and trade-offs in terms of data quality will inevitably need to be weighed. Three additional, interrelated questions the Scottish Government may wish to consider are:

  • which potential solutions may have the most impact on underrepresented groups (e.g. young people and those from minority ethnic backgrounds)
  • which changes are also an ethical imperative (particularly those relating to accessibility), and
  • where there are opportunities to develop solutions that individually or collectively address multiple barriers.

We discuss each of these briefly in turn below.

Increasing participation among underrepresented groups

Analysis of differences in response rates to the three main Scottish Government general population surveys conducted for this research confirms that response rates remain lower (and, in some cases, have fallen more than average) among: younger people; people from minority ethnic backgrounds; and those with lower levels of education. These are all groups that are of particular interest to the Scottish Government, as they are among the groups known to experience more negative outcomes across a range of areas. Improving their response rates would not only reduce overall nonresponse bias, but also provide valuable additional data about their experiences.

This report highlights various solutions that could, potentially, help target barriers to survey participation among young people, people from minority ethnic backgrounds, and those with lower levels of education. For example:

  • Offering a shorter interview; providing alternative modes of completion; and offering incentives all found favour among young people interviewed for this research.
  • The literature also indicates that offering incentives tends to have a disproportionate impact on response among underrepresented groups, including young people, those from minority ethnic backgrounds and those from more deprived areas.
  • Interviewees from minority ethnic backgrounds (and disabled participants) expressed particular concerns about the potential for data to be misused, with negative impacts for their communities; solutions aimed at reassuring participants about data uses may be particularly important to encouraging their participation.
  • Challenging stereotypes of who takes part in surveys – and doing more to reassure respondents that they can take part – may be particularly important to both younger people and those with lower levels of education.

Enhancing accessibility

While disabled people are not necessarily systematically underrepresented in surveys (possibly because age and disability are strongly correlated, and older people are more likely to participate), increasing the accessibility of surveys is an ethical imperative. Interviews with disabled people for this research highlighted that accessibility concerns can arise across the survey process, from the advance letter to the experience of taking part in the interview itself.

Some of the barriers identified in this report may require substantive changes, such as redesigning the advance letter. However, others are less about changing materials or approaches and more about ensuring that existing accessibility options (such as willingness to offer flexible appointments) are communicated more clearly to potential respondents. Implementing these solutions would be low cost. Other potential solutions to concerns raised by disabled people may require more expensive changes, including offering different modes of participation.

Solutions that address multiple barriers

In addition to considering which potential solutions may have most impact for key groups of respondents, it will also be important to consider which will best address the range of different barriers identified in this report. Considering both these questions could help the Scottish Government determine how solutions can be ‘packaged’ together for maximum impact. For example,

  • Efforts to raise awareness of government surveys in general could focus on illustrating their value to different groups in society (including underrepresented or potentially excluded groups like young people, disabled people, and those from minority ethnic backgrounds). They could also challenge norms about who takes part in surveys, by having people with different characteristics talk about why they took part in surveys and what they got out of it. A campaign around these themes could potentially impact on underrepresented groups and on a range of barriers, including: lack of awareness; social norms around survey participation; ‘survey overload’ (by helping people to assess how government surveys might be different); and scepticism about the impact of surveys.
  • Some of the more ‘radical’ solutions, while more challenging to develop and implement, also have scope to impact across multiple barriers. For example, offering different modes for completing interviews, including self-complete, could help with concerns about time (by allowing people greater flexibility around when they take part), as well as ‘social aversion’ or other respondent concerns about having an interviewer in their home (including concerns among some disabled people about health risks).
  • At the same time, some of the (relatively) simpler actions could also respond to multiple barriers. For example, any changes to the advance materials and interviewer training should consider accessibility, communicate the value of the surveys more effectively, distinguish the survey from ‘spam’ or commercial surveys, and enhance reassurances around data protection and misuse.

Assessing the effectiveness of solutions

Once decisions have been made about which solutions to implement or test, it is important to assess whether they have made any difference in practice. This will, of course, require examining response rate data. However, it will also be important to consider any evidence that the changes have succeeded in addressing the specific barriers they were intended to resolve, or whether these barriers persist.

Another important consideration will be their impact on response among underrepresented and potentially excluded groups, including the groups identified in this report. This links to the overarching point made at the start of this report – that response rates matter insofar as they drive nonresponse bias. At the heart of any assessment of the impact of solutions to nonresponse should be their impact not only on overall response rates, but whether and how they lead to a more representative sample.
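To make this concrete, the distinction between a headline response rate and representativeness can be sketched in a few lines of code. The figures below are purely illustrative (hypothetical age-group shares, not data from any Scottish Government survey): a survey can achieve a respectable overall response rate while still underrepresenting particular groups, which is why subgroup gaps against benchmark figures merit reporting alongside the response rate.

```python
def response_rate(completed, eligible):
    """Headline response rate: completed interviews / eligible sample."""
    return completed / eligible

def representativeness_gaps(sample_shares, benchmark_shares):
    """Percentage-point gap between achieved sample and population
    benchmark for each demographic group (negative = underrepresented)."""
    return {g: sample_shares[g] - benchmark_shares[g] for g in benchmark_shares}

# Hypothetical age-group shares for illustration only
benchmark = {"16-34": 0.30, "35-64": 0.50, "65+": 0.20}
sample = {"16-34": 0.22, "35-64": 0.52, "65+": 0.26}

print(f"Response rate: {response_rate(4200, 10000):.0%}")
for group, gap in representativeness_gaps(sample, benchmark).items():
    print(f"{group}: {gap:+.2f} gap vs benchmark")
```

Here a 42% response rate coexists with younger people (16–34) falling 8 percentage points below their benchmark share, so a change that nudged overall response while widening that gap could worsen nonresponse bias.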

Contact

Email: surveystrategy@gov.scot
