Understanding survey nonresponse behaviours: evidence and practical solutions
This report summarises key findings from research to extend understanding of the challenges posed by nonresponse and nonresponse bias in the Scottish Government's general population surveys, and identifies potential solutions.
1. Background and approach
Aims of this research
This report sets out findings from research conducted by Ipsos, with advice from Professor Peter Lynn, exploring the challenge of nonresponse on the Scottish Government’s three main general population, face-to-face, cross-sectional surveys: the Scottish Household Survey (SHS), the Scottish Health Survey (SHeS) and the Scottish Crime and Justice Survey (SCJS).[1] This research was commissioned by the Scottish Government, with funding support from the Office for National Statistics (ONS).
In the context of this study, ‘nonresponse’ was focused on whether or not people participate in a survey at all (‘unit nonresponse’), rather than whether they decline to answer particular questions (‘item nonresponse’) or drop-out early (‘partial response’).
The research had the twin aims of:
- producing robust research to extend understanding of the challenge posed by nonresponse on the three surveys, and
- producing actionable and testable potential solutions to these challenges.
The challenge of nonresponse
The challenge posed by nonresponse is neither new, nor Scotland-specific. Response rates to face-to-face surveys have been falling by some accounts for well over half a century, as evidenced by fieldwork outcomes to the US National Election Studies since 1952 (Steeh, 1981). A recent study of trends in survey outcome rates across four major cross-national surveys in Europe - the European Quality of Life Survey, the European Social Survey, the European Values Study, and the International Social Survey Programme - also found a significant and consistent decline in survey outcome rates over the first two decades of the 21st century (Jabkowski and Cichocki, 2024). It suggested that trends were unaffected by country-specific characteristics or survey methodologies.
Nonresponse and the Scottish Government general population surveys
The last iteration of the Scottish Government’s Long-term Survey Strategy (2018), which sets the overall vision and plans for its population surveys, described the challenge of falling response rates for the Scottish Government’s surveys:
“The people in Scotland who participate in population surveys do so voluntarily – sacrificing a sometimes significant amount of their time […]. Unfortunately, our face-to-face collections have been experiencing a gradual, downward trend in response rates. A central risk is that this will result in an increasing impact of nonresponse bias in the results we report.”
All three of the major Scottish Government cross-sectional general population surveys have primarily been based on face-to-face approaches since their inception. Face-to-face approaches have tended to achieve the highest response rates in comparison with other modes. However, analysis of response rates for this research shows that between 2012 and 2019, there was a steady decline in the headline response rate across all three surveys, although the gradient of that decline varied between surveys:
- SHS: From 67.2% in 2012 to 62.4% in 2019, a drop of 4.8pp.
- SCJS: From 67.7% in 2012 to 58.7% in 2019, a drop of 9.0pp.
- SHeS: From 65.8% in 2012 to 55.6% in 2019, a drop of 10.2pp.
Comparisons of response rates before and after the Covid-19 pandemic are complicated by changes in survey design post-pandemic. However, response rates in 2022 and 2023 (the latest years for which data are available) were lower than pre-pandemic, even when changes in design are taken into account in analysis.
- SHS: 43.9% in 2022 and 46.2% in 2023.
- SHeS: 37.3% in 2022 and 41.5% in 2023.
- SCJS: 47.3% in 2022 and 46.0% in 2023.
The decline in the response rate to all three surveys over time has almost entirely been driven by an increase in the proportion of people who refuse to take part. The rate of non-contact (where it is not possible to make contact with an eligible respondent at an address issued to a survey interviewer) and other nonresponse (including, for example, being unable to take part due to ill health) has been broadly stable over the last 20 years. Appendix B of this report includes more detailed analysis of patterns of nonresponse over time on the three surveys.
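The decomposition described above can be sketched as a simple calculation. The figures below are hypothetical, purely to illustrate how a headline response rate breaks down into its components; they are not taken from the Scottish Government surveys.

```python
# Illustrative sketch (hypothetical figures): decomposing a headline
# response rate into refusal, non-contact, and other nonresponse.

def response_rate(completes, refusals, non_contacts, other_nonresponse):
    """Headline response rate: completed interviews as a share of
    all eligible addresses."""
    eligible = completes + refusals + non_contacts + other_nonresponse
    return completes / eligible

# Hypothetical disposition counts for 1,000 eligible addresses
completes, refusals, non_contacts, other = 590, 300, 70, 40

rr = response_rate(completes, refusals, non_contacts, other)
refusal_rate = refusals / (completes + refusals + non_contacts + other)

print(f"Response rate: {rr:.1%}")      # 59.0%
print(f"Refusal rate:  {refusal_rate:.1%}")  # 30.0%
```

On this breakdown, a rise in refusals with stable non-contact and other nonresponse (the pattern seen on the three surveys) directly lowers the headline rate.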
Response rates and nonresponse bias
Response rates have typically been used as a key indicator of survey quality and a measure of potential bias in the data. However, as discussed in earlier research for the Scottish Government (Ormston et al, 2024), overall, research indicates a weak relationship between response rates and nonresponse bias (see for example Groves and Peytcheva, 2008; Hendra and Hill, 2019; Hutcheson et al., 2020; Micklewright et al., 2012; Williams and Holcekova, 2015). A recent statement from the UK’s leading survey experts, under the Survey Futures collaboration, argues that:
‘Surveys with low response rates can still produce high-quality data, and the reverse can also be true.’ (Maslovskaya et al, 2025)
High nonresponse bias occurs when those who do and do not respond to a survey differ in ways that affect the outcomes the survey is trying to measure – in other words, if response rates vary substantially between different groups, the survey is likely to reflect its target population less well. While a higher response rate can reduce the risk of such variability, it does not necessarily do so; nor does a low response rate necessarily increase it.
Particular demographic groups have traditionally been under-represented in face-to-face surveys. Analysis of response rates on the three Scottish Government surveys highlights lower response rates among: people living in deprived areas of Scotland; those in large urban areas; younger people; people from minority ethnic backgrounds; those with lower levels of education; and private renters. There is evidence that some (though not all) of these groups have become even less well represented in Scottish Government surveys over time. For example:
- The age gap in terms of response has worsened over time, with an increase in underrepresentation of the 25-34 group and overrepresentation of the 75+ group.
- Broadly, the surveys became more representative in terms of the educational attainment of respondents between 2012 and 2019. However, the gap widened between 2019 and 2022.
- In terms of tenure, ‘other renters’, which includes private renters, became increasingly underrepresented between 2017 and 2022.
Patterns by ethnicity, deprivation, and rurality were more complex, however. For example, those from minority ethnic backgrounds became more underrepresented from 2012 to 2019, but the gap in response between White and minority ethnic respondents narrowed between 2019 and 2022.
In the context of this research, it was important to consider not just what might be driving overall falling response rates, but what might be driving nonresponse among those groups least well represented in surveys. Boosting response among these specific groups would likely have the greatest impact in reducing nonresponse bias.
Applying a behavioural lens to survey nonresponse
Considerable efforts have been expended in recent decades by survey methodologists across the world on measuring, understanding, and attempting to address patterns of nonresponse in face-to-face surveys. However, the interaction of multiple potential factors in driving nonresponse, and the difficulty of isolating or assessing the influence of these factors, led the survey methodologist Michael Brick to state that:
“Even after decades of research on nonresponse we remain woefully ignorant of the causes of nonresponse at a profound level.” (Brick, 2013).
While recognising these limitations, the present research seeks to improve our understanding of nonresponse by examining its causes and potential solutions through a behavioural science lens. It has a particular focus on the three main general population Scottish Government surveys, though it is expected that the findings will also be useful for other surveys. Behavioural science applies theories and techniques from a range of academic disciplines to better understand why people display particular behaviours and what interventions might be effective in changing them. This research used the ‘COM-B’ model of behaviour change (Michie et al, 2011) to structure analysis of barriers to taking part in surveys, and to identify potential actions that might mitigate these.
COM-B identifies three key components underpinning behaviour:
- Capability: having the knowledge, skills and ability to engage in a behaviour. In this context, this relates to the ability of people to understand and complete the survey.
- Opportunity: the physical and social external factors that make the behaviour possible. In relation to surveys, this might include the setting, time available, or social dynamics.
- Motivation: the internal processes influencing decisions. This might include interest in the survey topic, ease of completion (perceived and actual), feelings of reciprocity, etc.
Summary of methods
The research was conducted between January and June 2025 and included five main components:
- A desk-based literature review, focused primarily on evidence on individual-level factors that might influence nonresponse, including those which might lend themselves to behavioural interventions.
- Analysis of response rates to the three main general population Scottish Government surveys, covering changing patterns of nonresponse over time, patterns of response by geographic factors, and comparisons of unweighted sample profiles with census data, where possible, in order to assess nonresponse bias.
- Qualitative research with professional survey stakeholders, comprising three online focus groups with experienced survey interviewers working on each of the three Scottish Government general population surveys and one group with commissioners and contractors.
- Qualitative research with the general public. The researchers spoke to 43 people in total, across a combination of online and face-to-face focus groups, paired and individual interviews. There was a particular focus on those less likely to take part in surveys and/or those who may face particular barriers to participation. On this basis, those interviewed included a mix of age groups and people from urban and rural areas of Scotland, but also included proportionately more young people, disabled people, and people from South Asian and Black backgrounds.[2] Participants were also asked to speak to friends and family about what they thought about taking part in surveys and what the barriers would be, so that they could share wider perspectives on this within the group discussions.
- A key stakeholder workshop was held to share preliminary findings and potential actions with key stakeholders from the Scottish Government surveys and commissioners of other related UK surveys. The purpose was to gather feedback on the plausibility and feasibility of the preliminary findings and solutions ahead of finalising this report.
Further detail on the methods used is included in the Appendices to this report.
Scope and limitations
This research had a specific remit to explore behavioural factors that might impact on nonresponse. Behavioural models recognise the importance of context, including the ways in which the social and economic context can impact individuals’ opportunities or motivations to engage in a particular behaviour. However, in focusing on what drives individual respondents to take part (or not) in surveys, it must be acknowledged that this research has less to say on wider contextual factors, such as recruitment and retention challenges in the survey industry. These workforce challenges have been particularly acutely felt by some survey organisations since the pandemic, when large numbers of survey interviewers left the fieldforce as face-to-face surveys were put on hold. Workforce issues can impact on response: where there are shortages of survey interviewers, sample may not be worked, or may not be worked fully (that is, survey interviewers may not make the required number of calls at each address). Workforce issues can thus have a significant impact on nonresponse, but they were largely outwith the scope of this project.[3]
A key element of this research was the qualitative research with the public. As with any qualitative research, the aim was to capture a range of perspectives from groups of interest, rather than to obtain a statistically representative sample. However, it is important to acknowledge that the relatively small number of participants within each subgroup means that additional viewpoints might have emerged had more people been interviewed. This particularly applies to disabled people, given the diversity of barriers they may experience depending on both their specific disability and wider circumstances. It also applies to those with intersecting characteristics (for example, disabled people from minority ethnic backgrounds).
While the research was able to recruit a diverse sample in terms of disability, ethnicity, urbanity/rurality, and age, it was more challenging to recruit people with lower levels of educational qualifications. In the end, we were only able to recruit six participants with no or lower qualifications (up to Standard Grade or equivalent). As education is known to be a factor in survey participation – and there is some evidence that the gap in response rates between those with higher and lower educational qualifications has increased on the Scottish surveys (see Appendix B) – this must be acknowledged as a limitation.
A key limitation of this study is the inherent challenge of engaging genuine survey nonresponders in qualitative research. While qualitative research typically differs from surveys in important ways (e.g. recruitment methods, nature of the interview and incentives offered), it is likely that individuals who are reluctant to complete surveys are also less inclined to take part in any form of research, including qualitative interviews. Although recruitment methods were designed to target those less likely to respond to surveys, and potential participants who indicated they would be “very likely” to take part in a government survey were screened out at the recruitment stage[4], it remains probable that the final sample was skewed towards individuals who are more positively disposed towards research in general, rather than those who are most resistant to participation. Interviews were also reliant on people’s own accounts of how they would respond to being asked to take part in a survey. While the use of a behavioural framework helped ensure we probed on a wide range of potential drivers, it must be acknowledged that people are not always able to accurately identify or articulate the main elements that shape their own behaviour. This makes weighing the relative importance of different factors that may impact on nonresponse particularly challenging.
Report structure and conventions
The main body of this report is intended to provide a relatively succinct account of the key findings from this research. More detailed findings from the first four individual elements of the research – the literature review, analysis of nonresponse data on Scottish Government surveys, stakeholder qualitative research, and general public qualitative research – are included in Appendices A to D.
The remainder of the main body of this report is structured around the COM-B model, with a chapter on each element (Capability, Opportunity, and Motivation). Each chapter sets out potential barriers to taking part in surveys relating to that element of COM-B. They draw on the research conducted for this study to summarise the evidence for each barrier, before discussing potential solutions.
The concluding chapter reflects on the overall findings, including which potential solutions may be more or less straightforward to implement and what their impact might be.
Contact
Email: surveystrategy@gov.scot