Understanding survey nonresponse behaviours: evidence and practical solutions
This report summarises key findings from research to extend understanding of the challenges posed by nonresponse and nonresponse bias in the Scottish Government's general population surveys, and identifies potential solutions.
4. Barriers and potential solutions: motivation
The ‘motivation’ element of the COM-B model relates to the internal processes influencing our decision-making, including ‘reflective’ and more ‘automatic’ processes. In the context of deciding to take part in a survey, this might include interest in the topic, beliefs about ease of completion, and feelings of reciprocity, as well as more automatic responses on the doorstep that may make people inclined to speak to an interviewer or not.
Much of the research literature seeking to understand why people do or do not take part in surveys has focused on barriers relating to motivation. This chapter discusses barriers relating to seven overarching themes: avoiding ‘easy’ refusals, ‘social aversion’ and shifting expectations, trust in government, scepticism about whether participation makes a positive difference, concerns about data misuse, “survey overload”, and expectation of reward.
Avoiding ‘easy’ refusals
Office refusals
Summary
Survey interviewers suggested that explicitly mentioning that people can ‘opt out’ by ringing the SHS helpline had increased refusals by making it easier to refuse before the interviewer can explain the survey. Analysis of patterns of office refusals over time supports this view.
Both feedback from survey interviewers and analysis of patterns over time of office refusals to the SHS and SHeS indicate that small changes in wording to the advance letters can serve to increase people’s likelihood of refusing prior to survey interviewers having an opportunity to explain the survey to them.
Both the SHS and SHeS saw a rise in refusals direct to the office from 2012 to 2019 (from 1.9% to 4.3% and from 3.7% to 7.2%), with SHeS experiencing a particularly marked rise between 2017 and 2018. This coincided with a small but significant change in the wording of the SHeS advance letter, with the phrase “or don’t want to take part” added before the freephone number. This wording was subsequently removed in 2022, and office refusals fell back from 7.2% to 2.4% between 2019 and 2023.
Meanwhile, on the SHS, which in 2022 amended the heading of the ‘contact’ section from ‘How do I contact you?’ to ‘How do I arrange an interview, opt out, or find out more information?’, office refusals increased from 4.3% to 7.0% between 2019 and 2023. SHS interviewers were aware of this change to the advance letter and believed it had increased office refusals, including by enabling family members to ‘opt out’ on behalf of intended respondents (particularly older parents), without giving those respondents an opportunity to decide for themselves whether or not to take part.
Potential solutions
The obvious ‘solution’ to this issue would be to remove wording highlighting how to ‘opt out’ from the advance letter. However, this needs to be weighed against the importance of conveying the optional nature of survey participation. Previous research exploring why people decide to take part in qualitative and quantitative government social research found that emphasising the voluntary nature of a study could be a positive facilitator of participation (Graham et al, 2007). This was echoed in comments from participants in this study:
"I think the opting out bit is reasonably attractive because it's, it's not saying ‘thou shalt be hounded to do this’. Yeah, there is a get out for you, even if you decide not to." (Public Group 4, aged 25+, rural areas)
Nonetheless, it might be more appropriate to emphasise that taking part is voluntary separately from highlighting the freephone number for seeking further information about the study or arranging appointments. Advance communications could also emphasise that interviewers can provide a better explanation than a letter of what is involved in taking part, to help potential respondents decide whether to participate. At the same time, the impact of removing this wording on overall refusals is likely to be fairly low. Reducing office refusals would provide survey interviewers with an opportunity to explain more about taking part, which may persuade some to participate. However, those who opt out in advance are arguably more likely than average to refuse on the doorstep too.
Interviewer skills and confidence
Summary
The survey methodology literature demonstrates the importance of interviewer confidence and experience in predicting whether they are successful in persuading people to take part. This was also emphasised by survey interviewers who took part in this study. However, interviewer turnover, particularly since the Covid-19 pandemic, creates challenges around how to quickly equip new interviewers with the right skills and confidence.
While advance letters can help ‘prime’ people to expect a visit from a survey interviewer, and potentially to receive this visit more openly than they otherwise would, the interaction with the interviewer on the doorstep is often crucial in determining whether a potential respondent goes on to take part in the Scottish Government surveys. As an SHS interviewer put it:
“I don’t think anyone does an interview for the Scottish Government, they do it for you. They’re talking to you."
Survey interviewers need to be able to respond to both ‘automatic’ and ‘reflective’ motivations that impact whether or not an individual will engage with them. Heuristic psychological theories of survey participation (e.g. see Groves and Couper 1998) posit that ‘automatic’ motivations are triggered when a potential respondent quickly attempts to interpret an interviewer’s intent and activates a corresponding ‘script’ in response. Because survey requests are rare, many people lack a dedicated script and instead default to a more cautious or dismissive "generalised stranger" script. This can lead to generic refusals like "I'm too busy" or "not interested". Survey interviewers need to be able to quickly recognise and dispel these automatic reactions, as well as to engage with more considered concerns about taking part once they have a potential respondent’s attention.
The survey methodology literature highlights the importance of both interviewer confidence and experience in predicting how successful survey interviewers are in persuading respondents to take part. For example, Jackle et al (n.d.) and Blom et al (2010) both found that survey interviewers who believed people could be convinced to take part generally achieved higher cooperation rates. Survey interviewer experience has also been shown to be associated with higher response rates, even after controlling for other factors, like area characteristics and survey type. Survey interviewers interviewed for this study felt that it can take considerable time to learn the skills required to be a good interviewer, and to “know which piece of the armoury to use to … persuade them (potential respondents).”
The Covid-19 pandemic was associated with increased challenges around interviewer recruitment and retention, leading to the loss of experienced survey interviewers in some parts of the research industry (Charman et al, 2024). This created challenges around how to quickly equip new survey interviewers with the skills and confidence required to secure participation.
Potential solutions
The survey methodology literature identifies various elements that can help maximise survey interviewers’ likelihood of success in persuading people to take part, and which should be a key focus in training, particularly for new survey interviewers. These include:
- How to engage potential respondents quickly – within the first few seconds – through voice, tone and style.
- How to tailor approaches depending on the area, household and individual. Generic introductions and rigid scripts should be avoided, as they increase the chance of a ‘script’ being activated.
- How to identify whether a behavioural script has been triggered, and if so which one and how to dispel it. For instance, if the interviewer believes that a “salesperson” script has been activated, they must quickly dispel this (e.g. “I’m not trying to sell you anything”).
- How to maintain doorstep interactions, as longer interactions are associated with increased chances of eventual participation.
- Increasing interviewer confidence levels, given the evidence that higher confidence increases response.
- Refusal conversion techniques, including reassurances about privacy, time commitments, scheduling (including potentially reframing the first contact as a request for an appointment), and the impact of the research (which can be a crucial ‘reflective’ motivation, as discussed below).
‘Social aversion’ and shifting expectations
Summary
Stakeholders felt that people had become more ‘socially averse’ since the Covid-19 pandemic, impacting on their willingness to allow survey interviewers into their homes. This was supported by interviews with the general public, who highlighted both increased discomfort about having strangers in their home as a result of the pandemic and confusion about why surveys needed to involve face-to-face visits at all, when it is possible to provide most information online.
There has been much debate about the long-term impact of the Covid-19 pandemic on people’s willingness to engage with strangers. Increased ‘social aversion’ was suggested by stakeholders interviewed for this study as a potential factor in increased nonresponse. Similar views have been expressed in other recent research with survey interviewers (see Charman et al, 2024; Mesplie-Cowan et al, 2024). Interviews with the general public for this study provided some support for this theory. Participants commented that they had become less willing than they once were to let people they do not know into their home. For some, this was fundamentally linked with the experience of the Covid-19 pandemic, either because they had become unused to receiving strangers or because of ongoing health concerns about having people in their home.
“Covid has very much changed how I feel about my home. … I have got used to no one else coming into my flat. Really used to it. And quite anxious about anybody actually being inside my home.” (Public Group 3, aged 25+, urban areas)
Others linked it to wider safety and privacy concerns about letting strangers into their personal space.
A related recurrent theme was that people did not understand why the surveys needed to involve in-home interviews. In a context where most things can now be done online, face-to-face in-home interviews were seen as unusual or even “creepy”, as well as environmentally unsustainable:
“Why are they coming to my door? This feels like a bit of an antiquated system in the sense that that's a lot of petrol for somebody to be driving to somebody's house…I feel like there's much, much quicker, easier ways of communicating with people these days” (Public Interview 5, disabled person)
Participants (particularly younger participants) also commented that they did not know how they were meant to behave when ‘hosting’ an interviewer in their home:
“It could be awkward, I wouldn’t know how to react - do I have to offer them tea?” (Public Group 2, aged 16-24, rural areas)
These comments underline that the barrier may not simply be ‘social aversion’, but increased unfamiliarity with having strangers in one’s home, and uncertainty about how to navigate that situation.
Potential solutions
These findings have clear implications for survey messaging. There may need to be greater emphasis in advance materials and interviewer-respondent interactions on explaining why some surveys are still conducted face-to-face. This could include highlighting the value of face-to-face interviews to both respondents (for example, in being able to ask questions about the process and receive support from the interviewer to help them complete it, if needed) and government (for example, higher quality data). It might also include emphasising that survey interviewers are there to do a job, to avert any social anxiety people might have about ‘hosting’ them.
Clearer messaging around the reasons for face-to-face data collection may help allay the concerns of some participants. However, if people are becoming less willing to host strangers in their homes, there may be limits to this as a long-term solution. Offering to complete the interview in another location – such as on the doorstep, in a garden, coffee shop, or community space – might go part way to allaying concerns about ‘hosting’ interviewers in a respondent’s private space. However, it would need to be carefully considered given the potential sensitivity of some questions in each of the Scottish Government surveys, which arguably require a private setting for completion.
A more radical solution would be to offer greater flexibility on how people complete the surveys. Participants’ spontaneous suggestions for making the surveys more appealing included offering different modes of completion, including a self-complete paper or online survey (although some participants noted that they struggle to fill in online forms and surveys), or completing it via video-call.
Overall, there was a preference for video interviews across the general public interviews conducted for this study. Video interviews were felt to retain some of the benefits of a face-to-face encounter, but without the need to ‘host’ someone in your “physical space”. However, this preference may have been shaped by the fact that, in most cases, people were participating in interviews for this study by video-call. Both telephone and self-completion modes (either online or on paper) would also address concerns about allowing interviewers into your home. Online interviews, in particular, would meet participants’ expectation that surveys, as with many other daily tasks, should be accessible online.
It is also worth noting that, to date, take-up of video interview options on social surveys has been fairly low, although there is ongoing work exploring whether video interviewing might become an established survey mode in the future. This focuses on respondent familiarity and communications, fieldwork logistics, interviewer training, technical barriers, and integration with existing survey infrastructures (Conrad et al, 2023).
As discussed previously, any change in mode needs to be considered carefully, taking into account not only the potential impacts on response but also the implications for data quality and completeness (see Ormston et al, 2024, for detailed discussion of the issues involved in changing modes). This is not necessarily a reason to dismiss such radical changes – indeed, it may be that in the future, the case for allowing additional flexibility over mode of completion continues to increase in line with people’s changing expectations around the nature of social interactions. However, ideally, any such changes should be introduced only after careful testing to understand both the impact on response (for example, there is a risk that it is easier to cancel video interviews than face-to-face appointments, so response could decrease if this is offered as an alternative) and on other elements of survey quality (like data completeness and accuracy).
Trust in government
Summary
Stakeholders and survey interviewers felt that lower trust in government was contributing to higher refusals to surveys. Survey evidence suggests that trust in politicians and parties, while never particularly high, has indeed fallen further in recent years. However, it was less clear from the general public qualitative research that general lack of trust in government was a key barrier. Rather, lack of faith that government would act on survey findings was a central concern.
Trust is generally viewed as a critical factor in survey participation, as it influences whether individuals perceive the survey process as legitimate and worthwhile. Past research has suggested that in societies where governmental or research institutions are highly trusted, individuals are more likely to view surveys as credible and aligned with public interest, thus increasing the likelihood of participation (Hofferth, 2001).
Stakeholders and survey interviewers interviewed for this study hypothesised that falling trust in, or negative perceptions of, government – either in general, or in the current party of government – might be a factor in rising nonresponse to Scottish Government surveys. Survey interviewers observed that the political landscape has become more polarised over the past ten years, citing events such as the 2014 Independence referendum, the 2016 Brexit referendum and the enforcement of lockdown rules between 2020 and 2021 during the coronavirus pandemic as factors that may have affected potential respondents’ attitudes towards taking part in government research in general. In this context, it is worth noting that the most marked change in response rates to the three main general population Scottish Government surveys over the seven-year period from 2012 to 2019 occurred around 2013 to 2016, coinciding with the Scottish Independence and Brexit referendums (2014 and 2016 – see Appendix B). Survey interviewers reported that more people were expressing anti-establishment views on the doorstep as a reason for not taking part:
“Quite a lot of people are very vocal about not wanting to help the government. I don't know if that's just the areas I work in, but I'll get a lot of people say, ‘I know, I've read the letter, I'm not interested in helping this government’.” (SCJS interviewers group)
This extended beyond views of the Scottish Government to negative views of the NHS (for SHeS) and the criminal justice system (for SCJS).
Trends in public trust based on survey data are vulnerable to the very phenomenon they are trying to track – in other words, if the people becoming less likely to take part in surveys are also those becoming less trusting, surveys themselves may be less able to pick up this decline. Such evidence as does exist on a general decline in trust is also somewhat mixed. For example, public confidence in official statistics actually increased slightly between 2014 and 2023, which would appear to go against a narrative of falling trust (National Centre for Social Research, 2024a).
However, there is evidence to support the view that wider trust in government has dropped in recent years. For example, the 2024 British Social Attitudes report found that a record high proportion of the British public ‘almost never’ trusted governments of any party to place the needs of the nation above the interests of their political party (National Centre for Social Research, 2024b). Similarly, the Ipsos Veracity index has found that, while there is no evidence that trust is declining across the board, politicians remain stranded at the bottom of the ‘trust’ tables, having reached a 40-year low in 2023. As such, declining trust in government could be a factor contributing to post-pandemic response rate challenges, particularly on government surveys.
Members of the general public who took part in this research expressed mixed views on the Scottish Government. However, trust in government per se was not necessarily seen as a barrier to taking part in surveys:
“I'd still give them an opinion and tell them I don't trust them” (Interview 8, Lower qualifications)
Rather, a lack of faith that government would act on the findings was cited as a key reason why they might not take part.
Potential solutions
One possible response to declining trust in government as a potential factor in nonresponse would be to place greater emphasis in communications – including the advance letter – on the survey being part of Official Statistics and/or the research arm of the Scottish Government, to try and ‘de-politicise’ it. As discussed in chapter 2, this could be informed by experiments to test which combinations of logos, if any, are associated with higher response rates among different groups.
At the same time, as noted, addressing trust in government in general may be less critical than engaging specifically with a lack of trust in how government uses survey data. This is discussed in the next section.
Scepticism about whether taking part makes a difference
Summary
Both survey interviewers and the general public highlighted the importance of being able to explain how government surveys will make a difference as a key motivator for taking part. However, the general public expressed considerable scepticism about whether they did, in practice, make any difference.
Interviewers for the SHS said that the most common question they fielded on the doorstep was “will it make any difference?” This was echoed in the general public interviews, with participants highlighting a desire to feel they are contributing to improving things – either for themselves, for society in general, for their own area, or for their community – as a key motivator for taking part.
“I think one thing that could improve people participating would be to start with explaining why the survey is being done and how the results will benefit the people that are doing the survey." (Public Group B, participants from Black backgrounds)
However, there was considerable scepticism about whether government surveys do, in practice, have any positive impact. Those who had previously participated in public sector surveys (such as those conducted by their local council), or in survey-adjacent activities (such as consultations), reported that they had not seen any change as a result. There was cynicism about whether surveys were used to make it appear that government was listening when decisions had in fact already been made:
“I've started to fill them in and thought they already have made their mind up what they're going to do. So I just don't bother. I just feel it's a waste of my time.” (Public Group 3, aged 25+ urban areas)
Potential solutions
Providing a wider range of examples and evidence of the uses of survey data in survey and interviewer materials would better equip interviewers to address what appears to be a key concern for potential respondents: whether taking part will make a difference. Survey interviewers commented that it was not always easy to show what difference the Scottish Government surveys make, and wanted more examples of uses that they could draw on and tailor to the characteristics and interests of individual respondents on the doorstep. Considering how to ‘sell’ the value of representing different groups in government surveys could form part of this.
However, participants also highlighted challenges around developing effective examples of impact. For example, while some members of the public were very positive about the examples of uses of SHS data shown in the survey leaflet, others commented that the examples were too ‘central belt centric’:
"I just thought, what does that have to do with anyone who lives in the country? Absolutely nothing" (Public Group 4, aged 25+, rural areas)
There were also different views on what impacts were ‘positive’, with some expressing disapproval of specific policies mentioned on the SHS leaflet, such as free bus travel for young people.
Survey interviewers and survey leads noted that placing too much emphasis on the scope for the surveys to directly impact council services (for the SHS) or the NHS (for SHeS) risked misunderstandings about the nature and content of the surveys. This could frustrate respondents if they came to expect the survey to provide an immediate and direct feedback route on their experiences of services, rather than broader evidence to shape longer-term policy decisions.
These challenges do not negate the need to provide clearer and more varied examples of the uses of survey data, however. They simply highlight the need to approach this from the perspective of (different) potential respondents.
In addition to giving examples upfront of how data has been used, the general public groups suggested providing more feedback to respondents after they take part on what the results and impacts have been. This could involve, for example, sharing a short summary report with those who provide contact details for this purpose, once that year’s findings have been made public. Committing to a ‘feedback loop’ that demonstrates to participants how surveys are used may help boost the sense that taking part is a worthwhile way to spend their time.
It may also be worth framing the usefulness of surveys in terms of holding government to account, by collecting information that anyone (including other parties, journalists, think tanks, charities and even the public) can use to see what is happening in Scotland, rather than focusing on the impact on specific government policies (with which potential respondents may or may not agree).
Concerns about data misuse
Summary
The literature indicates that assuring respondents their data will be handled confidentially significantly increases willingness to take part in surveys. However, qualitative interviews with stakeholders, survey interviewers and the general public all indicate that providing this reassurance has become harder in the light of greater awareness and concerns about scams, fraud and data misuse.
The findings above suggest that respondents want to know more about the positive uses and impacts of their data. However, participants also need reassurance that their data will not be used in an irresponsible or negative manner. Assuring respondents that their data will be handled with the utmost confidentiality, and explaining how privacy is safeguarded, has been shown to significantly enhance willingness to participate (Couper et al, 2008; Singer and Couper, 2017). Interviews with stakeholders, survey interviewers and the general public for the present study all indicate, however, that providing this reassurance is becoming increasingly challenging.
There was a general perception across interviews that people are more aware of examples of data scams, fraud and data misuse in general. Interviewers across all three surveys noted that some people would not take part because of fears of scams. Although they carry ID cards, survey interviewers found these were not always sufficient to reassure a potential respondent.
Data from the SCJS highlights that a significant proportion of the population (one in ten) had experienced at least one type of cyber fraud or computer misuse in 2023/24 (Scottish Government, 2025), while the number of police recorded cases of fraud more than doubled between 2014/15 and 2023/24 (Scottish Government, 2025). In this context, it is unsurprising that survey interviewers are finding people are more concerned about fraud than they once were.
Moreover, the general public interviews highlight that even when potential respondents are convinced that a survey is a genuine government survey and not a scam, they may still have concerns about how the government will use their data. Disabled participants and those from Black backgrounds in particular expressed concerns about the potential for their data to be misused, linking this with examples they had read or heard about of misuses of data (not limited to survey data) collected from their communities:
“History hasn't been the best friend of collecting information of minorities in any capacity… Yes, it can be used positively, but the reality is that that information can be weaponised.” (Public Group A, participants from Black backgrounds)
“During the start of Covid we found out that autistic people, for example, were being added to DNR lists without our knowledge or consent... And obviously I recognise that surveys are anonymised and that sort of thing. I get that. But disabled people are very used to having our disabilities weaponised against us.” (Public Interview 3, neurodivergent person)
Potential solutions
Ensuring that survey interviewers are equipped with quick, tangible reassurances to concerns around data protection and misuse of data is key to addressing this barrier. The literature suggests that one effective strategy here is to actively demonstrate the security measures in place for data protection. This could include describing how data is stored securely, who has access to it, and how long it will be retained. Offering respondents the option to review and consent to specific data handling practices can also foster a sense of control and security (Novak, 2014).
“Survey overload”
Summary
Both the literature and the qualitative research indicate that capturing the public’s attention and convincing them that government surveys are worth doing has become more difficult because people receive far more survey and feedback requests than in the past.
The notion of “survey overload” was raised across the literature and the different interviewee groups included in this research. Survey interviewers reported increasing challenges in distinguishing Scottish Government surveys from what they referred to as “coffee cup surveys” (the feedback requests people receive after many everyday commercial transactions). The general public clearly recognised this too, commenting on the cognitive challenge of working out which, if any, of the many survey requests they now receive are actually worth their time and effort:
"you reach a kind of saturation point or kind of survey fatigue where … you can't distinguish between what's important and what's not important. And just because someone tells you it's important doesn't mean it is. So if you’re constantly overwhelmed with surveys, you lose the ability, I think, to discriminate." (Public Group 3, aged 25+, urban areas)
Potential solutions
Addressing ‘survey overload’ is difficult, since the Scottish Government has no control over the volume or nature of other surveys or survey-adjacent requests that the Scottish public receive. One response would be to develop and test ways of more clearly distinguishing Scottish Government surveys from other surveys. For example, changing the description in participant communications from ‘survey’ to ‘an important government study’ or similar could help draw a clearer distinction between Scottish Government and commercial surveys. However, this proposal received mixed feedback in the general public interviews, with some suggesting the alternative wording was too vague.
Expectation of reward
Summary
There were mixed views among survey interviewers and the general public on whether people expect to be financially rewarded for taking part in surveys. However, there is strong evidence from the literature that incentives can boost response among underrepresented groups.
Another consequence of the proliferation of surveys and feedback requests in contemporary society is that people are more aware that many of these requests come with an offer of immediate financial reward, such as a voucher or a free coffee. While SHeS offers £10 High Street Vouchers to participants, neither the SHS nor the SCJS currently uses incentives.
General public participants in this research expressed mixed views on whether they would expect to receive an incentive and how critical incentives were to motivating people to take part in government surveys specifically. Some participants were adamant that they would not be willing to take part in any surveys without a financial incentive; others felt the motivation for taking part in government surveys was primarily the potential impact on society.
"They want my knowledge, they want my information. They've got to give me something in return.” (Public Group 4, aged 25+, rural areas)
“In the letter that they explain what it's for, why they're doing it, what the benefits will be, what they're going to do with it. This is an incentive in itself really.” (Public Group 4, aged 25+, rural areas)
There was a belief that incentives could attract different kinds of people from those who would answer otherwise, leading to a more diverse range of opinions being included. This is backed up by many experiments with incentives on social surveys, which demonstrate that incentives (and especially monetary incentives) can boost response among underrepresented groups, including younger people, those from deprived areas, and those from minority ethnic backgrounds (e.g. see Mack et al, 1998; Singer & Kulka, 2002; McGonagle and Freedman, 2017).
Potential solutions
As indicated, incentives are a proven means of increasing response rates. However, introducing universal incentives across the Scottish Government surveys would be very expensive. Differential or discretionary incentives, which target incentives at low-responding groups or use them to avert a hard refusal, may be a more cost-effective approach, ensuring resources are focused on those less likely to respond. In doing so, they may also be more effective in reducing nonresponse bias, which, as discussed, is arguably more important than the overall response rate.
However, survey interviewers who took part in this research noted risks around discretionary or differential incentives, suggesting that they could backfire and have a negative overall impact on response if respondents who are not offered an incentive hear about those who were. This was felt to be a particular risk in rural areas, where people are more likely to talk to their neighbours. Both survey interviewers and the general public raised concerns that introducing incentives to government surveys specifically could risk delegitimising them, or be seen as a waste of taxpayers’ money:
“In a way (not having incentives) adds to the legitimacy, that this is important research, this is government stuff, we’re conscious of taxpayers money.” (SHS interviewers group)
Contact
Email: surveystrategy@gov.scot