Survey nonresponse research: appendices

Appendices to the Understanding Survey Nonresponse Behaviours main report, providing detailed information on each element of the research, including the literature review, analysis of nonresponse data, and qualitative research with interviewers, survey stakeholders, and the general public.


Appendix A: Literature review

Summary of key points

  • Face-to-face survey response rates have been declining for decades. This trend has been observed in both Europe and the US.
  • Many factors have been hypothesised to account for rising nonresponse, but a lack of experimental data means that it is highly challenging to isolate or measure their individual contributions.
  • Several conceptual frameworks are available to help structure understandings of nonresponse, including the Total Survey Error (TSE) framework, Peter Lynn's framework for identifying and addressing risks of exclusion from social surveys, the Conceptual Framework of Survey Completion, and cost-benefit and heuristic psychological theories of survey participation. These frameworks all offer slightly different perspectives on the key factors influencing survey participation.
  • Common reasons given for refusing surveys include being too busy, lack of interest, and privacy concerns. However, these stated reasons may not always reflect people’s true motives.
  • The increased frequency of survey requests, both governmental and commercial, is believed to have led to an increase in “survey fatigue”.
  • Declining civic engagement has been linked to falling response rates. However, it is difficult to verify whether or not civic engagement is actually falling over time, since the main source of evidence for this is survey data. There is also some evidence to suggest that structural and economic pressures may be more influential in explaining declining participation in ‘civic activities’, rather than a general decline in prosocial values.
  • Trust plays a crucial role in survey participation. There is some evidence of declining trust in government, particularly in recent years, which could be contributing to response rate challenges.
  • Perceptions of unsafety, often influenced by local conditions, may lead to increased non-contact and refusal rates in face-to-face surveys. While overall crime rates have not consistently risen, specific groups remain more vulnerable and concerned about crime, potentially affecting their survey participation.
  • Socially integrated individuals, particularly those with higher levels of civic engagement, are more likely to respond to surveys. Individuals whose social networks are less diverse may be more reluctant to participate.
  • Response rates vary by household and individual characteristics. These include: household size (larger households are easier to contact) and composition; working status (and whether household members work from home); tenure (home ownership correlates with higher response); access issues (those living in flats are less likely to respond); urbanity (urban residents are less likely to take part than those in rural areas); sex (men tend to be less likely than women to respond); age (younger adults are less likely to respond); ethnicity (those from ethnic minority groups tend, overall, to have lower contact rates); education (higher education correlates with higher response rates); and personality traits (socially integrated and gregarious individuals are more likely to take part). Although disability is not systematically associated with nonresponse, there is evidence to show that disabled people experience accessibility issues which can prevent them from taking part in surveys.
  • Individual attitudes towards the survey topic and the opportunity to express opinions also play a role. Engaging topics and the belief that feedback can make a difference can increase participation.
  • Interviewer characteristics, such as experience, personality, and confidence can significantly impact cooperation rates. Effective interviewer training, tailoring approaches, and addressing privacy concerns are crucial for maximising participation.
  • The initial interaction between the interviewer and householder is critical. Building rapport, tailoring introductions, and addressing potential "scripts" that lead to refusal are essential for securing cooperation.
  • Leveraging reciprocity – by offering incentives and/or framing surveys as opportunities to contribute to society – can improve response rates. Monetary incentives are generally more effective than non-monetary incentives, particularly for those groups that tend to be under-represented. Non-monetary incentives can increase nonresponse bias.
  • Assuring respondents about data confidentiality and security measures can enhance participation, especially in sensitive surveys. Clear communication about data handling practices is essential.

Background and context

Nonresponse in probability surveys has been a key concern of methodologists and researchers since the early 20th century. Following foundational work by statisticians like Bowley (1913) and Neyman (1934), and the flawed 1936 Literary Digest poll (see Squire, 1988), researchers began focusing on how nonresponse biases survey results and how to mitigate it.

Response rates, particularly for face-to-face surveys, have been declining for decades. This downward trend persisted in both Europe and the US into the 21st century, largely uninterrupted until the COVID-19 pandemic altered the design of many large-scale surveys, making historical comparisons difficult. A recent study of four major European cross-national surveys confirmed a significant, consistent fall in response rates over the first 20 years of this century (Jabkowski and Cichocki, 2024).

Many factors have been hypothesised to account for rising nonresponse. However, the interplay of these factors and the lack of experimental data makes it highly challenging to isolate their individual contributions to nonresponse. This has led the survey methodologist Michael Brick (2013) to state that “Even after decades of research on nonresponse we remain woefully ignorant of the causes of nonresponse at a profound level.”

Despite these challenges, this literature review considers what is known about nonresponse in probability sample surveys. It is intended as a summary of the key themes, rather than as a comprehensive write-up of all the nuance in the nonresponse literature. The main purpose of the literature review in the context of this study was to succinctly identify the key issues in order to inform both potential behavioural solutions and the content of the general public topic guides used in the later stages of the research.

Conceptual frameworks

A range of conceptual frameworks are valuable in orienting discussions of survey nonresponse. From the broadest to the most specific, these include:

  • The Total Survey Error (TSE) framework. Widely used by survey methodologists to evaluate survey quality by categorising errors into two strands: measurement (the accuracy of captured responses) and representation (how well the achieved sample reflects the population). Nonresponse relates to the latter strand. If nonresponse is completely random – that is, unrelated to the survey variables of interest – it reduces the achieved sample size and lowers the precision of survey estimates but does not bias the results. However, if the likelihood of responding is systematically related to the variables being measured, then nonresponse can lead to nonresponse bias, which is far more concerning from a data quality perspective. The relationship between nonresponse and bias is not straightforward. The presence and magnitude of nonresponse bias can vary considerably across survey items, depending on how strongly response propensity is associated with the variable being measured. As Groves and Peytcheva (2008) demonstrated, a low response rate does not necessarily imply high bias, nor does a high response rate guarantee low bias (a simple expression of this relationship is sketched after this list).
  • Peter Lynn’s Framework for Identifying and Addressing the Risks of Exclusion from Social Surveys. Lynn (2024) builds on the TSE by examining barriers to representation at each stage of the representation strand and identifying at-risk subgroups. Lynn categorises nonresponse into three broad groups: noncontacts (where no interaction with the selected individual is achieved), unable to participate (due to cognitive, linguistic, or physical barriers), and unwilling to participate (where an individual actively refuses or passively declines). These categories map broadly onto standard survey outcome codes used in face-to-face fieldwork, with refusals typically forming the largest proportion of nonresponse in UK surveys. Rather than quantifying under-representation, Lynn focuses on structuring the problem: defining which subgroups may be missing and why, and suggesting mitigation strategies to minimise their risk of exclusion.
  • The Conceptual Framework of Survey Completion (Groves and Couper, 1998). This splits factors affecting nonresponse into those beyond the researchers’ control (e.g. the social environment, economic conditions, and characteristics of the household) and those within it (e.g. the survey topic, the survey mode, and the choice of interviewers). These strands culminate in the interaction between the interviewer and the householder which results in an interview or in nonresponse. This framework has informed the subheadings used to organise the literature in the remainder of this appendix.
  • Cost-benefit psychological theories of survey participation, which view individuals as rational actors who engage in cost-benefit analyses when deciding whether to participate in a survey. These include Social Exchange Theory (Dillman, 1978 and Dillman et al, 2014) and Leverage-Salience Theory (Groves, Singer, and Corning, 2000). Social Exchange Theory suggests that individuals are more likely to take part if the perceived rewards – such as altruism, incentives, or personal relevance – outweigh the perceived costs, like time, effort, or privacy concerns. However, this model assumes uniform and deliberate decision-making, which may not reflect the emotional or habitual nature of many refusals. Leverage-Salience Theory builds on this by recognising that different individuals assign different levels of importance – or "leverage" – to different survey attributes. It argues that participation depends on which features are made salient to each individual and that individuals vary in how deeply they process survey requests. This speaks to the need for tailored approaches that target specific motivators across different groups in the population.
  • Heuristic psychological theories of survey participation, which view individuals as acting based on pre-learned response patterns or cognitive shortcuts to save effort. These include Script Theory (Abelson, 1981) and Judgements under Uncertainty (Tversky and Kahneman, 1974). For instance, Groves and Couper (1998) argue that when an interviewer appears at the door the potential respondent quickly attempts to interpret the interviewer’s intent and activates a corresponding “script”, such as that for a salesperson or stranger, and that this can strongly shape the outcome of the interaction. If the wrong script is triggered, this may predispose the individual to refuse participation. Because survey requests are rare, many people lack a dedicated script and instead default to a more cautious or dismissive "generalised stranger" script. This can lead to generic refusals like "I'm too busy" or "not interested".
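
To make the Groves and Peytcheva point concrete, the bias of a simple respondent mean can be written using the standard deterministic approximation that appears widely in the nonresponse literature. The notation here is illustrative rather than taken from any single framework above: n is the issued sample size, m the number of nonrespondents, and \bar{y}_r, \bar{y}_m and \bar{y}_n the respondent, nonrespondent and full-sample means respectively.

\[
\text{bias}(\bar{y}_r) = \bar{y}_r - \bar{y}_n = \frac{m}{n}\,(\bar{y}_r - \bar{y}_m)
\]

Bias is thus the product of the nonresponse rate (m/n) and the difference between respondents and nonrespondents on the variable of interest. A high nonresponse rate combined with a small respondent-nonrespondent difference can therefore produce less bias than a modest nonresponse rate combined with a large difference, which is why a response rate on its own is an unreliable guide to data quality.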

Some researchers have proposed frameworks to reconcile the cost-benefit and heuristic theories of survey participation (e.g. Goyder et al. 2006; Groves, Cialdini, and Couper, 1992). For instance, Goyder favours a two-dimensional framework that covers the depth of decision-making, and the strength of cultural influences on participation. Groves, Cialdini and Couper write that a “full understanding of decisions to participate in a survey requires a theory that integrates the observed influences of sociodemographic and survey design factors, on the one hand, with the less observable impact of the psychological components of the relatively brief interactions between interviewer and respondent, on the other.”

Social environment

The social environment includes background factors such as the economic conditions of the country, the characteristics of the local neighbourhood, and the ‘survey-taking climate’, which can be thought of as encompassing both the public's general willingness to cooperate in surveys and the societal and organisational factors that influence this willingness (Loosveldt and Joye, 2016). Although the social environment is outside researchers’ control, it is important to understand, as it speaks to the types of survey designs, approaches, and interventions that might be effective in reducing nonresponse.

What reasons do people give for refusing to take part in surveys?

Some research has asked potential respondents their reasons for refusing to take part in face-to-face probability surveys. While these reasons have survey-specific elements, their patterns might give an insight into the survey-taking climate.

Brick and Williams (2013) compared reasons for refusal between two large face-to-face surveys in the US conducted 30 years apart (in 1978 and 2008), and found striking similarities, with reasons relating to disinterest and lack of time dominating.

Reasons for early nonresponse in Scotland’s 2022 Census were also explored as part of doorstep data collection to boost response. "Too busy" (35%) topped the reasons for noncompletion, followed by lack of awareness (17%) and not realising it was mandatory (14%). There were age differences: younger people were less aware, while older people cited practical barriers (e.g. needing help or paper forms).

Smith (1984) categorised reasons for nonresponse into situational (e.g. no time, busy, unwell) and ‘permanent’ reasons (e.g., dislike surveys, privacy concerns). Researchers have noted that nonresponse arising from ‘permanent’ reasons is more harmful, as it is less random and so more likely to be biasing (Stoop et al., 2010).

However, while they provide valuable insights, stated reasons may not always reflect respondents’ true motives (Loosveldt and Joye, 2016). Other research, including the present study (see Appendix D), has asked interviewers about their perceptions of the reasons for nonresponse. While interviewer assessments are also imperfect (they cannot always be certain exactly why someone refuses), the patterns they observe in how respondents react on the doorstep are illuminating. In 2023, ONS identified nine barriers via internal qualitative research with interviewers, including trust (e.g., government distrust), time (e.g., "time-poor" groups), language/cultural issues, health/disability, and physical barriers. These highlight systemic issues and are suggestive of potential interventions to improve response rates.

Recent research as part of the Survey Futures initiative has explored the post-pandemic role of face-to-face interviewers (Charman et al, 2024 and Mesplie-Cowan et al, 2024). Based on qualitative interviews with those responsible for managing face-to-face fieldwork operations, the report identifies a range of perceived reasons for increasing levels of nonresponse. These include higher perceived levels of social aversion and reluctance to engage with strangers – attributed in part to a residual effect of the COVID-19 pandemic – as well as growing public concerns about data privacy and GDPR. Declining trust in government, more varied working patterns due to hybrid work, and the perception that survey participation is time-consuming or inconvenient were also cited. The shift to mixed-mode designs was felt to have concentrated face-to-face efforts on harder-to-engage individuals, making face-to-face refusals more common. These accounts reinforce the idea that nonresponse is shaped by both broader societal trends and features of survey design, and underline the increasingly complex and demanding role of the field interviewer.

Some researchers have also investigated reasons why individuals do co-operate with survey requests. For instance, Lavrakas and Kocar (2023) analysed over 1,500 open-ended responses from members of the Life in Australia panel, who were asked why they joined and stayed in the panel. The most frequently reported motivations were wanting one’s voice to be heard, and a desire to influence change or share opinions. These were followed by wishing to represent others like them or minority groups, intellectual stimulation or thought-provoking content, feeling their input is valued, having an interest in the survey topic, receiving incentives, and contributing to research or science. Younger adults were more likely to mention the receipt of incentives, men were more likely to mention a desire to influence change, and women were more likely to mention thought-provoking content.

Over-surveying

Over-surveying, sometimes referred to as “survey fatigue”, describes a growing reluctance to participate in surveys due to their increasing frequency in the UK and globally, driven by both government surveys and commercial feedback requests (e.g. for online purchases). As well as an increase in survey frequency, technological advancements have meant that low-cost surveys can increasingly be created by those with little skill or experience in questionnaire design, which might exacerbate perceptions of survey fatigue (Field 2020).

Leeper (2019) describes this phenomenon as a “Common Pool Resource” problem, drawing an analogy to overfishing. Under this perspective, researchers and survey organisations "extract" responses from the public, but they do not always manage the resource sustainably. Over time, as people feel inundated with surveys, they opt out more frequently, leading to a depletion of willing respondents.

Presser and McCulloch (2011) found evidence of a sharp increase in US government surveys from 1984 to 2004, outpacing population growth. Survey fatigue has been empirically demonstrated by Eggleston (2024), who found a 15-point drop in US Census self-completion rates among those recently surveyed. Survey fatigue can also occur within a survey; Jeong et al. (2022) showed declining data quality within surveys when questions appeared later.

To combat survey fatigue, Field (2020) suggests using simple language, relevant questions, clear communication of value, and brevity.

Social capital and prosocial behaviours

Falling response rates have been linked to societal shifts in civic engagement and community connectedness, as well as trust in institutions (discussed below). These arguments draw on sociological theories. Specifically, Social Capital Theory (Putnam, 1995, 2001) describes “Social Capital” as the networks, relationships, and norms that facilitate cooperation and mutual benefit among individuals and groups. Under the Civic Disengagement Thesis (Putnam, 2000), Social Capital is posited to be eroding over time. This erosion might account, in part, for increases in survey nonresponse.

Repeated cross-sectional survey data can be used to examine overall changes in social capital over time, although these data should be treated with some caution because they are vulnerable to the very phenomena they seek to measure. The Scottish Government has observed a fall in Social Capital in recent years, mainly driven by a decrease in ‘social participation’, although not all dimensions of Social Capital fell (Scottish Government, 2024a). The Scottish Household Survey (SHS) data this draws on points to a marked decline in rates of formal volunteering over the past decade, where formal volunteering relates to structured activities – such as helping out in a charity shop, food bank, or community group – typically organised through an institution (see Scottish Government, 2023). In 2023, just 18% of adults reported that they had undertaken formal volunteering in the past year, down from 22% in 2022 and 26% in 2019. The longer-term trend shows an even steeper fall, with rates dropping by 12 percentage points since 2011 (from 30% to 18%). While these data do not directly identify the reasons for this decline, it is possible that residual effects of the COVID-19 pandemic, alongside broader societal changes, have played a role in reducing formal volunteering rates.

Research by Volunteer Scotland (2025) suggests that economic pressures linked to the cost-of-living crisis are a key factor. Rising living costs have made it harder for individuals – especially younger adults and those in low-income or single-parent households – to commit time and resources to formal volunteering. At the same time, organisations that depend on volunteers face increasing strain, with funding cuts and rising demand for services reducing their ability to recruit, train, and support volunteers. The Scottish Council for Voluntary Organisations’ Scottish Third Sector Tracker – a longitudinal survey that collects data from third sector organisations across Scotland – shows that the proportion of third-sector organisations citing staff and volunteer shortages as a significant concern rose from 35% in 2021 to 63% in 2023 (Scottish Council for Voluntary Organisations, 2023). Reduced organisational capacity could limit opportunities for the public to volunteer, even where willingness remains.

It is noteworthy that, in contrast to rates of formal volunteering, rates of informal volunteering – the direct provision of unpaid help to other individuals who are not relatives – have remained stable. Data from the SHS (Scottish Government, 2023) shows that in 2022, 36% of adults in Scotland reported undertaking informal volunteering, unchanged from 2018. The frequency of informal volunteering has also remained steady. This suggests that while economic and organisational barriers have constrained participation in structured volunteering schemes, broader prosocial behaviours seemingly persist in less formal settings.

Taken together, the available evidence suggests that the decline in formal volunteering in Scotland is more likely to reflect structural and economic pressures than a fundamental erosion of prosocial values. Nonetheless, it is possible that the overall fall in opportunities and engagement with civic activities may have knock-on effects for wider public participation, including in social surveys.

Bearing in mind the inherent challenges of using survey data to assess whether the willingness to participate in surveys is changing over time, one might look instead to administrative data on ‘prosocial behaviours’, which is unaffected by survey nonresponse bias. One might ask whether such data indicates that the public is becoming less community-minded and is engaging in fewer pro-social behaviours over time.

Taking the example of blood donation, a long-established civic activity, there is some evidence of a gradual decline in donor participation over recent decades. Media reports and figures cited in parliament indicate that Scotland’s pool of active blood donors has fallen over time, with donation having fallen by approximately 1% annually, while demand for blood products has risen by about 0.5% per year [1] [2].

However, there is a lack of publicly available data to verify the exact nature of trends in blood donation over time. Moreover, while any such changes might imply a decline in prosocial behaviour, it is important to consider a range of potential confounding factors. First, demographic shifts may have played a role. Scotland’s ageing population, in common with many other countries across Europe, has led to a shrinking pool of eligible donors (Greinacher et al. 2010). In addition, changes to eligibility criteria over time – such as those relating to foreign travel (due to tropical disease risk), tattoos and piercings, medications and surgeries, and more recently, COVID-19-related restrictions – have narrowed the donor base. Rising female employment over time and the growth of dual-income households may also have reduced donor availability during working hours.

Taken together, these factors suggest that the long-term decline in blood donation in Scotland may reflect a complex interplay of reduced convenience, eligibility changes, and demographic shifts, rather than simply an erosion of civic-mindedness.

Trust in institutions

Trust is generally viewed as a critical factor in survey participation, as it influences whether individuals perceive the survey process as legitimate and worthwhile. In societies where governmental or research institutions are highly trusted, individuals are more likely to view surveys as credible and aligned with public interest, thus increasing the likelihood of participation (Hofferth, 2001).

Conversely, in societies or communities where trust in institutions is low, individuals are more likely to be sceptical about the motives behind surveys. This scepticism can lead to higher nonresponse rates as potential respondents may fear misuse of their data or perceive the survey as intrusive or irrelevant. Literature shows that concerns about confidentiality and scepticism about the survey sponsor's intentions can significantly contribute to nonresponse (Groves et al. 2012). Moreover, levels of trust can vary significantly across different demographic groups within the same society, influenced by factors such as past experiences with institutions, media coverage, and cultural background (Silver et al. 2025). As a result, surveys conducted by organisations perceived as trustworthy and transparent are more likely to achieve higher response rates (Crane and Broome, 2017).

Of particular relevance to survey nonresponse, research has found public confidence in official statistics to have increased slightly between 2014 and 2023 (National Centre for Social Research, 2024a). On this basis, the evidence for a long-term weakening of trust in institutions as an explanation for long-term declines in survey participation could be questioned. However, as noted above, cross-sectional surveys are vulnerable to the phenomena they seek to measure. If low trust is a reason for increasing nonresponse, those who do not trust institutions are likely to be disproportionately opting out of survey participation. Survey evidence on trust thus cannot be taken as conclusive in this regard.

Moreover, more recent data indicates that public trust in government in particular has reached a low point in the last few years. For example, the 2024 British Social Attitudes report found that the public was as critical as it has ever been of how Britain is governed, and that a record high proportion ‘almost never’ trusted governments of any party to place the needs of the nation above the interests of their political party (National Centre for Social Research, 2024b). Similarly, the Ipsos Veracity Index has found that, while there is no evidence that trust is declining across the board, politicians remain stranded at the bottom of the ‘trust’ tables, having reached a 40-year low in 2023. As such, declining trust in government specifically could, potentially, be a factor contributing to post-pandemic response rate challenges, particularly on government surveys.

Crime rates

Higher crime rates – or more precisely, perceptions of unsafety, often shaped by local conditions – are associated with increased non-contact and refusal in face-to-face surveys, as individuals may avoid engaging with interviewers due to fears of violence, scams, or general distrust (Groves and Couper, 1998; Durrant and Steele, 2009). For instance, Durrant and Steele examined how neighbourhood-level perceptions of safety in the UK influence survey nonresponse. They found nonresponse to be higher in areas where more residents reported feeling unsafe.

Available data does not support the hypothesis that nonresponse in Scotland can be accounted for by actual changes in violent crime. For instance, data from the Scottish Crime and Justice Survey (SCJS) suggests that rates of violent crime fell from 2008/09 to 2021/22, while property crime also fell over the same time period. Although rates of both increased slightly in 2023/24, they were similar to the pre-pandemic figures for 2019/20 (Scottish Government, 2025). However, although overall property and violent crime has fallen over the last 15 years or so, there are specific groups who are more likely to be victims of crime, including those aged 16-24, those in urban areas compared with rural areas, and disabled adults (Scottish Government, 2025).

UK-wide data from the European Social Survey also shows that the rate of people feeling unsafe or very unsafe after dark has halved from 36% in 2002 to 18% in 2022 (European Social Survey, Data Portal Rounds 1 and 10). Again, however, there are specific groups of people who remain more concerned about crime. For example, the 2023/24 SCJS highlights that women, disabled people and those living in the most deprived areas of Scotland are less likely to feel safe after dark than are men, non-disabled people, and those living in the rest of Scotland (Scottish Government, 2025).

It is possible, however, that nonresponse has been affected by increases in fraud, scams and identity theft, which have become more prevalent as developing technologies create new opportunities for such crime. The SCJS estimated that almost one in 10 had experienced at least one type of cyber fraud or computer misuse in 2023/24 (Scottish Government, 2025). The questions were only introduced in 2018/19, so it is too early to identify trends, although the number of police recorded cases of fraud more than doubled between 2014/15 and 2023/24 (Scottish Government, 2025). Fraud was also the type of crime people were most likely to say they were worried about. People were also more likely to think that fraud might happen to them in the next year, compared with other types of crime.

Social isolation and civic engagement

Social isolation has been explored as a potential contributor to rising survey nonresponse, particularly through the lens of the social integration hypothesis. This posits that individuals who are more socially integrated – through civic, family, or community engagement – are more likely to respond to surveys.

Amaya and Harring (2017), using data from the American Time Use Survey (ATUS) and the Survey of Health, Ageing and Retirement in Europe (SHARE), provide support for this hypothesis. Their study found that individuals with higher levels of civic engagement – such as voting, political discussion, or involvement in community groups – were significantly more likely to respond to surveys. These individuals were up to three times more likely to participate than those with low civic engagement. In contrast, more private forms of integration, such as family or neighbourhood engagement, had minimal predictive power once civic factors were taken into account.

This body of research is echoed in other studies which find that those who are more weakly integrated – socially or economically – are systematically less likely to respond to survey requests (e.g. Schoeni, 2013; Fitzgerald, Gottschalk, and Moffitt, 1998; Fitzgerald, 2011). It appears that it is not simply the presence of social connections that matters, but the public-facing nature of those connections. Civic behaviours may foster a sense of duty or a norm of participation that extends to survey requests, while private or informal ties may not have the same motivating effect.

Further support for the role of social networks in shaping response behaviour – and the argument that it is not only degree of integration but type of integration that matters – is provided by Eggleston and Sawyer (2024), who examined social network structure and nonresponse bias in the American Community Survey (ACS). By linking ACS outcomes with county-level data on Facebook friendship networks (from Chetty et al., 2022), the authors found that areas with more insular and homogenous networks – where people’s friends are also friends with each other – exhibited both lower response rates and higher levels of nonresponse bias. These effects persisted even after controlling for demographic and geographic factors, suggesting that individuals embedded in tightly knit, less diverse social circles may be more reluctant to engage with surveys, particularly when the source of the request is perceived as outside their group.

The household and the individual

Features of the household include the household structure and socio-demographic characteristics of its members. To this can be added features of the specific individual selected for participation. These factors should not be considered direct causes of the decision to co-operate. Rather, they tend to affect the likelihood of making contact and to shape the psychological predispositions that influence the decision to co-operate.

Household-level characteristics

  • Household Size: Larger households are easier to contact (Stoop et al., 2010). While single-person households are harder to contact, they do not necessarily refuse more often once contacted (Durrant and Steele, 2009). In Scotland, Census data shows that single-adult households rose from 19% in 1981 to 37% in 2022, with the average household size falling from 2.75 to 2.12 people over the same period, reflecting societal shifts of an ageing population and changing family structures.
  • Household Composition: Women, the elderly, and households with children have been found to be more likely to be home and to participate (Durrant and Steele, 2009), possibly due to lower workforce involvement (though women’s employment has been steadily increasing over time) or higher social integration. More recent research on the UK Household Longitudinal Study (also known as Understanding Society) has found that households in areas with higher proportions of dependent children were significantly less likely to respond (Xena and Kaminska, 2025).
  • Family working status: Over time, Scotland’s two-adult households have become more likely to be in dual employment, with the proportion of couple households where only one adult works falling accordingly, potentially making it more difficult to contact anyone in the household during the day (Scottish Household Survey). At the same time, the Scottish Household Survey has also found that the proportion of adults who work from home nearly doubled from 16% in 2019 to 31% in 2022. These more recent trends in home working imply that while interviewers might find it easier to make contact with an adult at a household during the day, adults might also have less time available to schedule an interview.
  • Tenure: Homeownership has been found to correlate with higher response rates (e.g. Goyder et al., 1992). In Scotland, owner occupation has been relatively stable over the last two decades: 62% of all households in 1999/2000 and 64% in 2023. Private renting increased as a proportion of all households over the same period from 6% in 1999/2000 to 13% in 2023, though most of this shift happened prior to 2012.[3]
  • Access Issues: Almost a third (32%) of households in Scotland live in purpose-built flats (Scottish Census, 2022). Although the actual proportion living in flats has not increased over time (it was 36% in the 2011 Census), features like video doorbells or gated communities may reduce contact rates.
  • Urbanity: Urban residents tend to be less likely to agree to take part in surveys (Groves and Couper, 1998). This is likely tied to a wide range of factors such as higher crime, higher population density, and, potentially, weaker social cohesion.

Individual-level characteristics

  • Sex: Men tend to have higher nonresponse rates than women. It has been suggested that this might be because women are more often at home, or act as gatekeepers reducing men’s exposure to interviewers (Stoop et al., 2010). However, this explanation does not account for the more recent increase in the proportion of men working from home, which may have impacted on how easy they are to contact, though as discussed above, not necessarily their availability to take part at the time an interviewer first calls.
  • Age: A consistent research finding is that younger adults are less likely to co-operate with survey requests than are older adults (e.g. Glass et al., 2015; Hall et al. 2013; Wells et al., 2024). Older adults may tend to be more cooperative with face-to-face survey requests because they are at home more often (Stoop et al., 2010). But higher nonresponse among younger adults cannot be fully explained by greater difficulties with making face-to-face contact, as younger adults are also less likely to take part in surveys that use non-interviewer-administered modes, such as push-to-web surveys (for example, the Food and You 2 survey, and British Social Attitudes).
  • Ethnicity: Those from ethnic minority groups tend, overall, to have lower contact rates, possibly due to language barriers (Christensen et al., 2014). In Scotland, the proportion of the population from minority ethnic groups (excluding white minority groups) rose from c.1% (1991) to c.7% (2022 Census).
  • Education: Higher education correlates with lower nonresponse. Mulder and de Bruijne (2019) found that higher-educated individuals were more willing to participate across survey modes, while Braekman et al (2022) found that those with lower levels of education were less likely to participate, although face-to-face interviewing mitigates this effect. Surveys may feel like tests, potentially deterring individuals with lower levels of educational attainment (Stoop et al., 2010). There are also likely to be associated issues around literacy as a barrier to response (particularly to self-completion surveys).
  • Disability: Although we were unable to find evidence that disabled people are systematically statistically underrepresented in surveys (possibly because of the correlation between disability and age), various studies have identified accessibility barriers to disabled people’s participation in surveys. An ONS review of inclusivity in data collection found that even where advance letters offer alternative formats, participants were not always clear on whether these would apply to them. Qualitative research with Deaf people, people with visual impairments, neurodivergent people and those with mental health conditions found that all four groups were uncomfortable with the idea of having a stranger in their home, but noted they would be motivated by seeing the benefits to themselves or their communities (reported at ONS seminar for the Social Research Association (SRA), 5/12/24).
  • Personality: Socially integrated individuals are more likely to participate (Groves and Couper, 1998). Krosnick (2005) found that gregariousness boosts survey participation.

Individual-level attitudes

Groves and Couper (1998) explore the exchange hypothesis, which identifies perceived reciprocation or exchange of benefits as a key driver of nonresponse. According to this hypothesis, individuals are more likely to respond to surveys when the perceived benefits exceed the costs. While this overlaps to a degree with some of the ‘social environment’ considerations discussed above, as well as some of the interviewer-level behaviours and techniques which we discuss below, it also includes some elements that relate more to the individual’s attitudes to the specific survey, including:

  • Topic engagement: Participants are more inclined to engage in surveys if they find the topic interesting, especially when they feel their input could impact outcomes related to the subject (Joinson et al., 2007).
  • Opportunity to express opinion: People participate in surveys as a means to express their views, particularly when the topic resonates with their own interests or experiences. This motivation is stronger when participants believe their feedback can genuinely drive significant change or influence decisions (Groves et al, 2000; Singer and Ye, 2011).

The interviewer

Interviewer characteristics can influence the likelihood of respondent cooperation in face-to-face surveys. For instance, Groves et al. (1992) found that observable sociodemographic traits of interviewers shape respondents’ initial perceptions, while the interviewer’s mood can also affect cooperation. There is also some support in the literature for reduced nonresponse where interviewers and respondents share less observable attributes, such as educational backgrounds (e.g. Sun et al., 2021).

Jackle et al. (n.d.) analysed data from a panel of over 800 UK face-to-face interviewers to examine how interviewer characteristics influence cooperation rates. They found that interviewers with more positive attitudes toward persuading reluctant respondents – such as believing these individuals can be convinced and that their data are valuable – achieve higher cooperation rates, particularly among less experienced interviewers. Their research also found that more experienced interviewers consistently secured better cooperation, even after accounting for area characteristics, survey type, and demographics, although the benefits diminished with each additional year of experience. In terms of personality, extroversion was positively associated with cooperation, suggesting that sociable, outgoing interviewers perform better. Conversely, openness and (to a lesser extent) agreeableness showed a negative relationship with cooperation rates, perhaps indicating that overly accommodating interviewers may lack the persistence required to gain cooperation. Notably, the influence of personality traits lessened with increasing experience, implying that training and learned techniques can compensate for differences in natural disposition.

Durrant et al. (2010) also found that interviewers who express confidence in their abilities to persuade reluctant respondents achieve higher cooperation rates, even after accounting for their experience. Interviewers who rejected the idea that some individuals will never agree to participate had lower refusal rates.

Analysing fieldwork data from the European Social Survey, Blom et al. (2010) found that older interviewers were more effective in securing cooperation, potentially due to greater maturity, credibility, and life or interviewing experience. Attitudes again played a key role. Interviewers who emphasised professionalism – presenting themselves as trustworthy, friendly, and believing in the value of the study – were more successful in establishing contact. Furthermore, those with a positive orientation toward persuasion, including the belief that most people can be convinced with the right timing, achieved higher cooperation rates. Interviewers who reported constructive approaches to handling refusals, such as avoiding dwelling on the stated reasons for refusal but instead offering fresh, relevant reasons for participation, also performed better in persuading initially reluctant respondents.

Interviewer confidence also extends to item-nonresponse rates. Friedel (2020) found that on the Survey of Health, Ageing and Retirement in Europe (SHARE), interviewers who were more confident in their abilities to collect income data achieved lower item-nonresponse rates.

Householder-interviewer interaction

When an interviewer makes contact with an individual at a household, the individual’s decision to co-operate or refuse will be shaped by the various aspects of the Conceptual Framework of Survey Completion (features of the social environment, household, survey design, and interviewer). These factors will influence the psychological processes that come into play at the time of the interaction with the interviewer, including whether the individual activates a simple cognitive ‘script’ which dictates their decision almost instantaneously, or whether they engage in deeper processing relating to the topic of the survey and its perceived social value.

Interviewers can be considered to be ‘request professionals’ who, over time, build up a repertoire of approaches and techniques which help them to secure participation. The literature speaks to a number of methods through which interviewers can maximise their likelihood of success at the point of the interaction with the resident. Surveys will often include specific training modules during interviewer briefings to focus on these approaches. Ackermann-Piek et al. (2020) have examined interviewer training guidelines across the European Social Survey (ESS), the Programme for the International Assessment of Adult Competencies (PIAAC), and the Survey of Health, Ageing and Retirement in Europe (SHARE).

Methods that can increase co-operation include:

  • Comprehensive training regimes: Interviewers should be told about the importance of reducing nonresponse. “Train-the-trainer” approaches can be effective, to ensure that training staff are well-versed in techniques to minimise nonresponse. Study-specific training tailored to the survey’s target population and objectives is important; for example, ESS requires briefings for experienced interviewers and additional general training for inexperienced ones. Specific training in refusal conversion has been shown to reduce nonresponse rates (e.g., Cantor et al., 2004; Durand et al., 2006).
  • Tailoring: The importance of interviewers tailoring their approach depending on the features of different areas, households, and individuals is consistently cited across the literature. Generic introductions are to be avoided, as they increase the chance of a behavioural ‘script’ being activated which will lead to a refusal. Tailoring might involve emphasising the relevance of the survey depending on the age of the individuals, or changing one’s tone and speed of speech, as is recommended to interviewers in the SHARE study. Tailored introductions should avoid adherence to rigid scripts, permitting the interviewer to adapt to respondent cues and build rapport (Bradburn, 2015).
  • Identifying mental scripts: Interviewers need to ascertain whether a behavioural ‘script’ has been activated in the potential respondent, and if so, which one it is. For instance, if the interviewer believes that a “salesperson” script has been activated, they must quickly dispel this (e.g. “I’m not trying to sell you anything”).
  • A focus on the first few seconds: The first few seconds of the interaction are a “make-or-break” moment. Small, often unconscious cues from the interviewer – especially voice, tone, and style – can determine whether a potential respondent engages or opts out immediately (Bradburn, 2015). The introduction should be kept brief, should be delivered smoothly and naturally to avoid seeming scripted or insincere, and may use deflection or disarming strategies (like humour or politeness) to defuse reluctance (Morton-Williams, 1993). Tailoring training to optimise this early impression could improve cooperation.
  • Maintaining interaction: More interactions and longer interactions are generally associated with increased chances of co-operation. This means that interviewers should retreat from a household before receiving a ‘hard’ refusal, after which the household cannot be reapproached. It also means extending the length of the interaction on the doorstep, given that the longer the interaction goes on, the harder it is for the potential respondent to summarily dismiss the interviewer.
  • Increasing confidence levels: Interviewers who expect nonresponse may inadvertently evoke it, creating a self-fulfilling prophecy. Training which strives to increase interviewers’ confidence levels is likely to be effective in reducing nonresponse, at both the unit-nonresponse and item-nonresponse levels (e.g. Durrant et al., 2010; Friedel, 2020). Confidence extends to the interviewer’s style of communication. Olson et al. (2016) found that interviewers with higher cooperation rates spoke to potential respondents in a more confident fashion. Their speech had fewer disfluencies and stutters, but they were more likely to interrupt respondents.
  • Refusal conversion techniques: Specific techniques to counter refusals are valuable. These can include reassurances about privacy, time commitments, scheduling, and the impact of the research. These techniques have considerable overlap with psychological techniques of ‘influence’ which we discuss next.
  • Reframing the first contact as a request for an appointment: Bradburn (2015) proposes that instead of framing the initial doorstep interaction as an immediate request to complete the interview, it may be more productive to frame it as an attempt to schedule a time for an interview. This subtle shift reduces the immediate burden and pressure on the potential respondent, and makes the request feel more reasonable and respectful of the potential respondent’s time.

Other methods include those arising from the social psychological literature, and specifically, from theories of influence (e.g. Groves, Cialdini, and Couper, 1992). Reciprocity and privacy concerns are of particular relevance.

Reciprocity

A key tenet of exchange theory, reciprocity suggests that individuals are more likely to respond to survey requests when they feel a sense of obligation to reciprocate a perceived benefit or favour (Cialdini, 2009). This concept can be strategically leveraged to improve survey response rates. Perhaps the most obvious attempt to do so is via incentives. Offering incentives for participation can trigger reciprocal responses, thereby enhancing survey response rates (Dillman et al, 2014). As was noted in our earlier report on Mixed Mode Surveys (Ormston et al, 2024), there is evidence that monetary incentives have a greater effect on groups less likely to take part in research more generally (e.g. younger people, those from deprived areas, and those from minority ethnic backgrounds), which suggests a likely impact in reducing bias (e.g. see Mack et al, 1998; Singer and Kulka, 2002; McGonagle and Freedman, 2017). However, non-monetary incentives can be more effective among groups already more likely to respond, or to specific sub-populations interested in the gift, so may not be as useful in reducing nonresponse bias. Incentives are also particularly effective for surveys with a lower baseline response, and therefore may be less effective for face-to-face surveys compared with other modes.

Becker (2023) found that participants who value reciprocity are more likely to respond promptly when they perceive the survey invitation as a kind gesture, inducing a social obligation to participate. Such strategies effectively tap into the psychological motivation to reciprocate.

Similarly, framing surveys as opportunities for individuals to voice their opinions and contribute to the improvement of services or products can significantly enhance participation rates. When individuals perceive their input as valuable and impactful, they are more likely to engage with survey requests. This is because the opportunity to express one’s views fulfils a psychological need for recognition and validation (Groves et al, 2000). Furthermore, highlighting the potential for respondents' feedback to drive change or improvement increases the perceived value of participation. This approach is linked to self-determination theory, which posits that fulfilling intrinsic needs such as competence, autonomy, and relatedness can motivate individuals to participate.

Addressing privacy concerns

Addressing privacy concerns is another pivotal aspect of mitigating nonresponse in face-to-face surveys. Assuring respondents that their data will be handled with the utmost confidentiality and explaining how privacy is safeguarded can significantly enhance willingness to participate (Couper et al, 2008; Singer and Couper, 2017). This can be particularly important in sensitive surveys where topics discussed may be personal or controversial (Tourangeau and Yan, 2007). Another effective strategy is to demonstrate the security measures in place for data protection. This could include describing how data is stored securely, who has access to it, and how long it will be retained. Offering respondents the option to review and consent to data handling practices can also foster a sense of control and security (Novak, 2014).

Training interviewers to handle privacy concerns adeptly is crucial. This preparation not only reassures respondents but also enhances the professionalism and credibility of the research process.

Other interviewer-related techniques or approaches from the social psychological literature include:

  • Authority: Positioning the interviewer as authoritative triggers compliance.
  • Consistency: Small initial agreements can pave the way for larger requests.
  • Scarcity: Highlighting rarity (e.g., "you are one in 30,000 selected") can boost participation.
  • Social Validation: Noting others’ participation sways respondents.
  • Liking: Rapport and shared traits (e.g., ethnicity, gender) improve likability and disclosure (Sun et al., 2021).

Contact

Email: surveystrategy@gov.scot
