Survey nonresponse research: appendices
Appendices to the Understanding Survey Nonresponse Behaviours main report, providing detailed information on each element of the research, including the literature review, analysis of nonresponse data, and qualitative research with interviewers, survey stakeholders, and the general public.
Appendix D: Qualitative research with the general public
Summary of key findings
Capability:
- There was a general lack of public awareness about Scottish Government surveys, with some participants directly linking their lack of awareness with their likelihood to take part in the surveys.
- Advertising surveys more widely, on traditional media and social media and in different languages, was suggested to increase awareness.
- The research highlighted challenges in encouraging people to open advance letters for surveys. Some said they would typically open any unexpected letter out of curiosity, while others indicated that letters which did not personally address them by name might often be considered 'spam', and therefore not opened.
- In general, if participants did decide to open the envelope, they felt that there would be a very short window of opportunity to encourage them to read the enclosed materials.
- Those with disabilities and/or those who were neurodivergent raised concerns about the accessibility of the letter, including concerns relating to the formatting, structure and amount and size of text.
- Disabled participants identified several practical barriers to participation in surveys, including the need for flexible scheduling to accommodate fluctuating health conditions, anxiety related to the interview itself and whether interviewers would understand their condition, and health and safety concerns about strangers visiting their homes.
Opportunity:
- Time: The estimated 45 minutes that the Scottish Household Survey (SHS) takes to complete, on average, was considered by some to be too great a time commitment. However, responses to the idea of shortening surveys were mixed. Younger groups tended to favour shorter surveys, while older participants might accept longer surveys if they saw the outcome as meaningful or knew the time investment required in advance.
- There was some evidence to suggest that home working may reduce willingness to engage with an interviewer calling at their home.
- The physical aspect of surveys, specifically the idea of an interviewer visiting homes, was a particular concern for some participants. Suggestions for making the surveys more appealing included offering different modes of completion, such as a self-complete paper or online survey, or the option to complete via video-call. There was a general preference for taking part via video-call (though this may have been influenced by the format of these qualitative interviews).
Motivation:
- Distrust of government was notably higher among some disabled and minority ethnic participants. However, there were mixed views on whether and how trust would impact on likelihood of taking part in surveys.
- There was limited evidence from these interviews to suggest that falling civic engagement is a barrier to survey participation, but more evidence to suggest some people do not see government surveys as something ‘for them’, making them less likely to take part.
- Concerns about data protection and misuse were raised as a potential barrier, particularly among those from disabled or minority ethnic backgrounds, who would want greater reassurances on the purpose of the research and how their data will be used.
- There was evidence to suggest that the frequency and volume of surveys in modern life has led to survey fatigue and a general sense of weariness and aversion towards participating in additional surveys, including for the Scottish Government.
- While incentives did not emerge as a primary motivator for survey participation, some thought that they should be compensated for their time and data. Others did not expect this from government surveys, as making a difference was seen to be an incentive itself.
Background and methods
Using a behavioural lens, qualitative research with the general public aimed to provide further insights into motivations and barriers to participating in government surveys. In this part of the research, the team had three main objectives when engaging with the general public through interviews and group discussions:
- To explore participants’ previous experiences of taking part in surveys, and their views on surveys in general.
- To understand in greater detail participants’ views on taking part in Scottish Government surveys specifically, especially focussing on any barriers to taking part.
- To uncover, and generate ideas for, potential solutions that might persuade participants or people like them to take part in Scottish Government surveys in the future.
Sample
Sampling for qualitative research with the general public focussed disproportionately on sub-groups of the population that tend to be underrepresented in surveys, including the three major Scottish population surveys. These groups were seen as likely to generate the most useful insights. The research team spoke to 43 people via a combination of focus groups, small group discussions and individual interviews. Participants were recruited via a combination of a professional fieldwork agency and third sector organisations that work with specific groups.
Focus group participants were recruited, as far as possible, to target quotas on gender, age, qualifications, whether they were living in flats, and urban/rural classification, to ensure a mix of people with different backgrounds and characteristics, including characteristics known to be associated with lower response rates.
Participants recruited via the professional fieldwork agency were also screened on two further criteria:
- Previous involvement in research (excluding those who had taken part in a research discussion in the past year and those who had taken part in a government survey in the past five years, or two years in the case of those with no or low formal qualifications).
- Likelihood to take part in a 45-minute survey for the Scottish Government (including those who said they were ‘6’ or below on a 0-10 scale of likelihood to agree to take part).
Disabled participants and participants from Black backgrounds were recruited by approaching third-sector organisations who invited individuals to come forward to take part in this research. Organisations that helped the research team in this way included SharpenHer: the African Women’s Network, VoiceAbility and Inclusion Scotland. For these groups, information was collected on previous research experience and likelihood to take part in a 45-minute government survey, but participants were not screened out on this basis, and strict quotas were not applied.
More detail on the characteristics of those interviewed is included in Tables D.1 and D.2, below.
Table D.1: Groups and interviews conducted with the general public

| Group number / label | Recruited via? | Online or face-to-face? | Main inclusion criteria | Total number of people |
|---|---|---|---|---|
| 1 | Professional recruiters | Online | Aged 16-24, deprived urban area | 5 |
| 2 | Professional recruiters | Online | Aged 16-24, rural area | 4 |
| 3 | Professional recruiters | Online | Aged 25+, deprived urban area | 6 |
| 4 | Professional recruiters | Online | Aged 25+, rural area | 6 |
| 5 | Professional recruiters | Face-to-face | People from South Asian backgrounds | 7 |
| A | Third sector organisation | Online | People from Black backgrounds | 3 |
| B | Third sector organisation | Online | People from Black backgrounds | 3 |
| Depths | Professional recruiters | Online | Lower educational qualifications | 2 |
| Depths | Third sector organisation | Online | Disabled people | 6 |
| Depths | Third sector organisation | Online | People from Black backgrounds | 1 |
| Total | All | All | All | 43 |
Table D.2: Characteristics of participants

| Demographics | | Number of participants |
|---|---|---|
| Gender | Man | 16 |
| | Woman | 23 |
| | Non-binary | 1 |
| Age group | 16-24 | 11 |
| | 25-59 | 24 |
| | 60+ | 6 |
| Qualifications | Low/no qualifications | 7 |
| Housing type | Living in flats | 14 |
| Ethnicity | South Asian background | 8 |
| | Black background | 7 |
| Disability | People with disability / long-term health condition | 9 |
Fieldwork
Fieldwork with general public participants involved nine interviews and seven group discussions. Six of the seven group discussions were conducted online, by video-call, with one in-person group in Glasgow with people from South Asian backgrounds. The number of people in each group ranged from three to seven. All of the nine individual interviews were conducted online.
This hybrid approach to fieldwork was taken to include a mix of people from different areas of Scotland, including more rural areas. The decision as to whether a group was run online or in-person was based primarily on consideration of how likely different groups of participants were to be able to engage effectively online, as well as the practicalities of recruitment and participant availability. All participants recruited through third-sector organisations were offered the opportunity to take part online via video-call, or by telephone.
In both interviews and group discussions, the research team used a topic guide informed by earlier stages of the research, including suggestions from the literature review, stakeholders, and focus groups with survey interviewers. The topic guide covered participants’ experience of research, reactions to the process of taking part in a real Scottish Government survey, and ways to make surveys more appealing to them and other people like them. To uncover views on a real Scottish Government survey, including whether participants thought that they would take part or not, the research team used the Scottish Household Survey (SHS) as an example.
Analysis
To facilitate a systematic analysis of the qualitative data collected from interviews and focus groups, the research team transcribed audio recordings from the interviews and groups. The research team developed a coding frame aligned with the topic guide and hypotheses of this research. The data were then systematically summarised and synthesised to identify overarching patterns and themes, which form the basis of the findings presented in this chapter.
The COM-B behavioural model, described in the introduction to the main report for this research, was used to structure and inform hypotheses around the behavioural factors that might explain why people might not take part in surveys. These hypotheses were developed during the earlier stages of the project, drawing on the literature review, analysis of nonresponse data, and group discussions with professional stakeholders, and were further refined following the qualitative fieldwork and stakeholder workshop.
Both COM-B and these hypotheses were subsequently used to help structure topic guides and analysis of the general public groups and interviews, with the aim of providing further detailed evidence on the relevance of each hypothesis as an explanation of nonresponse, and to explore potential solutions. As such, this findings section is structured around the three elements of COM-B (Capability, Opportunity, Motivation), with sub-headings linked to specific issues and barriers relating to each of these.
Capability barriers
Low awareness
Overall, participants’ awareness of the SHS, the Scottish Health Survey (SHeS), and the Scottish Crime and Justice Survey (SCJS) was very low. Most said that they had not heard of the SHS (the survey that was used as a more detailed example in these interviews), and many of those who had heard of any of the Scottish Government surveys were aware of them in name only.
In terms of the impact on nonresponse, some participants did directly link their lack of awareness with their likelihood to take part.
“If you've heard of it before, then you're likely to take part in it because you're familiar with it.” (Group 2, aged 16-24, rural areas)
Participants were more familiar with the census, with some mentioning ‘census’ unprompted when asked what came to mind when they thought of the word ‘survey’ (although by definition, the census is different from a sample survey, since it aims to collect data from the entire population). However, in some cases, there was still some uncertainty about what exactly the census was.
“Census is the main one: just taking the temperature of a community” (Interview 3, neurodivergent participant)
“The word [census] itself resonates something in my brain, but I'm not quite sure exactly what it is.” (Interview 8, lower qualifications)
When asked if they had completed the census, those aged 25 or older tended to say that they had, although some could not recall completing it. Some participants mentioned that they had taken part because they thought it was compulsory or could recall receiving letters stating that they would be fined if they did not complete it, while others had filled it in because they saw it as important for updating population/demographic trends and supporting government decision-making.
Those in the 16-24 age group were generally still aware of the census, even if they said that other family members had filled it out. Those who were too young to have completed the 2022 census, were not in the UK at the time, or were not eligible (for example, because of immigration status or personal/health issues) tended to know less about it. However, in some cases, this did not affect how people viewed the importance of the census.
“Census is very, very important… it’s important for the purpose of budgeting, for resource allocation and all that”. (Interview 7, participant from Black background)
Participants suggested that advertising could make the surveys seem more legitimate, making people less wary about taking part. Advertising on TV, in newspapers, and through social media were all mentioned. It was suggested that running a social media campaign would reach a broad audience, and that a social media campaign may be more “interactive” than other forms of advertising (Group 2, aged 16-24, rural areas). TikTok, Instagram and Facebook were all cited, although there was no clear preference for a particular social media site, just to "get it in front of as many eyes as possible” (Interview 8, lower qualifications).
A few participants distinguished between Scottish Government accounts posting about the surveys, and a sponsored campaign. It was thought that the latter was likely to have a greater impact.
“If it was a post by the Scottish Government, and it came up on my feed as a sponsored post, like ’we’re doing this survey’, and then I got the letter about it, I would think this is actually a thing that I can take part in. It’s not a scam.” (Group 1, aged 16-24, urban areas)
More targeted suggestions included advertising in different languages on social media, and at disability support groups. It was suggested that this may increase awareness among these groups and, in the latter case, therefore reduce the “fear factor” of being randomly selected to take part (Interview 4, disabled person).
“If I'm sitting TikTok and I'm scrolling through all the Punjabi, Urdu ones and that [advert] comes up and it tells me that I can do the survey in Punjabi or Urdu then I'm more likely to look into it.” (Group 7, participants from South Asian backgrounds)
Other suggestions included advertising on bus stops and including QR codes for people to scan to complete the surveys (though this would not work for surveys based on probability sampling, as is the case for the three main Scottish Government general public surveys).
Opening and reading the advance letter
Participants in the focus groups and interviews were asked how they would react if they received a letter in the SHS advance letter envelope. Where possible, participants had been posted a copy of the letter in the SHS envelope in advance, to look at and open during the interview. Where this was not possible, or they had misplaced the letter, they were shown an image of the envelope, followed by a copy of the letter on screen during the group or interview.
Discussions with the general public highlighted the difficulty of encouraging people to open an unexpected letter. While some participants said they would open any unexpected letter out of curiosity, others said they would dispose of, without reading, any post that did not include their name. This would apply to all of the Scottish Government survey advance letters at present, since they are based on an address-based sample (the Postcode Address File), which does not include the names of residents.
"Anything that says, like, the householder or the property owner. I'm a bit like, this is spam mail, and I put it in a junk pile." (Group 1, aged 16-24, urban areas)
A further group said that they place unopened post in a pile to sort through periodically, so would not necessarily have opened the letter by the time an interviewer called. Some participants noted that they tended to open items that looked official, in case they were in trouble for something or had a bill to pay.
Notably, a small minority of participants had misplaced or disposed of the letter they were sent by the research team in advance of interviews. This was the case despite them being informed verbally by a researcher or recruiter that a letter would be delivered to them to use during their interview, and even though the envelope sent to them (in which the advance letter envelope was enclosed to be opened during the interview) was in this case addressed to the named participant.
There were mixed views from participants on the perceived legitimacy of the SHS envelope. Elements of the envelope design that participants mentioned might reassure them or encourage them to open it included the “Scottish Household Survey” in the address, “Ipsos MORI” in the return address, or the logos in the bottom right (including the ‘National Statistics’ logo, which a few participants were familiar with).
On the other hand, some participants thought the envelope looked likely to be spam, a commercial survey, or a survey from a housing society. In particular, some participants were suspicious about commercial or private organisations trying to appear official in order to dupe people into taking part.
“I'm always wary of organisations trying to legitimise how they appear to the public. And they do things like suggestively position themselves in a person's mind that they are somehow official, that they're part of the government, when they're not” (Interview 2, disabled person)
It was noted that the envelope did not include the Scottish Government logo (this is included on the letter but not the envelope). Again, some participants said that they tend to disregard envelopes addressed to ‘The Householder’, while others said they expected the government to know who lives at the address, so would expect the letter to be personalised.
"I probably would not open it. You know, if this is supposed to be some sort of government survey, you'd think that they would have some idea of who lives here" (Group 4, aged 25+, rural areas)
“I think if there was a way to make the actual outside of the letter seem less like spam mail… even if they just had, like, an actual Scottish Government logo on the outside of the letter, up in the top corner or something” (Group 1, aged 16-24, urban areas)
In general, if participants did decide to open the envelope, they felt that there would be a very short window of opportunity to encourage them to read the enclosed materials.
“I think I open most letters and literally 5, 10 seconds make that decision if it's going in a bin or not” (Group 4, aged 25+, rural areas)
Aspects of the SHS letter which participants found appealing, and which might encourage them to read it, included:
- The header “Help improve public services in [your local area]” – this stood out to some, as they felt that it was directly applicable to their lives, and because they were passionate about local issues and wanted to have their voice heard
- The Scottish Government logo made some more willing to take part
- The writing style was felt to be easy to understand
- Participants mentioned being reassured by the reference number and telephone numbers as well as the guarantees of confidentiality.
However, participants also highlighted various features they did not like, or which might put them off reading further:
- The letter was perceived to be very long, and many participants were unsure if they would have enough time (or would want to invest the effort) in reading the whole letter
- When observing participants read the letter, researchers also noted that the letter might not be read in full because it was double-sided and participants did not turn the letter over until much later in the discussion
- For some participants, mentioning that “An Ipsos interviewer will call at your home”, made them feel unnerved:
“The minute I see somebody's gonna call at the house in the next week or so, I think, oh, no, no, no, no, no, no, no, no. I'm not in. I'm gonna hide under the table” (Group 4, aged 25+, rural areas)
However, participants also noted that they would be even less willing to allow someone in their home if they were not expecting them, thus highlighting the importance of the advance letter in notifying them of an interviewer calling.
“My instinct is not to let someone I don't know or expect into my home.” (Group 3, aged 25+, urban areas)
There was a general consensus that the leaflet was more appealing to read than the letter. It was seen as concise, colourful and digestible, with larger font and eye-catching graphics.
Accessibility issues and concerns
Participants with disabilities and/or those who were neurodivergent raised a number of concerns about the accessibility of the advance letter, including concerns relating to the formatting, structure and amount and size of text.
“I probably would want to take part, but I would lose interest in this letter after the first line… the way that it's formatted tells me that they're not interested enough in my opinion, to contact me in a way that's easy for me to process. So that would tell me right away, it's not for me.” (Interview 3, neurodivergent person)
“If I were partially sighted, I would probably struggle to read this. You need bigger font, or perhaps yellow paper, to help partially sighted people.” (Interview 4, disabled person)
Suggestions to improve the accessibility of the letter included increasing the font size, emphasising the British Sign Language (BSL) option, simplifying and reducing the amount of text, offering an audio format, and limiting the letter to one side.
Disabled participants also mentioned concerns about the interview itself, which might impact whether they felt they would be able to participate. This included:
- Concerns about scheduling in relation to their condition (particularly where symptoms are fluctuating) – for example, a disabled participant mentioned that, due to their condition, they suffer from fatigue and might struggle to complete the interview at certain times of day. A neurodivergent participant mentioned that, on some days, they are unable to focus on a task or would need additional time to process the questions. This led to concerns about knowing when best to arrange an interview, and worry that interviewers may not understand their condition.
“If [the interview] was like all in one chunk and it was going to be over an hour or something like that, A: I'd start to get fractious and B: I'd get quite fatigued, so I'd ask them to break it into chunks.” (Interview 4, disabled person)
- Concerns about completing the survey itself, with participants mentioning negative experiences of completing previous surveys.
- Health-related concerns about interviewers entering their home, including whether interviewers would wear a mask, or concerns relating to anxiety.
"I don't have people in our house because I hate having to ask people to wear masks because half the time it ends up in a fight and I’m just done with it." (Interview 3, neurodivergent person)
In some cases, participants felt that they would still be able to take part but would feel more comfortable doing so if they could be reassured from the outset that interviewers were professional, flexible, and would be understanding of their specific accessibility requirements.
Opportunity barriers
Time
When asked for words and phrases that come to mind when they hear the word ‘survey’, concern about the amount of time required was a recurring theme. Phrases such as ‘time consuming’, ‘repetitive’, and ‘how long is it going to take?’ were mentioned by participants across demographic groups.
Disabled people and carers, in particular, noted that they could be particularly time constrained, due to the additional time burden of managing their condition or caring responsibilities. For these participants, taking part in surveys was described as “one thing too many” on top of their existing commitments:
“A lot of my friends have got children with additional support needs and things like that. I mean, it's the last thing they need to do is fill out another form or get interviewed by somebody. Because that is our life. Like, we are literally on a daily basis just filling out forms, phoning companies, phoning hospital appointments. And it's just like, why would you add to that if you don't have to?” (Interview 5, disabled person)
The estimated 45 minutes that the SHS takes to complete, on average, was considered by some to be too great a time commitment. There was also scepticism about whether it would take longer than this in reality, with additional time potentially needed for introductions and small talk, or if the interviewer was running late or needed to spend time explaining questions.
“Making an appointment 45 minutes, I think that's actually quite a long appointment to set aside for something that's not compulsory… We're all busy, we've all got our own things going on." (Group 4, aged 25+, rural areas)
“It's quite a long time. And then it won't just be 45 minutes because they'll be like coming into the house. It'll take time. It'll take more time than just that. It's just too long for me, I think.” (Group 2, aged 16-24, rural areas)
Some participants who had taken part in other surveys said that they were now less willing to take part in future surveys, because of the time burden that had been involved. However, time was not a concern for all participants. As one participant put it, “If I want to do something, I'll find the time." (Group B, Black women).
Participants also noted that their willingness to take part in a survey would depend heavily on the time of day and what else they were doing. There was some evidence that this was influenced by home working, with some noting that they might be in their home during the day but would not want to answer the door to an interviewer if they had work to do.
“You could be working from home, but that's the last time you want anyone to come and visit you because you've got work to do.” (Group 4, aged 25+, rural areas)
In terms of managing time concerns, participants suggested that being told the number of questions, as well as the expected completion time, would help them to envisage how long the survey would take. Others suggested that they would be more willing to take part if they were assured that they would not be asked to take part in another survey in the next five or ten years.
"The only thing I'd say to them is, ‘Is this like jury service where you're not going to come around and speak to me again for five years or ten years or something like that?’" (Group 4, aged 25+, rural areas)
Another suggestion was to offer the option of completing the survey in sections, rather than all at once.
Participants were also explicitly asked their views on shortening Scottish Government surveys, like the SHS. When asked if they would be more or less willing to take part if it was 20 minutes instead of 45 minutes, younger people tended to say that they would be more likely to take part. Shortening the survey even further, with the option of self-completing later sections online, was also considered preferable by some. Views were more mixed among older age groups. While shortening the survey would make some more willing to take part, others understood that there were trade-offs between survey length and the usefulness of the survey and thought 40-45 minutes was “probably about right”.
“You're not going to get the fidelity that you need if it's too short” (Group 4, aged 25+, rural areas)
“I think I'd be more inclined to engage with a 40-minute survey because if I knew it was coming and it was going to be meaningful, I would expect to embed that amount of time in it, a 20-minute thing would make me think ‘this is a bit superficial’.” (Group 3, aged 25+, urban areas)
Social norms about taking part
Earlier stages of this research, particularly focus groups with interviewers and other professional stakeholders, identified a perception that people were less ‘civically engaged’ or had a lower sense of ‘social obligation’ than was perhaps once the case. This perceived decline in ‘civic engagement’ was seen as a potential factor in increased nonresponse.
Perhaps unsurprisingly – given that participants were all willing to speak to a researcher to share their views on surveys, indicating at least a degree of civic engagement – there was limited evidence from the general public qualitative research to support the view that people in general have a lower sense of social obligation or civic engagement. Rather, participants generally said that they would, in principle, be willing to take part in a survey for the benefit of their local area, or to improve public services.
“I would personally do it just because I find it quite interesting and especially at the top, it says help improve public services while your local authority. So, I would want to help towards that.” (Group 2, aged 16-24, rural areas)
Some disabled participants, as well as some participants from Black backgrounds, mentioned that they would want to take part to ensure that their community is represented. However, some participants did state that they were unlikely to take part in surveys that were not directly relevant to them or their interests, suggesting that surveys do need to tap into people’s individual interests rather than a general sense of ‘social obligation’:
“If it was something that maybe didn't directly, again, relate to me at this point in time in my life, I would be less likely to spend time on it." (Group 2, aged 16-24, rural areas)
As discussed later (under hypothesis 11), there was more evidence to suggest that participants were less likely to take part because they did not believe that their engagement would, in reality, lead to positive change. Taken together, this suggests that the barrier may be less falling civic engagement or declining social obligation, and more a lack of faith in either government in general or the efficacy of surveys specifically to influence positive change.
There was more evidence from the general public qualitative research to suggest that some people do not see government surveys as something ‘for them’. When asked about what types of people typically agreed to complete Scottish Government surveys, some participants suggested people they felt were different from themselves, such as those who were more extroverted, more informed, more intelligent, or older.
“A lot of young people like from the ages from like 16 to kind of like my age, don’t take part in surveys. A lot more informed, intelligent people take part in surveys, maybe that are younger … but a lot of them that I know, people that are in an older demographic for the ages, maybe like 30 and a wee bit above that, they’re the ones that are actively taking the surveys. It’s mostly those kinds of people that are taking them, not really younger people.” (Interview 9, lower qualifications)
Among disabled participants, and those from Black backgrounds, there were also some who suggested that surveys tended to be completed by those who are already more “used to being listened to” (Interview 3, neurodivergent person).
Motivation barriers
Ease of opportunity for refusal
There was limited evidence from the general public qualitative research to support the hypothesis – identified in the stakeholder groups – that mentioning the option to ‘opt out’ in the advance letter would increase office refusals (though this was not something that was probed on specifically). When looking at the SHS advance letter, one participant said that they thought that the ‘opt out’ option was “reasonably attractive” (Group 4, aged 25+, rural areas). However, they did not indicate that they would necessarily opt out on this basis, but rather that they liked to know that they had the option to do so, rather than being “hounded” to complete the survey.
‘Social aversion’ and shifting expectations
Across groups and interviews there was evidence to suggest that some people are uncomfortable with having strangers in their home, making them less likely to agree to taking part in a face-to-face in-home survey with an interviewer.
When reading the SHS advance letter, there was a strong feeling of surprise among participants that an interviewer would need to call at their home to complete the survey. As discussed earlier, some were unnerved by the statement in the advance letter that an interviewer would call at their home. Some (particularly, though not only, younger age groups) were unclear as to why the survey could not be completed online. Concerns about the sustainability of interviewer travel were also mentioned.
“Why are they coming to my door? This feels like a bit of an antiquated system in the sense that that's a lot of petrol for somebody to be driving to somebody's house…I feel like there's much, much quicker, easier ways of communicating with people these days” (Interview 5, disabled person)
"I'm actually quite a private person and having a stranger in my house for 45 minutes asking me questions just makes me feel uncomfortable." (Group 4, aged 25+, rural areas)
However, at the same time there was some understanding that a face-to-face approach may help persuade more people to take part as “it's harder to say no to a person than to write on a piece of paper” (Group 1, aged 16-24, urban areas). It was also noted that it could be easier to complete the survey if respondents could ask an interviewer when they were unsure about anything.
Participants described personal safety as a major concern in relation to letting interviewers into their homes, especially among women and those who lived alone. Some felt that they might feel a little more at ease with an older, female interviewer.
“If I don't have an appointment with you and you show up at my door, I won't open my door… For my own safety, I'm not going to open my door.” (Interview 7, Black background)
"I don't think I'd really be comfortable for someone being inside with me for 45 minutes if I don't know that person. Especially, and not to be rude, but especially if it was a guy, I wouldn't be comfortable for him to be in with me for 45 minutes." (Group 1, aged 16-24, urban areas)
Concerns were also raised around a perceived need to be hospitable and “host” the interviewer. This included worries around cleaning their home, having to make hot drinks, and ensuring there were no issues with pets or pet allergies. In some cases, participants’ comments highlighted a lack of confidence in ‘norms’ around having a stranger such as an interviewer in your home.
“It could be awkward, I wouldn’t know how to react - do I have to offer them tea?” (Group 2, aged 16-24, rural areas)
Although it was difficult to ascertain whether people’s feelings about this had changed over time, comments indicated that for some people, the COVID-19 pandemic in particular had changed both how likely they were to have people in their home, and how they felt about this:
“Covid has very much changed how I feel about my home. … I have got used to no one else coming into my flat. Really used to it. And quite anxious about anybody actually being inside my home.” (Group 3, aged 25+, urban areas)
Participants’ spontaneous suggestions for making the surveys more appealing included offering different modes of completion, including a self-complete paper or online survey (although some participants noted that they struggle to fill in online forms and surveys), or completing it via video-call.
It was also suggested that people might be more willing to take part face-to-face if there was an option to meet the interviewer in a neutral, third space outside of their home, such as a coffee shop.
"I think the option to meet at an independent location should be an option, if possible, because you may not want strangers in your house […] You still want to take part, (but) it's on your terms." (Group A, Black and disabled participants)
When asked directly as to whether having the option to take part online or by telephone would make the survey more appealing, participants generally said that it would.
Video-call tended to be the most preferred option – it was seen as a “more modern, more up-to-date way of having a discussion” than face-to-face (Group 3, aged 25+, urban areas). It was also observed that offering the survey via video-call would save interviewer travel time and costs, as well as participant time, and that it would have the benefits of being face-to-face, without the drawbacks of needing to have someone in your “physical space”. However, there was also a sense that having more options was always preferable where possible, and each option might suit different people better. For example, participants with anxiety, agoraphobia or who were neurodivergent said that they would prefer to complete the survey on their own. Some participants who were neurodivergent and/or had mental health issues explained that they found telephone calls difficult because they were unable to see who they were talking to.
It is also worth noting that the preferences expressed in this research may have been shaped by the fact that in most cases, people were participating in interviews by video call. Take-up of video interview options on Scottish and UK government surveys has, to date, tended to be low when offered. However, while qualified, these findings perhaps suggest it may be worth continuing to explore this further, as more people (including interviewers) become comfortable with video call technologies.
Trust in government
Interviewers who took part in focus groups for this research mentioned that potential respondents sometimes cited negative views of government, either in general or of the current Scottish Government, as a reason for not wanting to participate in surveys. General public participants expressed mixed views on the Scottish Government, and mixed views on whether and how this would impact on their likelihood of taking part in surveys. Among those who were less trusting of government, low trust per se was not necessarily a barrier to participation.
“I'd still give them an opinion and tell them I don't trust them” (Interview 8, lower qualifications)
However, a lack of faith that the government would act on the findings of the survey (discussed further below), was cited as a key reason why people might not want to take part. There was also notably higher distrust in government among some disabled and minority ethnic participants, linked to a perception (discussed further below) that data collected from minority groups has in the past been “weaponised” against those groups.
At the same time, knowing that the survey was government-led could have a positive impact, as it made the survey seem more legitimate, and gave the sense that the government had the power to act on the results.
“[Knowing that it was a Scottish Government survey] would make me more likely to take part because I feel like the Scottish Government have more authority to make changes based on the responses that they get from people taking the survey. So, it has more potential for widespread good changes and policies to be put in place.” (Group 2, aged 16-24, rural areas)
Beliefs about whether taking part makes a difference
Interviews and focus groups provided strong evidence of participants’ concerns that taking part in surveys would not make a difference or lead to meaningful change. When asked what words came to mind when thinking about the word ‘survey’, participants mentioned words such as “box-ticking” and “futile”. Some participants suspected that decisions had already been made before surveys were run, and thought local and national governments use surveys to appear as though they are listening.
“I've seen a couple of this government surveys and like brand surveys and surveys in general, I see them as quite useless, in my opinion, because you're asked to give your opinion and you're asked to give your perspective, and I would willingly give it. I'd willingly share my opinion. But in the end, I don't really feel they'd make change.” (Group A, Black and disabled participants)
“I've started to fill them in and thought they already have made their mind up what they're going to do. So I just don't bother. I just feel it's a waste of my time.” (Group 3, aged 25+, urban areas)
These concerns were especially pronounced among disabled participants, and those from Black backgrounds.
“What’s the point of me answering their questions if it’s not going to make any difference?” (Group B, Black women)
"Ordinary people who are on the breadline, who are suffering, who've taken part in surveys, and told the government what they think, and nothing gets done" (Interview 1, disabled person)
When provided with the SHS advance materials, participants liked the reference to the survey findings impacting on public services in their area. However, some felt they had not seen tangible change over the period the SHS has been running, which again made them question the value of taking part:
“It does say here that [the SHS] has been going since 1999… If the end game is to use that information to make policy, then they've not done a very good job so far in the last 26 years” (Group 7, South Asian backgrounds)
However, others were more hopeful that taking part would make a difference, and a few said that they had taken part in surveys that they thought had led to positive change. Those who recalled taking part in the census generally understood its importance as a means of measuring demographic change, rather than as something that directly impacted on their everyday lives.
When asked specifically about whether telling people more about how the surveys are used and their impacts would increase their likelihood to take part, there was general consensus that it would.
Suggestions for increasing the perceived impact of surveys included seeing how the findings have been used in the past, and having survey findings shared with participants to form a “feedback loop” (Group 3, aged 25+, urban areas). In some cases, participants said that understanding the impact was more important to them than a financial incentive.
“I think one thing that could improve people participating would be to start with explaining why the survey is being done and how the results will benefit the people that are doing the survey" (Group B, Black women)
“I think obviously like if they were getting paid for it, more people probably would [take part]. But I think more of the information being pushed to the people about how it's actually going to help us and how if we're answering these questions and there's more people and there's more information and there's more chances things being better for us” (Interview 9, lower qualifications)
However, there were different views among the general public on what impacts were positive, with some participants disagreeing with specific policies mentioned on the SHS leaflet, such as free bus travel for those aged under 22. This was linked to a distrust in the government to make the right decisions based on survey data. Therefore, while participants were keen to see that taking part would have an impact, they also wanted these impacts to align with their beliefs.
There was also evidence that some participants felt that surveys were not relevant to them. When asked about surveys that they may have previously completed, some participants mentioned that they had not been able to get their views across because of the way that questions were worded, because questions were not relevant to them or their circumstances, or because the surveys used only closed questions. This had angered participants and made them feel that the surveys had been engineered for pre-determined outcomes, making them less likely to take part in future surveys.
Generally, participants suggested that they would be more willing to take part in surveys that they perceived to be relevant to them. Participants suggested the topics needed to be relevant to them or to people they know, their interests, their local area, or their circumstances.
“[Surveys that are more appealing] are relevant to myself and my age group, although I do think things that will impact my grandchildren and my own children, I would quite willingly take part in things that I think would have a knock on effect for them, but I don't think there's any sort of subject that I wouldn't have an opinion on” (Interview 9, lower qualifications)
“Recently we've had a few up here where it's been related to ferries and air transport, which are quite important issues around here just now and people tend to be more motivated to join those because…there's the impression that they can shape the service” (Group 4, aged 25+, rural areas)
While participants generally liked seeing how survey results are used to inform specific policies, some participants felt that the policy outcomes were not always relevant to them or their lives. For example, some participants in rural areas felt that the examples of policy outcomes included in the SHS leaflet did not apply to them. In these cases, participants suggested that examples could be tailored to their local area, rather than what was perceived to be a focus on the Central Belt.
"I just thought, what does that have to do with anyone who lives in the country? Absolutely nothing" (Group 4, aged 25+, rural areas)
It was also suggested that more of a focus on community would make government surveys more clearly relevant to peoples’ lives and make people more willing to take part.
“Bring it down to people's level, they will understand. Oh, this is about us. It's about me. I believe they are likely to be more interested than if we just call it 'government'." (Interview 7, Black background)
Concerns about data misuse
Concerns about how their data would be used were spontaneously raised across the general public interviews as a potential barrier to taking part in surveys. Overall, unless the data collected was particularly personal or sensitive, participants were not overly concerned with sharing their views. However, some participants expressed anxiety about not having control over their data once collected, and having to rely on assurances that confidentiality and anonymity would be maintained. On the whole, disabled participants and those from Black backgrounds expressed stronger concerns about the potential for their data to be misused, linking this with examples of misuses of data (not limited to survey data) collected from their communities.
“History hasn't been the best friend of collecting information of minorities in any capacity… Yes, it can be used positively, but the reality is that that information can be weaponised.” (Group A, Black and disabled participants)
“During the start of Covid we found out that autistic people, for example, were being added to DNR lists without our knowledge or consent... And obviously I recognise that surveys are anonymised and that sort of thing. I get that. But disabled people are very used to having our disabilities weaponised against us.” (Interview 3, neurodivergent person)
Although these participants indicated that they might still take part, they would want reassurance about the purpose of the research and how their data would be used, and could be more hesitant to share personal or sensitive information with an interviewer.
At the same time, disabled participants and those from South Asian backgrounds also expressed views on the potential positive outcomes for their communities as a motivation for taking part in surveys. For example, some disabled participants mentioned that they would never turn down a survey about their specific condition, in case it results in new treatments, a change in policy, or more clinical research. Women from Black backgrounds stated that, even if they were not confident that their responses would directly benefit them, participating in research was their contribution to building a positive future for their children, or for other minority women.
“Survey overload”
Participants indicated that they are being asked to take part in numerous surveys in their everyday lives, making them feel as though they are being asked to give feedback on “almost everything that you do” (Group 4, aged 25+, rural areas). There was a perception that the number of surveys has increased dramatically in recent years, particularly since the widespread adoption of the internet, and that this made people less willing to take part.
"you reach a kind of saturation point or kind of survey fatigue where … you can't distinguish between what's important and what's not important. And just because someone tells you it's important doesn't mean it is. So if you’re constantly overwhelmed with surveys, you lose the ability, I think, to discriminate." (Group 3, aged 25+, urban areas)
"We all got spam mail back in the day. But I think now there's just no end and it seems limitless … I think the market's oversaturated and that brings resentment" (Group 7, South Asian backgrounds)
Generally, participants suggested that they understood the difference between government surveys and other surveys in principle, with some participants giving examples of government surveys that they had previously been invited to complete. However, in practice, as described above, it was not always clear that participants would recognise that the SHS was a Scottish Government survey from the envelope and advance materials.
Participants were asked if changing the description of government surveys from ‘survey’ to ‘an important government study’, or similar, would make them more or less likely to take part. There was some evidence that this might help some participants distinguish between government surveys and other surveys, but others thought that calling it a government study was vague or preferred wording that specifically related to their local area or community.
“[Changing the description of the survey] might help with the differentiation between a survey from the coffee shop on feedback and research that is going to change the quality of your community life. There might be something about the wording, but because I think the strength of the word ‘survey’ is being weakened by overuse.” (Group 3, aged 25+, urban areas)
Expectations of reward
As discussed in the literature review chapter, financial incentives have been shown to have a positive impact on response rates, particularly among underrepresented groups (e.g. young people, those in deprived areas). In general, financial incentives did not feature strongly among participants’ spontaneous suggestions for improving response. However, when prompted, mixed views emerged on whether participants thought government surveys should offer incentives and what impact doing so might have on their likelihood of taking part.
One view was that it was only fair that they should be compensated for their time or their information, and that they, or people that they knew, would not take part unless there was a financial incentive.
"The time that I was supposed to be resting or the time that I was supposed to be attending to my child or the time that I'm supposed to be doing my household or doing my work. Then how am I going to get compensated for that time?" (Group 5, South Asian backgrounds)
"They want my knowledge, they want my information. They've got to give me something in return.” (Group 4, aged 25+, rural areas)
Another view was that, if the survey was optional, being able to have their say and potentially making a difference was enough of a ‘reward’ for taking part.
“In the letter that they explain what it's for, why they're doing it, what the benefits will be, what they're going to do with it. This is an incentive in itself really” (Group 4, aged 25+, rural areas)
One participant suggested that being thanked for their time was enough, but if they were “not thanked at the end, even just a page that says thank you, I’m off, not doing another one” (Group 3, aged 25+, urban areas).
The monetary values that participants suggested would be reasonable incentives for Scottish Government surveys ranged from £20 to £70. However, this may have been influenced by the incentive that participants were receiving for taking part in the group discussion. (Participants received vouchers for £40 to £60, depending on the length of the interview or group discussion, and whether it was in person or online). There was also some concern that high value incentives would be unaffordable for the government, and wariness about using taxpayers’ money for incentives, with some suggesting they would perhaps take part for less than these amounts.
There was no single type of incentive that was universally popular. Suggestions included vouchers, BACS transfer, cash, money off their household bills, or the chance to win a prize in a lottery or raffle. Those in rural areas suggested that they are not always able to spend vouchers in nearby shops, which makes them less appealing.
Participants also discussed further benefits and drawbacks of incentives. These included a suggestion that offering incentives would attract different types of people from those who would answer otherwise, leading to a more diverse range of opinions being represented. However, there was also a concern that incentives could lead to participants providing answers that they thought the interviewer would like to hear, rather than their true beliefs.
“I do feel like if you offer a high incentive, then, I don't know if this is right, but it might make you feel like you need to give sort of correct answers or answers that they want to hear more because they're giving something to you... So it might create a little bit more bias.” (Group 2, aged 16-24, rural areas)
Contact
Email: surveystrategy@gov.scot