Long term survey strategy: mixed mode research report
Findings from research exploring mixed mode survey designs in the context of the Scottish Government’s general population surveys. The report details key issues, potential mitigations and remaining trade-offs, and includes 21 case studies on relevant surveys.
Appendix C – Topic guide for expert interviews
Introductions (5 mins)
- Check time still suitable and how long they have.
- Introduce self and role in research team – Ipsos working with Professor Peter Lynn, Lead on the ESRC’s Survey Futures project.
- Explain the research:
- Commissioned to inform the SG LTSS
- Focused on understanding pros, cons and possible mitigations associated with different mixed mode options for the three main SG surveys – the Scottish Health Survey, Scottish Household Survey and Scottish Crime and Justice Survey.
- Refer back to the info sheet and highlight key features of the three surveys:
- Briefly, all 3 surveys are large, random probability surveys of the general public
- All 3 use the postcode address file as their initial sample frame
- SCJS then selects a random adult to participate, on the doorstep. SHeS interviews all adults within a household (where possible) and up to 2 children. SHS interviews a householder and then a random adult (who may be different)
- All 3 cover very different topics, obviously. Some key components worth highlighting, in case they are relevant in terms of your views of what works/doesn’t work in different contexts:
- The SHS includes a travel diary, completed during the interview. SHeS includes a food diary, which is now completed online.
- SCJS and SHeS have self-completion sections (for more sensitive questions). SCJS is done via CASI/online. SHeS uses pen and paper booklets.
- All 3 surveys have modular structures, with allocation to different modules/streams determined either at sample level or based on responses to particular questions.
- SHeS includes interviewer measurement of height and weight and other bio measures (for a sub-sample)
- SHS includes a surveyor visit for a sub-sample to assess the condition and energy efficiency of their dwelling (as part of the House Condition Survey component of the study).
- To date, with the exception of the Covid period and an experiment with telephone data collection for the Crime Survey, all three surveys have been conducted F2F, so moving to mixed mode would be a significant departure.
- Aim is ultimately to produce a framework that can help guide SG discussions and decisions re. this, tailored to reflect the specific needs of each survey.
- This will be informed by:
- Initial interviews/workshops with key stakeholders, which are focused on ensuring the research is informed by a clear understanding of your priorities for the surveys.
- A desk-based review of: a) literature on mixed mode surveys and b) a review of relevant surveys in the rest of the UK and elsewhere.
- These ‘expert interviews’ with people outside Scotland who have been heavily involved in mixed mode survey research and transitions
- A workshop with the SG to refine the framework.
- Main aim of the interview today:
- To discuss what you’ve learned from your work on what works and what the key challenges are in mixed mode research and survey transitions, especially:
- Anything you think is particularly relevant to the three SG surveys (either because of topic, previous design, or other features)
- And anything that you are able and willing to share that is not covered in currently published reports – e.g. what you wish you’d done differently, practical issues you think are under- or over-played in the literature, etc.
- We’d also find it very helpful to get your views on any key publications/reports we might not be aware of that would be worth including in the literature review – especially anything that’s not readily available online or isn’t published yet but might be accessible early.
- Pleased you’ve agreed, but taking part is completely voluntary – if you don’t want to answer particular questions, or don’t want your answers attributed, just let us know. We can share a copy of our notes on your interview afterwards and you can flag anything you’d rather wasn’t included (either at all, or in an attributable manner)
- Would like to record interviews, with your permission, so we can listen back and make sure we’ve captured your views accurately
- Will be securely stored by Ipsos, not shared outwith the research team, and destroyed at the end of the project
- With your consent, we may also include quotes in the outputs.
- We would not attribute these to named individuals – e.g. would use ‘Expert interviewee 5’.
- However, we realise that you may still feel you’re identifiable from this – so if you’d prefer not to be quoted, that’s fine.
- We would also like to include a list of organisational affiliations of those interviewed for the research, so that those reading it can judge whether there are any significant omissions.
- We’ll come back to you at the end, and you can let us know whether you’re happy to be quoted and whether you’re happy for your organisation to be listed.
- We’re also happy to share a copy of our notes on the interview before you decide.
- Any questions before we start?
- Ask for permission to record – for ourselves to listen back to, won’t share with SG. Will hold recordings securely and destroy at end of project.
- OBS on – record consent to take part and be recorded.
Their role (5 mins)
- Ask them to briefly describe their role as it relates to survey research
- And specifically, their involvement in designing mixed mode surveys and/or transitioning surveys from one mode design to another? If not clear from pre-interview notes, probe on:
- Which surveys?
- What topics did they cover?
- What population did they focus on? (Gen pop or specific groups?)
- What modes did they move from/to (or were they designed as mixed mode)?
- How did they combine modes (in what ways)?
Learning and reflections from their work (45-50 minutes)
This section will need to be tailored to each interviewee and used very flexibly. For those not directly involved in transitions, or who are considering transitioning a survey but haven’t done so yet, you might need to tweak the way some questions are framed.
- We’re interested in understanding your views, based on your work, on the key challenges and mitigations that can be put in place when transitioning surveys or mixing modes.
- We’re especially interested in anything that might not be in the published literature – for example, because it’s a detailed practical issue that might not have made it in, or because it’s something there was disagreement on, or something that is still being worked through.
- And on practical examples, not theoretical implications.
- Will ask you to reflect on this in relation to key areas we’re reviewing, but also welcome your views on anything else you feel you’ve learned. And if we ask about an area that you feel you have nothing to add beyond what’s already well covered in the literature, feel free to say so.
From the work you’ve been involved in, what have been the key challenges, benefits, and learnings around transitioning or mixing modes relating to:
- Question response patterns and accuracy of responses
- Probe on issues encountered / learnings in general relating to the impact of different modes on question response patterns/accuracy
- And in relation to sensitive questions particularly
- Anything from their work that might be relevant to the types of questions asked on:
- SHeS (health, health behaviours?)
- SCJS (experience of crime?) or
- SHS (very wide-ranging, but includes attitudinal questions, questions about satisfaction with services, travel behaviours).
- Data collection options – for example, challenges relating to:
- Questionnaire length - views on implications of different mode designs for survey length?
- And/or challenges/learnings around transitioning longer surveys, given all 3 SG ones are 40+ minute interviews (+ additional elements, like the surveyor visit or self-complete booklet)
- Structure – the SG surveys include quite complex modular structures / routing at the moment. Anything they can share relevant to this?
- Flexibility to accommodate changes to the questionnaire, etc.
- Implications for sample design options
- How have changes in mode / MM designs impacted on sample size / design in practice?
- Have they been able to achieve bigger sample sizes (for the same resource)?
- Any insights on approaches to managing within household selection when moving away from F2F? Any MM options where this has proved more / less feasible?
- Any insights re. implications for sample boosts (e.g. in specific geographic areas)?
- Representativeness and non-response
- Learnings from their work re. implications of changing mode for representativeness and non-response (both)
- At overall level?
- For specific sub-groups?
- Including any views on sub-groups particularly relevant to the 3 surveys (people with LT conditions, victims of crime, people in different tenures/housing types).
- And specific parts of the survey
- E.g. sensitive questions?
- Anything relevant to biomeasures or external surveys of housing?
- (Key Q) How / to what extent was non-response bias mitigated / overcome?
- Survey quality metrics
- Learnings around the impacts of both mode change and mixing modes on response rates
- In general and for specific sub-groups
- Probe for which groups response rates fall most among and how this is managed
- And experience of using alternative metrics to look at quality or sample bias – what have they used? How were received by stakeholders? What did they learn from this?
- Impacts on trends and breaks in the time series
- Did changes in mode create discontinuities in the time series?
- Across the board or specific measures?
- How did you deal with this?
- What strategies are there for calibrating/adjusting estimates to try and avoid the break? How well do these work?
- What was the impact on the survey(s) – e.g. how they could be used by stakeholders?
- Financial and resource implications of transitioning between modes (short and long-term)
- Learning on extent to which, in practice, changing mode has impacted on costs
- In the short term – during re-design / the move towards transition
- In the long term – once the survey has transitioned
- Probe on any additional costs they might not have anticipated
- And non-financial costs – e.g. commissioner time?
- Use of administrative data
- Any experience / learning around using admin data with, or instead of, surveys?
- Views / experiences of this – challenges, benefits?
- Views on likely future scope to reduce use of surveys and use admin data more, and / or for admin data to help address any of the gaps/challenges created by changing survey mode?
- External credibility
- Experience of external reactions to mode change?
- What did this depend on? What were the key things that seemed to impact on perceived credibility of any modal shift / MM design?
- ‘Future proofing’ and extent to which they think different designs have or have not achieved this
- How do you think about ‘future proofing’ in terms of mode design of surveys?
- Are there any factors that people should be thinking about now but perhaps aren’t?
- Overview and conclusions
- What are the most useful things you’ve learned from your work around MM that you think other people need to know / be thinking about more?
- What, if anything, are the outstanding issues or challenges that you still feel need more work to resolve in terms of managing transitions to MM designs more effectively?
- Are there any key reports – published or unpublished – you have written or are aware of but we may not have come across, that you would recommend including in our lit review?
- Anything else you’d like to add before we finish?
Ending and next steps (5 mins)
- Thank for taking part
- Check they are happy to:
- Be anonymously quoted?
- Have their organisation listed in the report?
- And whether they would like a copy of their notes (record this in the recruitment sheet and note when you have sent these, via Itransfer, password protected).
- Discuss and agree next steps, e.g.
- Anything they’ve agreed to send us
- Or more general consent to get back to them if we have any further questions.
- Thank and close.
Contact
Email: sscq@gov.scot