Developing a Mental Health Experience of Services (MHES) Survey for Scotland
This report explores and summarises the requirements and options for a National Mental Health Experience of Services (MHES) survey. The MHES would gather regular data on service user experiences and help inform standards measurement, service improvement, policy development and NHS reform.
Chapter Four: Survey content
This chapter details perspectives on possible content in a MHES survey, including:
- Survey length and completion time
- Survey questions
- Demographic information
- Period for respondents to reflect over
It should be noted, however, that several respondents highlighted the challenges of describing possible survey content without knowing which services or service users would be included in the survey.
Chapter summary
Key Findings and Further Considerations
- A Mental Health Experience of Services (MHES) survey is expected to take between 5 and 30 minutes to complete. A shorter survey may be more suitable for some service users, likely resulting in higher response rates and lower costs.
- The survey length must be carefully balanced to ensure that all relevant information and voices are captured.
- There was agreement that the survey should include a mix of closed and open questions, covering key aspects of service use such as:
- Accessing services and support
- Waiting times
- Transitions between services
- Overall experiences
- Survey questions should align with the Mental Health Core Standards. While a wide range of measures could be included, only relevant information should be captured to keep the survey manageable.
- Aligning the MHES survey with other existing surveys was considered helpful, but there is a need to assess whether this could limit the effectiveness of a Scotland-specific survey.
- Key demographic information should be collected if there is a clear purpose for its use. However, asking too many demographic questions could lengthen the survey and discourage participation. Consideration should be given to using demographic data from other sources to reduce the burden on respondents.
- There was no clear consensus on the time period the survey should ask service users to reflect on. The long-term nature of some individuals’ experiences with mental health and services makes defining an appropriate reference period more complex than in other service user surveys.
Survey length and completion time
Some interviewees suggested how long they felt it should take to complete a MHES survey. Suggestions ranged from 5 to 30 minutes, based either on their own service user experience surveys or what they felt was appropriate for a MHES survey.
Participant concentration and retention are important considerations in relation to survey length, and some advocated for a shorter survey to maximise participant engagement. It was suggested that some service users, due to the severity and impact of their mental health conditions, may struggle with focus, or find it difficult to articulate their experiences through a survey. It was also noted that potential survey length and completion time need to be considered alongside which services users had engaged with, how much time has passed since someone engaged with a service, and their readiness and ability to complete a survey about their experiences.
The challenges of balancing a possible desire from some data users to ask many questions with the time it takes to complete the survey were noted, as interviewees acknowledged that a longer survey would likely reduce response rates and, therefore, sample sizes and potentially the robustness of the data. It was suggested that closed questions would enable a faster survey completion time. However, there was recognition of the value of including open-ended, qualitative questions to gather richer details about experiences. It was also suggested that if the survey questions are well-designed, and based on the target groups in question, there may be less need for open questions.
Survey questions
TLB survey respondents were asked what a MHES survey would need to cover to meet their or their organisation’s data needs (see Figure 4 overleaf). The most commonly selected response from the multiple-choice question was accessing services and support (83%). This was followed by waiting times to access services, support and treatment, and the impact of this (79%); moving between services and support (including signposting and referrals) (73%); and overall experiences (73%). Of the twelve options available, ten were selected by more than half of all respondents.
TLB survey respondents were asked a free text question: ‘Please provide details of any specific questions you feel are essential to be included in an ‘Experience of Mental Health Services and Support Survey’, and why they are essential to include’. Fifty-one respondents provided a response, and just under half (24) wanted questions about the experience during the service, followed by experiences of getting to services. Similarly, interviewees detailed a range of topic areas and specific questions to consider including in a MHES survey. However, interviewees noted the need to be mindful of the survey completion time and to limit the survey content to the most relevant and useful questions.
Frequently suggested topic areas often reflected key points in the service user journey and included the following (specific questions suggested under these headings are detailed in Appendix A):
- The process of referrals, including from GPs and the accessibility of services
- The experience of being on a waiting list
- The experience of receiving treatment
- Support provided post-service use
- Questions for family, friends, carers and staff
Some interviewees noted the value of capturing data on service performance to understand how effectively those services are being delivered and whether they meet the intended performance levels. Questions that capture examples of good practice were also suggested. Requests were made to include both qualitative and quantitative questions, in particular qualitative questions to capture experience.
Interviewees detailed considerations for the terminology, length and formatting of survey questions. General comments on the need for careful question phrasing and clarity of meaning were made. One interviewee noted that the use of the terms ‘patient’ and ‘service user’ may be disempowering for some people if they do not see themselves as patients or service users. Another interviewee suggested avoiding complex mental health terms to support clearer understanding of the survey questions. The need for a trauma-informed approach and questions was also noted. One interviewee suggested it was more helpful to focus on negative experiences, as these could be more likely to produce actionable changes.
While participants suggested that a consistent set of questions would support comparisons between services and over time, they highlighted the challenges of using one or similar sets of questions if trying to accommodate the variety in mental health services. If the same set of questions is used, it was suggested that care be taken in phrasing the questions so they are relevant to all respondents. It was also noted that in some instances, the patient experience may differ from the professional perspective (i.e. they may reflect on a negative experience which professionals viewed as the best course of action). One interviewee suggested having customised surveys tailored to individual services, in addition to core comparable questions.
Some interviewees were asked whether the MHES survey questions should align with the Mental Health Core Standards. Many felt that they should align with the Core Standards, but also noted that it may not always be straightforward to link the standards to the questions. It was suggested that survey data could be useful for evidencing the standards, making them less theoretical, and confirming if they are being upheld. For example, a major theme of the Core Standards is self-reported outcomes (i.e. whether an individual feels better or worse after using a service). Consequently, a MHES survey could be a useful vehicle to collect information about experiences which could inform outcomes. It was also suggested that there could be value in aligning the survey with the Scottish Government’s Mental Health and Wellbeing Strategy.
All TLB survey respondents were asked, ‘How important is it that an ‘Experiences of Mental Health Services and Support Survey’ is comparable to existing mental health experience surveys or other metrics used in other parts of the UK?’[1] Most (88%) thought it was very or somewhat important that the MHES survey is comparable to existing mental health experience surveys or other metrics used in other parts of the UK.
Interviewees reflected that while there could be advantages to benchmarking and comparing data to other countries’ mental health surveys, challenges were also likely. For example, having to use the same question set to allow for comparison could limit the usefulness of a Scottish survey, and it may be difficult to compare different mental health systems.
Examples from the evidence review: A range of experiences of services was captured within the evidence reviewed, including: access to services, arrival at services, the views of the mental health team, care and treatment received, discharge arrangements, and overall satisfaction.
The Scottish Government’s HACE and Cancer Patient Experience surveys may have questions that could be adapted for a MHES survey.
Examples of validated scales being used were identified. For example, the MHSIP questionnaire, used in several of the US surveys reviewed, is used across the United States and measures concerns that are important to users of publicly funded mental health services. The Canadian Mental Health Client Experience Questionnaire was also identified as using a validated survey tool.
Demographic information
Most TLB survey respondents indicated that a range of demographic and equalities information should be collected in a MHES survey. Most commonly this included age (92%), chronic health conditions and disability (80%), deprivation (75%), location – rural/urban (68%), race/ethnicity (66%) and work/employment status (66%).
Interviewees were asked whether demographic information should be collected as part of the survey and, if so, what should be collected. Three key areas for consideration were evident in comments from interviewees:
General need for and usefulness of demographic data:
- Exploring differences in experiences by demographic and equality information can be useful in informing service improvement.
- Further positives of collecting demographic data include addressing stigma towards accessing mental health services and understanding who is and is not accessing services.
- The collection of demographic and equality data must be for a clear purpose, and it must be used in the way intended. Only necessary information should be collected.
- The level of demographic information to capture will depend on the survey scope (i.e. if the sample size is too small to allow for comparison between groups, there is less need to collect demographic information).
Ethics and reducing participant burden:
- The collection of demographic and equalities data can be a barrier to participation. Some respondents may not be comfortable sharing personal or private information, and too many demographic questions could increase the length of the survey which may be off-putting.
- Demographic and equalities questions could be optional to reduce barriers to participation. However, for completeness of the data set it would be beneficial to ask respondents all of the demographic and equalities questions but include a ‘prefer not to say’ option.
- If the participants were sampled from an existing data source, some demographic and equality information may already have been collected and would, therefore, not need to be repeated in a MHES survey.
Specific demographic and equality data to collect:
- Suggestions included age (which could be by age band), gender and sex (the phrasing of these questions was raised as being particularly sensitive), ethnicity, geographical information (postcode/SIMD/rural/urban), disability or long-term health condition, neurodiversity, sexuality, religion, and working status.
- Those with experience of conducting service user experience surveys noted they also collected housing/accommodation status, caring responsibilities and other information relevant to their surveys.
Examples from the evidence review: The extent to which demographic questions featured in the surveys identified in the evidence review depended on survey objectives and the required analyses. This included balancing the need for data with the length of the survey. Examples of demographic questions used included age, sex/gender, ethnicity, health conditions, location/rurality, deprivation/income and sexual orientation.
Period of reflection
Seven interviewees commented on the period that survey respondents should reflect over when completing the survey. However, there was no consensus on this point. Key considerations in this decision include:
- The period over which people will be able to recall
- If respondents have had time to reflect on their experiences
- Whether an immediate response to their experience is needed
- Service users wanting to move on and not reflect on their service use experience
- Capturing longer-term outcomes following service use
- The variety of services that may have been used over the period they are reflecting upon
- The inclusion of historical service use in a survey
For interviewees with experience of conducting service user experience surveys, the periods participants were asked to reflect over varied from the point of admission to the point of discharge, the last 12 months, and the last two years. Considerations for the time periods selected included memory recall and participant appetite to complete the survey. For example, if views on specific appointments are needed, then the sooner someone receives the survey the better. Interviewees also suggested specifying the period to reflect over within the survey questions, and potentially doing this upfront to reduce question length. If people on waiting lists are included within the survey, it was felt that the period to reflect over may not be as relevant to them.
However, interviewees also noted that it may be more challenging to determine the period to reflect over for a MHES survey. This was because many service users may have long-term treatment, some engagement with services may not have a clear end, and the effects of engaging with services may not be felt immediately, unlike, for example, a hospital stay following an operation. It was thought that limiting the period of reflection may restrict the capture of people’s history of service use. However, one participant suggested that asking service users to look back further than a year is too long.
Examples from the evidence review: The experience surveys identified tended to ask respondents to reflect on their experience of a service during a specific time period. In most cases (9), respondents were asked about their experience of services during a period of between one and six months (for example, ‘thinking about your experiences of x service between April and July 2023…’).
Contact
Email: socialresearch@gov.scot