Developing a Mental Health Experience of Services (MHES) Survey for Scotland
This report explores and summarises the requirements for, and options for, a National Mental Health Experience of Services (MHES) survey. The MHES would gather regular data on service user experiences to help inform standards measurement, service improvement, policy development and NHS reform.
Chapter Five: MHES survey administration
This chapter presents an analysis of views on different aspects of how a MHES survey could be administered, including:
- Survey frequency, time of year and length of time survey should be open for.
- Data collection methodology.
- Survey distribution.
- Use of incentives.
Chapter summary
Key Findings and Further Considerations
- There was a preference for conducting the Mental Health Experience of Services (MHES) survey at a fixed point in time rather than continuously. An annual or biennial survey was generally considered sufficient to monitor significant changes and would be less resource-intensive.
- The survey should avoid being conducted during holiday periods.
- While there was no clear consensus on how long the survey should remain open, achieving a good response rate was considered essential.
- A primarily online survey is expected to be the quickest and most cost-effective method. However, alternative formats should be considered to improve accessibility for those who prefer not to complete the survey online.
- Support may be needed for different survey methods, such as:
- Online or telephone assistance
- Translation services
- Help with understanding or completing the survey
- A variety of distribution approaches were suggested, but there was clear concern about surveys being distributed directly by practitioners. The chosen distribution method will depend on factors such as:
- The services included
- Required response rates
- Accessibility of service user contact details
- Incentives may help improve response rates, especially among groups with historically low participation. However, their use would depend on available resources.
Survey frequency
Most TLB survey respondents preferred a MHES survey conducted at a fixed point in time, rather than continuous data collection. While 25% preferred a continuous approach, 38% thought it should be run annually, 20% felt every two years would be sufficient, 12% thought every three years and 4% every five years.
Interviewees suggested a range of frequencies, from continuous data collection to every 6-12 months, every year, or every two years or less frequently. Some suggested continuous data collection could better support ongoing service improvement. They felt this approach could allow for surveying after each engagement and a focus on real-time data, allowing people to leave feedback at a time suitable for them and enabling timely identification of improvements.
A survey every 6-12 months was suggested by a few interviewees to provide more detail to support specific service improvement and monitor changes in performance over time. Others felt annual surveys would still allow data users to see trends over time and would be more useful than a less frequent survey.
A few TLB survey respondents and interviewees with knowledge of conducting service experience surveys detailed their reasons for running surveys every two to five years. A key consideration for surveys being less frequent is that it reduces the resourcing requirements; these respondents noted that surveys are long, complicated and costly to run, and therefore doing so less frequently reduces the cost. One interviewee and TLB survey respondents expressed the view that if the survey has been designed effectively, this frequency should still allow for changes occurring between survey waves to be evident in the data.
Examples from the evidence review: The surveys identified were most commonly annual. One was every two years, and there was a range of other frequencies, including one-off surveys and some that were occasional/without set patterns of occurrence.
Time of year
Most interviewees were asked about the best time of year to conduct the survey. Most frequently, they suggested avoiding the Christmas period and the school holidays, as surveys conducted at these times typically have a lower response rate. Other suggestions for the best time to run the survey included:
- Running the survey at the same time(s) of the year each year.
- Aligning with the financial year to allow data to feed into reviews and planning.
- Aligning with other cycles of work undertaken by services, though no specific times of year were mentioned.
Length of time survey open for
Some interviewees were asked how long a MHES survey should remain open. A range of suggestions were offered, from three weeks to six months. The reasons given for these timeframes included:
- Keeping the survey period short to encourage prompt engagement.
- Acknowledging that most people take holidays of around two weeks, so a lengthy survey window may not be necessary to accommodate absences.
- Considering how response rates typically change over time.
- Ensuring the survey remains open long enough to achieve the desired response rate.
Interviewees highlighted the value of sending reminders to encourage completion of the survey, allowing a few weeks between each reminder. There was also a suggestion to have a soft and hard close to the survey to account for late responses, though this would not be relevant if data collection were continuous.
Examples from the evidence review: The surveys explored in the review tended to be open for a substantial period of time, to give people time to respond. The fieldwork windows ranged from six weeks to six months, with one outlier of three days (the 2022 Access to Care Survey in the USA by the National Council for Mental Wellbeing).
Data collection methodology
Interviewees’ key considerations for which data collection methods to use related to survey accessibility, removing barriers to participation, the availability of resources, and the number of questions and volume of data to be collected. Four out of five responses to this question recommended using a mixed-method approach.
The advantages of online surveys were noted. They are generally considered to be more cost-effective, quicker to distribute and collect responses from, familiar to most participants, and they eliminate the need for manual data entry. However, several accessibility challenges were highlighted. In particular, interviewees pointed out that online surveys require access to Wi-Fi and suitable technology, which can present issues related to digital inequality. Some individuals may struggle to complete an online survey, and it may be especially difficult for those with cognitive or visual impairments. One respondent stated that a higher proportion of people with severe illness do not complete online surveys. There was a concern that an online survey could risk exclusion and skew the sample and survey results. It was also suggested that people should be able to complete the survey over multiple sittings.
Only a few interviewees commented on the possibility of using text/SMS or mobile surveys, although some NHS24 and NHS Inform services send text messages with links to web surveys to service users. Similarly, an example was given of invitational links being sent to a short Care Opinion survey, which includes a link to an optional longer survey at the end where the respondent can provide more detail. Another interviewee highlighted possible challenges around the collection of mobile phone numbers and queried how widely these might already be held within health services.
Given concerns about online surveys, interviewees described the potential role of paper surveys to support accessibility and improve response rates, i.e. a MHES survey could be primarily online but supplemented by a paper questionnaire for those who could not or would not complete online. However, the need to consider the font, size and spacing of text on paper surveys was raised. More broadly, one interviewee suggested using face-to-face survey methods to extend the reach of the survey and to have focus groups with communities. They felt this would give a more complete picture of service users’ lives.
Two interviewees noted the availability and use of helplines to support existing surveys. Respondents can use these to help them complete online surveys, request paper copies or large print versions of the survey, or request information more generally. Another interviewee suggested that respondents could have someone sit with them to help them complete the survey.
Examples from the evidence review: The surveys analysed in the review used a range of methods, with the most common being online (20) and paper surveys distributed by post (17). There were also examples of telephone surveys (11), paper surveys distributed in person/at mental health settings (7), face-to-face surveys (5) and text messages/SMS surveys (2).
Online surveys were noted as being the quickest and most cost-efficient for distribution and return, but there could be access and digital literacy issues. Ease of return is crucial for successful paper surveys, for example by providing freepost return envelopes and/or convenient drop-off points.
Many used a combination of methods to reduce barriers to participation. For example, surveys may be ‘push to web’ – that is, they are primarily available online, and paper copies may be sent if requested or if there is no response within a specified timeframe. Some surveys offered a telephone helpline for people to request support with completing the survey.
Survey distribution
Interviewees were also asked how a MHES survey could be distributed. Depending on the data collection approach being used, a range of options were discussed, each with its own advantages and disadvantages.
For an online survey, sending the survey via email or text, using weblinks with access codes, or using QR codes to scan with phones were suggested. One interviewee suggested having QR codes on waiting room posters.
For a paper survey, suggestions included sending letters to potential survey participants, triggered by hospital administration systems, and handing them out at clinics. However, interviewees also noted that because there is a wide range of mental health services, this method may be costly and challenging to implement. The advantages of issuing the survey in person are that it would be relevant to respondents as they would have just attended an appointment, it may help to reduce the risk of unintended disclosures, and it removes the need for participant contact details to be shared. However, interviewees also noted that some respondents may not want to complete the survey immediately after their appointment, as they may prefer to leave the service setting; this creates challenges around how the survey can be returned. It was also suggested that in-service delivery modes for a MHES survey could place further strain on already stretched services. Interviewees also raised the risk of unintended disclosures if the survey is posted to respondents and found by household members.
Concerns were raised that if the survey is distributed by practitioners, it may not allow for honest respondent feedback, with a subsequent risk of influence or bias. One interviewee stated that it may be more effective for an independent research company to distribute the survey rather than sending it directly from a doctor, service, or the NHS. This highlights the data protection challenges that could arise at all stages and the need to be clear about how personal, health and survey data are being collected and used.
Some interviewees suggested working with third-sector organisations to help distribute the survey and increase participation. These organisations were seen as having access to groups with lived experience, advocacy groups, and existing peer networks. One interviewee also suggested that peers and other researchers could carry out interviews or focus groups with service users. However, distributing the survey through networks could make it more difficult to link the responses with existing health data. It may also result in a less clearly defined sample, and there could be challenges in reaching areas where third-sector organisations are less active. On the other hand, this approach could help protect participants’ anonymity and reduce potential issues that might arise if the survey were distributed by practitioners.
One interviewee noted the importance of checking whether the respondent has died before sending out personal surveys so as not to upset the spouse or family member of a deceased person. They noted that this can be an administrative burden and can slow survey distribution.
Interviewees suggested marketing techniques to maximise reach and raise awareness of the survey. Suggestions included using newspapers, social media, community engagement, councils, volunteer organisations, advertising at football matches, and via radio stations. One interviewee suggested marketing the survey months before it comes out, emphasising its value and the difference it will make.
Examples from the evidence review: Most surveys reviewed were invitational (i.e. they were sent to a selected sample rather than being made available more widely). A range of techniques were used to recruit respondents – such as letters being sent to a selected sample, surveys being handed out to people at service settings, or through phone calls and emails. For example:
- CQC Community Mental Health Survey: Materials were sent to NHS Trusts in England, which posted information to service users' home addresses with a URL to the online survey. This was followed by two postal and two SMS reminders.
- Experiences of Mental Health Care in Wales: The survey was administered through an open link distributed by social media.
- The Scottish Government’s HACE Survey: The survey was posted directly to respondents with an online link, and it noted that a paper survey would be sent with a reminder.
- Mental Health and Substance Use Short-Stay Inpatient Experiences Survey: Staff engagement efforts were important in this survey, carried out by the British Columbia Provincial Government. Service users were invited to take part 24 hours before discharge and, if they consented, were handed a survey to complete onsite or offsite after discharge. Staff encouraged onsite paper completion to maximise response rates.
Incentives
A few interviewees commented on incentivising respondents. Two had used incentives of £5 to £10 in their surveys, one of which targeted groups with low response rates. While incentives may be beneficial if the survey is long, their use would depend on the available budget.
Examples from the evidence review: Only a few examples of incentives being offered were identified among the surveys reviewed.
- The British Columbia Provincial Government’s Mental Health and Substance Use Short-Stay Inpatient Experiences Survey offered a $5 gift card to service users who completed the survey before leaving the facility.
- The Oregon Health Authority’s 2022 MHSIP Adult and Youth surveys offered a $10 gift card (Amazon or Starbucks) for online completion.
- The New Hampshire State Government’s Community Mental Health Centre Client Satisfaction Survey offered a $5 incentive for completion.
- The Scottish Government’s Scottish Health Survey offers participants a £10 Love2Shop voucher. This is the only Scottish survey identified in the evidence review that used an incentive.
Contact
Email: socialresearch@gov.scot