Developing a Mental Health Experience of Services (MHES) Survey for Scotland

This report explores and summarises the requirements for, and options available for, a National Mental Health Experience of Services (MHES) survey. The MHES would gather regular data on service user experiences and help inform standards measurement, service improvement, policy development and NHS reform.


Chapter Six: Ethical considerations, accessibility, and data governance

This chapter outlines views on ethical issues, how these considerations can be managed, and privacy and data governance concerns[2].

Chapter summary

Key Findings and Further Considerations

  • Participants should be provided with clear, upfront information covering:
      • assurance that participation will not negatively affect their care
      • their ability to opt out at any time
      • the purpose of the survey
      • situations where confidentiality does not apply
  • Challenges around informed consent may arise, particularly for individuals who are unwell and for children, if they are included in the survey scope.
  • The survey should be available in multiple formats and languages to support accessibility and ensure diverse voices are included.
  • Face-to-face interviews and collaboration with community organisations may help engage seldom-heard groups. However, these approaches are more resource-intensive.
  • Several actions can help minimise unintended harm and protect confidentiality, such as:
      • avoiding sensitive information on survey invitations
      • not reporting data from small service user populations
  • Data protection measures must be clearly defined, including:
      • secure data storage
      • de-identification of responses
      • inclusion of a privacy notice outlining these practices
  • Ethical approval and data governance processes may require significant staff time, especially if personal or health data is collected or used.

Ethical concerns and mitigations

Numerous ethical considerations were raised, with views on each of the following presented below:

  • Risks of unintended harm and respondent capacity
  • Ethical approval requirements
  • Consent and feedback

Risks of unintended harm

The risk of unintended harm to survey respondents was raised by several interviewees. Concerns included:

  • Distress caused by asking participants to reflect on difficult experiences.
  • The vulnerability of some service users given their poor mental health.
  • The potential for participants to think their survey responses could negatively affect their own care.

Some people may feel they are being labelled or stereotyped if the survey focuses too specifically on their diagnosis. Suggested mitigations included:

  • The need for sensitivity in conducting the survey and providing support for respondents if required.
  • Careful use of clinical terms and awareness that individuals may use different types of language to define and describe their mental health condition.
  • Survey completion instructions should clearly state that participation is optional and that participants can stop if they feel any distress while completing the survey.
  • Early-stage piloting and testing of the survey to ensure cognitive clarity and that the language is sensitive and appropriate.

When considering if there was a risk in not developing a MHES survey, interviewees typically stressed the need for the survey given the lack of existing data and understanding. However, a few respondents to the TLB survey suggested that resources for developing a survey could be better spent on services.

Respondents’ capacity to complete the survey was also raised. It was noted that some respondents may need support to express their views. Example groups included those with learning disabilities, low levels of literacy or, if the survey is solely online, those with limited access or capability with digital tools.

Ethical approval requirements

Interviewees discussed the importance of a full ethics review during the survey development process. Issues likely to be considered in such a review included safeguarding and wellbeing for both participants and researchers, the need to provide signposting and support, and the importance of recognising the wide range of mental health conditions and how they may affect a service user’s ability to complete the survey. One interviewee also noted that any data-sharing agreements would need to be included within ethics applications.

Examples from the evidence review: UK examples of surveys that received ethical approval included the Scottish Government’s Scottish Health Survey, which received ethics approval from the Health and Care Research Ethics Committee for Wales, and the Maternity Care Survey, which received PBPP approval to use existing records to identify their survey sample.

Consent and feedback

Processes for gaining informed consent to participate in a MHES survey were also discussed. Some interviewees stressed the need for survey participants to have assurances that responses are anonymous and will not negatively affect future treatment. Explaining instances when confidentiality would not be upheld, for example, in relation to a disclosure which indicated a potential risk of harm to self or others, was also raised. Challenges in gaining consent from children, if included in the survey scope, and the potential role of parents in a consent process were highlighted, as were challenges around gaining consent from those who are particularly unwell.

Examples from the evidence review: The collection of informed and voluntary consent, with responses anonymised and kept confidential, was mentioned in the YES Survey run by the Australian Institute of Health and Welfare, the Delaware State Government’s MHSIP Adult Consumer Survey and the New Hampshire State Government’s Community Mental Health Centre Client Satisfaction Survey.

Interviewees also reflected on the ethical use of the data collected. They stressed that data collection must have a clearly defined purpose that is communicated to survey respondents, that respondents are given opportunities to see the results, and that data is used in the way agreed to. One interviewee raised the possible risk of cynicism, burden, and survey fatigue among survey participants; they felt it was important to ensure that respondents believe they will be listened to and have their views considered.

Examples from the evidence review: The Experiences of Mental Health Care in Wales (Healthcare Inspectorate Wales and NHS Wales), Listening and Learning: Patient Satisfaction for Gibraltar Health Authority Mental Health Services (Government of Gibraltar), and the YES Survey (Australian Institute of Health and Welfare) all identified the risk that the same person could complete the survey multiple times. In the Listening and Learning study, it was also noted that people outside the intended group may have been able to access the survey link, as it was openly available.

The Community Mental Health Survey by the CQC in England reported a reduced risk of social desirability bias because participants completed the survey themselves rather than giving feedback to staff. To address potential recall bias, the sample included those who had been in contact with community mental health services at least twice in the last 12 months, including at least once in the sampling period. Prompt mailouts of the questionnaire could also be sent. Validation rules added to the online questionnaire helped to ensure the survey was completed correctly. The risk of non-response bias, i.e. that those who respond differ from those who choose not to respond, was assessed by comparing demographic variables, such as age and ethnicity, between responders and non-responders.
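The comparison of responders and non-responders described above can be illustrated with a minimal sketch. This is a hypothetical example, not the CQC's actual method: the sample data, the `age_band` field and the `age_band_shares` helper are all invented for illustration, and a real analysis would use formal statistical tests rather than eyeballing shares.

```python
# Illustrative sketch (hypothetical data): comparing the demographic profile
# of responders and non-responders as a rough proxy for non-response bias.
from collections import Counter

def age_band_shares(records):
    """Return the proportion of each age band in a list of records."""
    counts = Counter(r["age_band"] for r in records)
    total = sum(counts.values())
    return {band: n / total for band, n in counts.items()}

# Hypothetical sampled population, flagged by response status.
sample = [
    {"age_band": "16-34", "responded": False},
    {"age_band": "16-34", "responded": True},
    {"age_band": "35-64", "responded": True},
    {"age_band": "35-64", "responded": True},
    {"age_band": "65+", "responded": True},
    {"age_band": "65+", "responded": False},
]

responders = [r for r in sample if r["responded"]]
non_responders = [r for r in sample if not r["responded"]]

# Large gaps between the two profiles would suggest non-response bias.
for band in sorted({r["age_band"] for r in sample}):
    r_share = age_band_shares(responders).get(band, 0.0)
    n_share = age_band_shares(non_responders).get(band, 0.0)
    print(f"{band}: responders {r_share:.0%}, non-responders {n_share:.0%}")
```

In practice such shares would be computed from the sampling frame's administrative records, since demographic data for non-responders is only available there.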

Accessibility and minimisation of barriers

Ensuring accessibility and capturing a range of voices in a MHES survey was emphasised in the interviews and TLB survey. This includes not only a diverse sample, e.g. by geography and ethnicity, but also groups such as disabled people, people experiencing more severe mental health conditions and people in poverty.

A recurring suggestion to maximise accessibility of a MHES survey was the need for multiple survey formats; in particular, making paper copies available if the main delivery mode is online. Other considerations included:

  • providing the survey in different languages, with interpretation services
  • telephone or face-to-face support for survey completion
  • British Sign Language interpretation
  • options to complete the survey over multiple sessions
  • prepaid return envelopes for postal responses
  • large print, braille, easy read and audio versions
  • easy to use online survey design, using plain language and a set font size and spacing

Considerations around survey length, question complexity, and comprehension were also raised. The need to avoid barriers such as a requirement to set a password or complex registration processes was highlighted. Potential cultural barriers to engagement were also described, such as some groups being fearful of sharing their personal information and cultural barriers to using technologies. Understanding cultural and language differences and clearly explaining the reason for the survey and why questions are asked were suggested to support overcoming these potential barriers.

To increase buy-in and engagement with a MHES survey, participants emphasised the need to communicate to participants and service providers that the survey data will be valuable and useful. A small number of participants raised the possibility of offering reimbursements to encourage completion and mitigate any financial impact (for example, loss of working hours) from participation.

Potential concerns with participant reach were also noted. For example, older people may be less likely to complete an online survey, and young people and people in deprived areas may have low response rates. Potential mitigations included face-to-face interviews and work with community organisations to engage seldom-heard groups. However, participants noted that targeted efforts to engage specific groups will be more resource-intensive.

Examples from the evidence review: In the surveys reviewed, a range of options were provided for people to complete them. These included multiple language options (such as translated information sheets giving access to translated surveys), varying from English plus one additional language to 18 additional languages. Helplines to support survey completion, provide translation services, answer questions and handle complaints were also identified. A few surveys provided accessible formats, such as easy read, large print, braille, British Sign Language, and screen reader compatibility. Freepost returns were also provided for postal surveys.

Data privacy and governance

A key data privacy risk is the unintended disclosure of a person’s mental health or service use as a result of their participation in a MHES survey. This was raised in both the TLB survey and interviews and is primarily related to a postal survey that may be opened by other members of the household and cause distress to the respondent as a result of unintended disclosure. The need to ensure that no sensitive information about a service or condition is visible on an envelope or survey invite was emphasised.

Another identified risk was the potential for anonymised data to still be identifiable. This was noted as a particular risk in small service user populations, for example, in a local service where staff could recognise a service user by their style of writing or other information, even if the data is anonymised. Interviewees noted the challenge of gathering local data to inform service-level improvements while protecting participant anonymity.
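One common way of handling the small-population risk described above is to suppress any published figure based on fewer respondents than a minimum threshold before results are released. The sketch below is illustrative only: the threshold of 10, the `suppress_small_counts` function and the example figures are assumptions, not a documented MHES rule.

```python
# Illustrative sketch (hypothetical threshold): suppressing service-level
# results drawn from small respondent numbers to reduce re-identification risk.
MIN_CELL_SIZE = 10  # assumed minimum respondent count per published figure

def suppress_small_counts(results, min_n=MIN_CELL_SIZE):
    """Replace any service-level result based on fewer than min_n respondents."""
    return {
        service: (stats if stats["respondents"] >= min_n else "suppressed")
        for service, stats in results.items()
    }

# Hypothetical service-level results.
results = {
    "Service A": {"respondents": 42, "positive_pct": 81},
    "Service B": {"respondents": 6, "positive_pct": 50},  # too few to publish
}
print(suppress_small_counts(results))
```

A real disclosure control regime would also consider secondary suppression (so suppressed cells cannot be recovered from published totals), which this sketch omits.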

Examples from the evidence review: To preserve anonymity, the YES Survey by the Australian Institute of Health and Welfare did not collect personal identifiers (such as name and date of birth), and the Scottish Government’s Maternity Care Survey removed disclosive information when entering data, followed by quality checks.

Data governance was noted as a complex consideration in health care surveys. Issues raised included the need to follow the governance principles of the commissioning organisation, have robust data storage and management processes in place, ensure data protection and confidentiality, and the need to clearly communicate all these elements within privacy notices that accompany a MHES survey. Interviewees noted that in Scotland mental health data is classified as extremely sensitive and requires an extra level of care and consideration.

One interviewee observed that the work required for data governance should not be underestimated, noting the need for approvals, determining how data will be shared, data protection impact assessments, equality impact assessments, and privacy notices. They stated that getting this in place, along with approvals, could involve more work than the survey itself, also noting greater challenges with sensitive data.

If an existing data source is used to select the sample, ensuring that any contact with participants adheres to the privacy notice of the data collection is essential. The need to provide guidelines and training to staff for data collection and extraction was also emphasised. Consideration of the length of time the data is held was also suggested in the TLB survey.

Examples from the evidence review: The Scottish Government’s HACE survey, the 2022 MHSIP Adult Survey by the Oregon Health Authority, the YES Survey by the Australian Institute of Health and Welfare, and the Scottish Government’s Maternity Care Survey 2018 all collected, accessed and stored data securely, on servers or in physical locations depending on the mode of survey delivery. Confidentiality was upheld and data storage aligned with the privacy notices given to participants.

The YES Survey destroyed personal information once invitations were sent, the Maternity Care Survey destroyed personal data and survey data after fieldwork completion, and HACE destroyed personal data at the end of fieldwork while retaining survey data for six months after fieldwork.

Approval and compliance were identified for some studies, including compliance with the Data Protection Act 2018 for the Community Mental Health Survey, and PBPP approval and General Data Protection Regulation compliance for the Maternity Care Survey and Scottish Cancer Patient Experience Survey.

Contact

Email: socialresearch@gov.scot
