Developing a Mental Health Experience of Services (MHES) Survey for Scotland

This report explores and summarises the requirements for, and options for delivering, a National Mental Health Experience of Services (MHES) survey. The MHES would gather regular data on service user experiences and help inform standards measurement, service improvement, policy development and NHS reform.


Chapter Seven: Data analysis, reporting and dissemination

Interviewees and TLB survey respondents were asked about how data from a MHES survey could be used, desired formats and levels of reporting, and possible approaches to disseminating and publishing survey results. Perspectives differed by the type of data user, and a range of potential approaches was identified in the evidence review.

Chapter summary

Key Findings and Further Considerations

  • Local or service-level data was considered most useful for driving service improvements and meeting the needs of a wide range of stakeholders.
  • The number of potential reporting variables in the survey data should be considered in advance to ensure robust sample sizes at the local or service level. Transparency about the level of reporting that will be possible is important.
  • Services should be supported to understand and make effective use of service-level data.
  • Tools or systems should be included to help identify significant changes or challenges in service delivery, and to suggest possible responses.
  • Results should be published at least at the national level to support transparency and meet stakeholder expectations.
  • Publication must align with ethical considerations, including:
      ◦ De-identification and anonymisation of data
      ◦ Ethical use of data given the effort and resources required to collect it
      ◦ Using data in the way described to participants
  • All parties should be clearly informed about how the data will be used, including:
      ◦ Benefits to survey participants and service users
      ◦ Improvements to mental health services
      ◦ Support for funding decisions
      ◦ Addressing service gaps
      ◦ Informing government policy and planning
  • Stakeholders may require results in different formats, such as:
      ◦ Written reports
      ◦ Dashboards
      ◦ Raw data
  • While producing multiple outputs may increase costs, focusing on national and local reports would meet many users’ needs.
  • Publishing results at a local or service level raises risks of respondent identification and the creation of service “league tables,” which could undermine trust in the survey among service users.

Survey reporting and dissemination

TLB survey respondents were asked what they thought would be the most useful format for reporting on the results from a MHES survey and could select up to three options. There was a clear preference for local results (64%), with national reports (45%) and interactive dashboards (44%) also considered useful. Considered less useful were reports with bespoke analysis aligned to needs (31%) and provision of anonymised raw data to enable local analysis (30%).

Figure 7. Q14: What format would survey results be most useful for you? Please select up to 3. Base = 104.
Figure 7 is a vertical bar chart showing responses to Question 14: “What format would survey results be most useful for you?” (Base = 104). The most selected option was “Local results report” at 64%, followed by “National results report” (45%) and “Interactive dashboard” (44%). “Bespoke analysis report aligned to needs” was chosen by 31%, and “Anonymised raw data for local analysis” by 30%. Smaller proportions selected “Don’t know” (9%) and “Other” (2%). Respondents could select up to three options.

Data users and uses

Key data users are likely to include individual services, policymakers, mental health charities and advocacy groups, funders, and decision-makers across mental health provider organisations and at various levels of government.

Interviewees felt that to encourage engagement in the survey, the purpose and value of the data must be communicated to respondents, and for credibility, the data must be used for the purposes described. Lived experience interviewees stressed the need to share data and reports with survey participants and to make data publicly available to provide transparency, whatever the results.

Examples from the evidence review: The evidence review identified a range of data user stakeholders, including regulators (e.g., the CQC in the Community Mental Health Survey and the 2023 Maternity Survey), NHS trusts and commissioners, NHS Scotland and NHS England, the Scottish Government and the Department of Health and Social Care, and mental health care providers.

Possibilities for data use depend on how much useful information is gathered from a MHES survey. There was an overall view that significant time, effort and resources will be spent on creating the survey and on subsequent data collection, and that any data and insight generated from the survey must be used effectively to ensure there is a return on the resources and time allocated to it.

The need for a MHES survey was repeatedly emphasised by interviewees, who reflected that data and evidence about mental health service user experiences are currently unavailable on a national scale. The potential for a new survey to capture and analyse data about positive and negative experiences, service pathways and access to holistic support such as outpatient care was highlighted. Interviewees also described the possibility of using the survey data to identify poor service that may create risks or harm to service users, with procedures in place to flag any concerns and ensure they are addressed quickly.

Examples from the evidence review: Improving services for those who use them was identified as the key data use across the surveys included in the evidence review. Specific uses depended on the type of data user. For individual services, survey data was used to gain greater insight into service user experiences and contribute to continuous improvement. For those at decision-making levels, key uses of survey data were to hold services to account, measure outcomes against frameworks and strategies, identify service gaps and compare data across years.

Some interviewees expressed concern that any negative results could cause tension within and between services or have a negative impact on services. They suggested the survey should strive to achieve an accurate and balanced picture of the full range of experiences, including ‘middle voices’, i.e. those who have neither extremely positive nor negative views.

One interviewee suggested that universities should be given access to de-identified data to conduct their own, more detailed analyses. They observed that universities have the skills to analyse large volumes of data and could potentially link MHES survey data to other existing data sets.

One interviewee highlighted the need to ensure that, once the survey results are available, there is a plan for findings to be embedded within governance systems, processes, planning and local discussions.

When to access data

TLB survey respondents and five interviewees, four of whom were data users, were asked how long after collection they would like to be able to access the data. A few interviewees stated they would want access to the data straight away or as soon as possible. One quarter (25%) of TLB survey respondents felt that access to the data within a month would be useful, 44% selected one to three months, and 25% selected three to six months. A live dashboard was also suggested, and there were individual comments proposing that the data be available continuously or annually.

Interviewees highlighted the need to balance the resources required to make the data available with the importance of ensuring the data is recent and reliable. For example, it was suggested that releasing the data between three months and one year after the survey would allow enough time for thorough analysis, while still being recent enough to support timely responses to any issues raised. This was noted as being more important for local-level data, if the survey results are reported at that level.

When asked, ‘Is there a time of year in which having access to the results of the survey would be most beneficial?’, half (50%) of TLB survey respondents said no, 21% said yes, and 28% did not know.

Breakdown of data

Data users noted their desire to access as much data from a MHES survey as possible and had different preferences on the type and extent of data breakdown. There were calls for data to be available by demographic group and service type. For example, service providers were expected to want to access data at their individual service/area level, in addition to a national level. It was suggested that service providers may need full or partial support with the analysis at this level, either to reduce staff burden or because they may not have the skills or resources to do it themselves.

There was no clear consensus among TLB survey respondents on the most useful geographical-level breakdown of MHES survey data. Just over half (54%) felt service-level data would be useful, 50% at the NHS Board level, 42% at the HSCP level, 34% at the national level, and 3% selected ‘other’. While there was no consensus, this demonstrates a greater desire for more granular detail than national-level results.

Various considerations for reporting on demographic variables were described. These included differences linked to protected characteristics and socioeconomic status, for example, analysing the results using the Scottish Index of Multiple Deprivation, health outcomes and service usage. There were mixed views on the potential or need to analyse by other variables, such as homelessness data by local authority, name, date of birth, health board or CHI number.

While interviewees supported making the data publicly available, some raised concerns about publishing findings at regional or local levels. For example, one interviewee noted that creating league tables would not be helpful, while others highlighted risks such as data being skewed or individuals or services being identified from very small datasets. However, the benefits of using localised data to improve individual services were also recognised. These discussions included calls to carefully consider what data should be used internally versus what should be reported externally.

Examples from the evidence review: Of the surveys in the review where information on reporting was available, 16 reported at a national/state/province level, 13 at a local/district/board/trust level and 7 at an individual setting level.

For example, the Community Mental Health Survey by the CQC has an overall summary report, a report exploring variations in results by NHS Trust, and a benchmark report provided to each NHS Trust. The CQC Adult Inpatient Survey in 2023 published reports, open data and benchmark reports for each individual trust. The Scottish Government’s HACE Survey has produced a national report, interactive dashboards showing local-level results and trends, and a spreadsheet showing results at a national level and broken down by health board and GP cluster level.

Examples of data reporting, dissemination and uses

Interviewees who had conducted service user experience surveys detailed their experiences of data dissemination and reporting. One interviewee released a national report with key findings, which was supported by an Excel dashboard allowing for further data breakdown. In a time-limited evaluation, one interviewee produced a report alongside tailored PowerPoint presentations to key stakeholder groups, such as noting areas of unmet need for policy teams. For a healthcare survey, regular reports were produced and shared with government departments, health boards and relevant charities, using the findings for policy, strategy and future planning.

Contact

Email: socialresearch@gov.scot
