Extended Distress Brief Intervention Programme: evaluation

This evaluation covers the period from May to December 2020 and focuses on the extended DBI programme. It provides insight into the effectiveness of the DBI service during the COVID-19 pandemic.

3 Methods

The evaluation ran from May to December 2020 and was based on information from telephone interviews with DBI recipients and practitioners, and routinely collected DBI data.

3.1 Recruitment and sampling

We conducted individual semi-structured telephone interviews with three groups of participants: individuals who received a Level 2 DBI intervention (n=20), Level 1 practitioners (n=20) and Level 2 practitioners (n=19). (We initially conducted 20 interviews with Level 2 practitioners, but poor sound quality rendered one recording inaudible.)

The DBI services helped us recruit participants for the evaluation. At the end of Level 2, DBI practitioners gave individuals information about the evaluation and asked them to contact the evaluation team if they wished to be interviewed. The DBI leads within the NHS24 Mental Health Hub and Level 2 services shared invitations to participate in the evaluation with all staff who managed and provided DBI interventions.

We used a sampling framework to ensure that a broadly equal number of practitioner interviews were undertaken across all third sector agencies (see Table 1) and their relevant geographical areas.

Individuals who were supported by a Level 2 service during the evaluation period were invited to participate in a semi-structured interview. While the sampling strategy was largely convenience-based, we endeavoured to ensure there was a range of participation from different NHS Health Board areas and across the different Level 2 agencies.

Table 1. Overview of interviews conducted

Participant category                     Interviews conducted
Individuals using the DBI service        20
NHS24, Level 1 practitioners             20
Total Level 2 provider interviews        19
  LAMH                                   2
  Lifelink                               3
  Penumbra                               4
  SAMH                                   4
  Support in Mind                        3
  TRFS                                   3

3.2 Data collection

3.2.1 Qualitative data collection and analysis

Interviews with people who had received DBI explored the perceived impact of DBI on their levels of distress, the acceptability of delivering DBI over the telephone and virtually using the Near Me video consulting service, their views of what worked well, and any suggestions for how the experience of DBI could be improved at each stage. Practitioner interviews explored key issues that might affect successful implementation of DBI, including training, referrals, staffing and resources, and the challenges of, and adaptations to, local delivery within each context.

Given physical distancing restrictions and the requirement to work from home, all interviews were conducted by telephone (n=58) or Microsoft Teams (n=1); participants were offered a choice between the two, and the majority preferred the telephone. Interviews were audio-recorded (with permission), transcribed and entered into a qualitative data analysis package (QSR NVivo v12) for analysis.

Qualitative data were analysed by two researchers using a structured approach involving multiple close readings of all interview transcripts and coding of text against a structured framework. We used a case study approach (Yin 2013) and drew on techniques of framework analysis (Ritchie & Spencer 2002). Analysis was guided by the Consolidated Framework for Implementation Research (Keith et al. 2017), which identifies key factors that contribute to successful or unsuccessful programme implementation, including acceptability, characteristics that facilitate effectiveness and suggestions for improvement. The framework analysis linked closely to the evaluation objectives, especially the impact of DBI on individuals' distress, as well as broader questions around the process and delivery of the intervention.

We use quotes throughout the report to illustrate points made. All quotes are anonymised and assigned an identifier so that no one is directly identifiable. Identifiers labelled L1 come from Level 1 providers, and those labelled L2 from Level 2 providers. Individuals using the service were assigned a unique six-digit number followed by the letter U (identifying them as a user of the DBI service). The pilot sites were also given identifiers to minimise the chances of their being identifiable.
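To make the labelling scheme concrete, the sketch below shows one way such identifiers could be generated. It is illustrative only: the function name and the use of random six-digit numbers are our assumptions, not a description of the evaluation team's actual tooling.

import secrets

def assign_identifier(role, used_ids):
    """Assign an anonymised quote identifier (illustrative sketch only).

    Practitioner quotes are labelled by level; service users receive a
    unique six-digit number followed by 'U', as described above.
    """
    if role == "level1":
        return "L1"
    if role == "level2":
        return "L2"
    # Service user: draw random six-digit numbers until an unused one is found.
    while True:
        candidate = str(secrets.randbelow(900000) + 100000) + "U"
        if candidate not in used_ids:
            used_ids.add(candidate)
            return candidate

used_ids = set()
print(assign_identifier("user", used_ids))  # e.g. '482917U'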

3.2.2 Quantitative data and analysis

The evaluation team analysed aggregate, routinely collected DBI data for the period May to December 2020, provided by Public Health Scotland. These data included information about the people who accessed the DBI service, where they live and the relative deprivation of that area, calculated using the Scottish Index of Multiple Deprivation (SIMD), a relative measure of deprivation based on the geographical area in which an individual lives (quintile 1 = most deprived, quintile 5 = least deprived). Gender is reported as binary (male or female) because that is how the data were provided by Public Health Scotland.

Participant distress was measured using the Distress Thermometer, a 10-point scale on which 0 = no distress and 10 = extreme distress. DBI practitioners used the Distress Thermometer with participants at three time points: at Level 1, and at the start and end of the Level 2 intervention. Further routine data collected by DBI Level 2 practitioners included the length and intensity of individuals' involvement in their Level 2 intervention, the impact of DBI on individuals' level of distress, and individuals' views on the Level 1 and Level 2 service received. Quantitative data were analysed in MS Excel using descriptive statistics and cross-tabulations. Where data were categorised, analysis across categories was performed to explore and better understand the role of different factors in the findings; the data presented in Appendix 1 highlight key contributory factors that help explain the study's findings. A sketch of this kind of analysis is shown below.
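The sketch below illustrates, in Python rather than Excel, the kind of descriptive and cross-tab analysis described above. It is a minimal example under stated assumptions: the file name and column names are hypothetical, and the real aggregate extract from Public Health Scotland may be structured quite differently.

import pandas as pd

# Illustrative sketch only: the file and column names are assumptions,
# not the actual Public Health Scotland extract.
agg = pd.read_csv("dbi_aggregate_may_dec_2020.csv")
# Assumed columns: simd_quintile (1 = most deprived, 5 = least deprived),
# gender ('male'/'female'), n_individuals, and mean Distress Thermometer
# scores at the three time points (distress_level1, distress_l2_start,
# distress_l2_end).

# Descriptive statistics: percentage of individuals in each SIMD quintile.
by_simd = agg.groupby("simd_quintile")["n_individuals"].sum()
print((by_simd / by_simd.sum() * 100).round(1))

# Cross-tab analysis: counts of individuals by SIMD quintile and gender.
print(agg.pivot_table(index="simd_quintile", columns="gender",
                      values="n_individuals", aggfunc="sum"))

# Mean distress at each time point, weighted by the number of individuals
# each aggregate row represents.
for col in ["distress_level1", "distress_l2_start", "distress_l2_end"]:
    mean = (agg[col] * agg["n_individuals"]).sum() / agg["n_individuals"].sum()
    print(col, round(mean, 1))

In practice the evaluation used MS Excel for these steps; the pandas version is given only to make the individual analysis steps explicit.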

3.3 Ethics and data protection

We collected all interview data following informed consent, and protocols were developed to support individuals should they become distressed during the interview. All study documentation (including the evaluation protocol describing how the evaluation was conducted, interview topic guides, information sheets and consent forms) was approved by the University of Stirling General University Ethics Committee. Appropriate data release forms for the aggregate data were approved by the Data Protection Team at Public Health Scotland. As noted above, we have also taken steps to ensure participant anonymity in this report.

3.4 Strengths and limitations

In this rapid 8-month evaluation, 59 qualitative interviews were conducted with DBI providers and individuals, and high-level aggregate data were collected on use of the service. Because only aggregate data were available, we were unable to re-categorise age groups to remove those under 18 years from our analysis. Unlike the main DBI pilot evaluation, no cost-consequence analysis was performed, and the study had access only to distress data rather than the further outcome data gathered in the main evaluation. Beyond the limitations arising from the nature of the data collected, there are several other limiting factors to consider.

The changing nature of COVID-19 related restrictions and variations[3] in concern about the virus meant that the impact of COVID-19 was not uniform, either across individuals receiving DBI or across the timeframe of this evaluation. Qualitative data were collected between August and December 2020, when restrictions across regions in Scotland varied over time. An individual interviewed in August 2020 may therefore have experienced the impact of the pandemic quite differently from one interviewed in December 2020.

During the evaluation, the NHS24 Mental Health Hub rapidly expanded its staff. Two in five (40%) of the practitioners interviewed were new (less than 6 months in post) and so were interviewed while still being inducted and gaining confidence in their role. Consequently, in many Level 1 practitioner interviews it was not possible to compare Level 1 provision with the support NHS24 provided before the introduction of DBI.

In the main DBI pilot evaluation, interviews with individuals who had used the service were conducted at least 3 months after their last DBI intervention. The short timescale of this evaluation made this impossible, so interviews were conducted immediately post-intervention. This had the benefit that interviewees' experience was fresh in their minds, and they could recall details of their interactions and their views on the support they received. However, it also meant that some individuals were still experiencing a degree of distress or anxiety, and the lack of time between completing DBI and participating in the evaluation interview made it impossible to assess the extent to which the telehealth DBI approach supported people to deal with future distress. Finally, because only aggregate service-level data were available to the evaluation team, the analysis possible was limited: for example, we could not assess the impact of the intervention at an individual level or analyse the effect of different factors (e.g. age, SIMD quintile) on the effectiveness of the intervention.

Contact

Email: socialresearch@scotland.gsi.gov.uk
