Distress Brief Intervention pilot programme: evaluation

This report presents a realist evaluation of the Distress Brief Intervention (DBI) programme. DBI has been successful in offering support to those in distress, and has contributed to people's ability to manage and reduce their distress in the short term and, for some, in the longer term.

3 Methodology and research methods

This mixed-method evaluation of DBI combined analysis of administrative DBI data; quantitative pre-, post- and follow-up surveys with individuals who received a DBI; qualitative research with DBI practitioners and individuals who accessed DBI; and an economic evaluation. We adopted a realist evaluation approach (Pawson & Tilley 1997) to explore both the way that the intervention was delivered and understood and the extent to which it worked as intended (Cresswell et al. 2011). Our research design was chosen to fit the specification criteria of the Scottish Government DBI evaluation tender, which limited some of the questions we were able to answer. The Evaluability Assessment identified that establishing a meaningful control group for a DBI study would be highly challenging, and potentially unethical (NHS Scotland 2017), so a controlled trial of DBI was not viewed as feasible. Consequently, while the selected realist evaluation design enabled an in-depth evaluation, it limited the questions and scope of the evaluation in describing effectiveness and in the health economic analysis.

Analysis of the service usage of individuals referred to the DBI programme was not possible, as it was outwith the scope of the funded evaluability study. This means we are unable to draw any firm conclusions on what impact the DBI programme has had on NHS service usage. We also surveyed agencies to which individuals who completed DBI had been referred but, despite sending reminders, gained very few responses. At the beginning of the evaluation, we piloted a novel mobile phone method of collecting data from individuals who had refused referral to DBI Level 2, but did not progress with this arm of data collection as we were unable to recruit any respondents.

The evaluation team worked collaboratively with DBI sites and the DBI Central Team to inform the development of procedures to:

  • Identify and access existing data collection and reporting processes.
  • Map out core elements of the DBI service system and regional variations.
  • Gather DBI practitioners' and managers' views on their needs and expectations of the evaluation.
  • Agree on processes and tools for evaluation data collection.

Throughout the evaluation, the study team met together, and with DBI practitioners, managers and stakeholders, at DBI Gatherings, DBI Programme Board meetings and the DBI Level 2 Providers Forum (see Glossary of Terms) to share insights from our analysis.

We gathered data from the following sources:

  • DBI Level 1 and 2 practitioners (on training and implementation)
  • Individuals accessing DBI (on experience and impact)
  • DBI routine activity data (on individual characteristics and service use)
  • Agencies referred to by DBI (on appropriateness and outcomes of referrals and impact on services)

Evaluation data collection began on 1st January 2019 and was planned to continue until 30th May 2020. Data collection from service users was suspended in April 2020 due to COVID-19 restrictions.

3.1 Qualitative data collection

We collected data through semi-structured, face-to-face or telephone interviews and face-to-face focus groups (Appendix 1). Participants for staff focus groups and interviews were selected according to a convenience sampling framework, in which we endeavoured to recruit similar numbers of participants according to their role and geographical location. We were unable to sample further by gender or age due to low numbers of eligible participants and low levels of agreement to participate. The breakdown of participants is presented in Table 3.1. Despite our efforts to interview similar numbers of participants by role, Police Scotland staff are overrepresented and Primary Care professionals underrepresented.

We also collected qualitative data from open-ended questions in the surveys (described below in the section on quantitative data collection).

Table 3.1: Qualitative interview and focus group overview

Group: Level 1 Frontline Service Practitioners
  • No. of participants: 43 (37 people in 8 focus groups; 6 individual interviews)
  • No. by role: SAS: 4; A&E/MH Crisis Teams: 18; Primary Care: 4; Police: 17
  • No. by site: Grampian: 7; Highland: 17; Lanarkshire: 14; Borders: 5

Group: Level 2 Practitioners
  • No. of participants: 26 (individual interviews)
  • No. by role: LAMH/Lifelink/TRF: 14; Penumbra: 3; SAMH: 4; Support in Mind: 5
  • No. by site: Borders: 4; Grampian: 3; Highland: 5; Lanarkshire: 14

Group: Individuals referred to DBI
  • No. of participants: 19 (individual interviews)
  • No. by role: N/A
  • No. by site: Borders: 2; Grampian: 3; Highland: 4; Lanarkshire: 10

Group: Service leads
  • No. of participants: 7 (individual interviews)
  • No. by role: NHS: 4; Police: 1; 3rd Sector: 2
  • No. by site: N/A

3.1.1 Interviews and focus groups with professionals

We held interviews and focus groups with a wide range of practitioners involved in delivering DBI, including representatives from all Level 1 services, Level 2 practitioners and DBI service leads (national and local DBI service managers). A small number of interviews were conducted by telephone or face-to-face where this was more convenient or appropriate for the practitioners involved. Interviews and focus groups explored key issues that might affect successful implementation of DBI, including training, referrals, staffing and resources, and the challenges and adaptations to local delivery within each context.

3.1.2 Interviews with individuals accessing DBI

Individuals who participated in DBI and had been referred to Level 2 services took part in telephone interviews (n=19) between November 2019 and March 2020. These interviews explored their experience of DBI from the incident that led to referral at Level 1 through to their experiences of Level 2 and the referral process. Information was sought on the perceived impact of DBI on distress, interaction with professionals and the participants' views of what worked, as well as any suggestions for how the experience of DBI could be improved at each stage.

3.2 Quantitative data collection

The data collection tools were designed in consultation with DBI practitioners to ensure they were brief, appropriate and would not interfere with DBI practice. The evaluation team worked with DBI Central to ensure that no data were collected twice. Details of data collection tools can be found in Appendix 2.

3.2.1 Data collection from individuals using DBI

The quantitative data collection captured the experience of the DBI programme and its impact (at DBI Levels 1 and 2, repeated at 3 months following the end of Level 2), considering individual characteristics and circumstances as well as other demographic, geographic and service-based contextual factors. We collected survey data from individuals between 1st January 2019 and 31st March 2020 via paper or online surveys, as people moved through the DBI pathway. The surveys covered their experience of accessing and using the DBI service and the impact of this on them, using questions specifically designed for this study and validated tools. Further information on the outcome measures (the Distress Thermometer (Mitchell, 2007), CORE-OM 5 (Evans et al., 2002) and CARE Measure (Mercer et al., 2004)) referenced throughout this report is provided in Appendix B. We linked the survey data for each individual to their routine DBI data collected by DBI practitioners. The linkage created a rich dataset that enabled a detailed analysis of the factors contributing to individual outcomes.
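
As a rough illustration of the linkage step, the sketch below joins a survey extract to routine DBI records on a shared pseudonymised identifier. All file, column and identifier names here are hypothetical; the actual data items are described in Appendix 2.

```python
import pandas as pd

# Hypothetical survey extract; real variable names differ (see Appendix 2).
surveys = pd.DataFrame({
    "dbi_id": ["P001", "P002", "P003"],       # pseudonymised participant ID
    "distress_thermometer_final": [3, 5, 2],  # 0-10 scale at final Level 2 session
})

# Hypothetical routine DBI data captured by practitioners.
routine = pd.DataFrame({
    "dbi_id": ["P001", "P002", "P004"],
    "age_band": ["25-34", "35-44", "18-24"],
    "referral_source": ["Police", "A&E", "Primary Care"],
})

# A left join keeps every survey respondent and attaches routine data where available.
linked = surveys.merge(routine, on="dbi_id", how="left")
print(linked)
```

A left join preserves all survey respondents, so individuals without a matching routine record remain visible as missing data rather than being dropped.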

Table 3.2: Quantitative data collection overview

Group: Individuals receiving Level 1 and Level 2 DBI who participated in the evaluation
  • Data collected (N): First Level 2 session survey (575); Final Level 2 session survey (499); Level 2 3-month follow-up survey (102); DBI routine data for linkage to individual survey data (499)
  • Timescales: 1st January 2019 to 30th April 2020. All individuals referred to DBI in this period were eligible to participate. Data collection was originally planned to continue until 31st May 2020 but the deadline was brought forward due to COVID-19.

Group: DBI practitioners delivering Level 1 and Level 2 DBI
  • Data collected (N): Online survey: Level 1 practitioners (172); Level 2 practitioners (29)
  • Timescales: 4th March 2020 to 22nd March 2020

Group: All individuals receiving Level 1 and Level 2 DBI
  • Data collected (N): Aggregate routine monitoring data from those referred to DBI (5,316)
  • Timescales: 1st January 2019 to 30th April 2020

Group: Agencies referred to by DBI Level 2 practitioners
  • Data collected (N): Online survey (9)
  • Timescales: 21st November 2019 to 17th December 2019

3.2.2 Aggregate routine DBI data

NHS Scotland provided the evaluation team with routine DBI data (captured by Level 1 and Level 2 DBI practitioners) on all individuals accessing DBI between 1st January 2019 and 30th April 2020 in pseudonymised aggregate form (that is, non-identifiable summary data).

3.2.3 Aggregate Level 1 practitioner training evaluation data

The University of Glasgow developed a brief evaluation of practitioners' confidence to deliver Level 1 to be completed immediately before and after training. NHS Health Scotland provided the evaluation team with confidence ratings for a total of 997 frontline practitioners (including police, ambulance service, A&E, Primary Care, Social Work and community and crisis mental health team staff) who were trained between October 2017 and December 2020.
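
One conventional way to summarise such before-and-after ratings is a paired comparison, sketched below with invented figures; the actual instrument, scale and analysis were developed by the University of Glasgow and may differ.

```python
from scipy import stats

# Invented pre- and post-training confidence ratings for the same practitioners
# (illustrative 1-10 scale; the real instrument and scale may differ).
pre = [4, 5, 3, 6, 4, 5, 2, 6]
post = [7, 8, 6, 8, 7, 6, 5, 9]

# Paired t-test: did confidence change from immediately before to after training?
t_stat, p_value = stats.ttest_rel(pre, post)
mean_change = sum(b - a for a, b in zip(pre, post)) / len(pre)
print(f"mean change = {mean_change:+.2f} points, t = {t_stat:.2f}, p = {p_value:.4f}")
```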

3.2.4 DBI practitioners' survey

Level 1 and Level 2 DBI practitioners were invited to complete a survey focusing on their DBI training, skills and confidence.

3.2.5 Survey of agencies referred to

We surveyed agencies to which individuals were referred by DBI to examine the appropriateness of referrals, engagement with individuals, perceived outcomes and the overall impact of DBI on other agencies in terms of demand and joint working relationships. Despite sending reminders, we gained very few responses (n=9).

3.3 Data analysis

The analysis drew on a convergent mixed methods approach, in which analyses are merged into a single narrative drawing on different datasets as appropriate. The qualitative and quantitative datasets were analysed separately, then the results were merged (where possible), guided by the research questions. This included the results of the health economics analysis (see Section 3.4), which were also merged into the overall narrative.

3.3.1 Qualitative data analysis

All audio-recorded interviews and focus groups were transcribed and entered into QSR NVivo (v12), a qualitative data analysis software package, to support analysis. We analysed the qualitative data using a case study approach (Yin 2013), drawing on techniques of framework analysis (Ritchie & Spencer 2002). Analysis was guided by the Consolidated Framework for Implementation Research (Keith et al. 2017), which lists key factors that contribute to effective or unsuccessful programme implementation, including acceptability, characteristics that facilitate effectiveness and suggestions for improvement. The framework analysis linked closely to the research questions, especially those concerning the impact of DBI on individuals' distress, as well as broader questions around the process and delivery of the intervention.

We coded the qualitative data collected via survey open questions separately and synthesised these findings with the information gathered from the larger body of qualitative findings.

3.3.2 Quantitative data analysis

We entered data from paper surveys electronically and securely downloaded data from online surveys to SPSS and/or MS Excel for analysis, following editing and data cleaning in line with ScotCen's Quality Management System. Quantitative data analysis consisted of descriptive statistics and crosstab analysis. We conducted significance testing of the quantitative surveys using regression analysis in SPSS to determine whether the dependent variable (the mean score, or the proportion in the category of interest) differed between the categories of the break variable. Where a significant difference is discussed in the text, it is significant at the 5% level. These significance tests are for guidance only, as the surveys were not based on random samples. We could not conduct any significance testing on the aggregate routine data supplied by Public Health Scotland because access to the raw datasets was beyond the scope of our agreed evaluation remit.
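
As a rough, non-SPSS illustration of this kind of test, the sketch below regresses a score on a categorical break variable using statsmodels; the data and variable names are invented.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Invented data: an outcome score and a categorical break variable (e.g. site).
df = pd.DataFrame({
    "score": [6.1, 5.4, 7.0, 4.2, 3.8, 4.9, 5.5, 3.1],
    "site": ["A", "A", "A", "A", "B", "B", "B", "B"],
})

# OLS regression of the outcome on the break variable; the coefficient
# p-values test whether the mean score differs between categories.
model = smf.ols("score ~ C(site)", data=df).fit()
p_values = model.pvalues.drop("Intercept")
print(p_values)
print("Significant at the 5% level?", (p_values < 0.05).any())
```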

It should be noted that the sample of participants who completed outcome measures at the end of Level 2 (n=499) represented 14% of the overall number of people who took up DBI. However, the demographic profiles of all referrals to Level 2 and of the individuals participating in the evaluation (based on gender, age, area deprivation measured by SIMD, and distress thermometer score at Level 1) were similar.
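
As an illustration of how such profiles can be compared, the sketch below applies a chi-square test to invented gender counts. In practice the comparison covered gender, age, SIMD and Level 1 distress thermometer scores, and the evaluation sample is a subset of all referrals, so this is indicative only.

```python
from scipy.stats import chi2_contingency

# Invented counts: category breakdown (e.g. female, male) among all Level 2
# referrals and among the n=499 evaluation sample. Figures are illustrative.
all_referrals = [2100, 1400]
eval_sample = [310, 189]

chi2, p, dof, _expected = chi2_contingency([all_referrals, eval_sample])
# A large p-value gives no evidence that the two profiles differ.
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```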

3.4 Economic analysis

The economic analysis presents the costs and outcomes of the DBI programme components in a form similar to a cost consequence analysis (CCA), a type of economic evaluation in which disaggregated costs and a range of outcomes are summarised together in a 'balance sheet' table (Drummond et al. 2005). It was not possible to conduct a full economic evaluation because of the absence of a relevant alternative to which DBI could be compared; to answer questions about value for money, a comparator would be required. The inclusion of a comparator group in the evaluation study design was specifically excluded from the Scottish Government DBI evaluation tender specification. The economic analysis of the DBI programme's resource use, associated costs and outcomes nonetheless provides useful information for people involved in planning, implementing, establishing or maintaining DBI services.
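
To make the 'balance sheet' format concrete, the sketch below assembles a miniature cost consequence layout; all items and values are placeholders rather than the evaluation's results, which are presented in Appendix C.

```python
import pandas as pd

# A cost consequence 'balance sheet' lists disaggregated costs alongside a
# range of outcomes without combining them into a single ratio. Placeholder
# entries only; the evaluation's actual items and figures are in Appendix C.
balance_sheet = pd.DataFrame({
    "item": [
        "Level 2 provider staffing",
        "DBI Central coordination",
        "Training development and delivery",
        "Change in distress thermometer score",
    ],
    "type": ["cost", "cost", "cost", "outcome"],
    "value": ["£x per annum", "£y per annum", "£z per annum",
              "n-point change (0-10 scale)"],
})
print(balance_sheet.to_string(index=False))
```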

Due to the nature of the intervention and the available data, the economic analysis took a public sector payer perspective, meaning that only costs falling on the public sector were considered. The evaluation did not consider personal costs, such as absence from employment, or other societal costs, as these were outside the scope of the evaluation project. Given the importance of centralised management for such a large and diverse government initiative, the economic analysis focused on both the pilot areas and DBI Central.

We obtained data on the resources required to deliver DBI Central activities and the Level 2 providers' services from DBI Central, and data on the resources required for the development and initial rollout of Level 1 and Level 2 training from the University of Glasgow. Using these data, we calculated the annual cost for 2019-2020 for each pilot area and for DBI Central, which are presented separately. Further detail on the methods, unit costs and data analysis is presented in Appendix C.
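
A minimal sketch of this cost calculation, assuming resource-use records of the form (area, resource, units, unit cost), is shown below; the areas, resources and figures are invented.

```python
# Hypothetical resource-use records: (area, resource, units used, unit cost in £).
records = [
    ("Pilot area A", "practitioner hours", 1200, 25.0),
    ("Pilot area A", "training sessions", 10, 300.0),
    ("Pilot area B", "practitioner hours", 900, 25.0),
    ("DBI Central", "coordination hours", 800, 30.0),
]

# Annual cost per area = sum over resources of (units x unit cost),
# keeping pilot-area and DBI Central costs separate.
annual_cost: dict[str, float] = {}
for area, _resource, units, unit_cost in records:
    annual_cost[area] = annual_cost.get(area, 0.0) + units * unit_cost

for area, cost in annual_cost.items():
    print(f"{area}: £{cost:,.2f}")
```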

3.5 Ethical approvals

Approval for the DBI evaluation study was provided by the West of Scotland Research Ethics Service in September 2018, with further amendments approved in June 2019 and October 2019. The Health and Social Care Public Benefit and Privacy Panel[1] granted approval for the data linkage element of the study from November 2020. We collected all evaluation data following informed consent (Appendix 2). Protocols were established to support any individuals who became distressed during the evaluation.

Contact

Email: socialresearch@gov.scot
