Domestic Homicide Reviews: evidence briefing

This evidence briefing compares the Domestic Homicide Review models of 17 international jurisdictions. It aims to inform the initial stage of thinking around the development of a Domestic Homicide Review model for Scotland.


5. Reporting, Monitoring and Measuring Impact

A DHR will include recommendations to improve the system's response to domestic homicides. Table 4 in Annex 1 gives an overview of how each jurisdiction reports on the findings and recommendations of the DHR, and how progress is monitored.

5.1 Types of reporting

Key findings:

In most jurisdictions, reports are written for each case, annually or biennially. When reports include a number of case reviews, data is usually aggregated and the report discusses trends rather than individual homicides.

All reports include recommendations. The literature suggests that recommendations are often targeted at the individual agency level, and that recommendations focusing on relationships between agencies and on community responses could be more widely utilised.

The academic literature highlights that a common dataset would ensure systematic collection of data and learning. The literature also suggests there is a need for more consistency in reporting.

The 17 jurisdictions show that reports can be written for each case (e.g. in England, Wales, Northern Ireland and Portugal), annually (e.g. Western Australia, Ontario, Maryland and Delaware) or biennially (New South Wales, Montana and Washington). Sometimes reporting is more ad hoc (e.g. British Columbia), or, as in Saskatchewan, Canada, there is a single report presenting all findings of the pilot DHR, analysing cases from 2005 to 2014. In some jurisdictions the reporting is required by, and presented to, parliament or parliamentary committees, for example in New South Wales (Australia), Colorado (USA), Montana (USA) and Vermont (USA).

When reports include a number of case reviews, data is usually aggregated and the report discusses trends rather than individual homicides. Reports often also include a section on the victims, comprising either a message to commemorate them or the names of all those who were killed. As noted in the previous section, many reviews report aggregated data and learning to protect the privacy of those involved. This could, however, prevent a victim's story being told, putting an emphasis on "counting" over "memorialisation" (Rowlands 2020a).

In England and Wales, DHRs are implemented by Community Safety Partnerships at a local level. For every DHR conducted, the Community Safety Partnership produces a report. The recommendations of the report are therefore focused on the local context, with, for example, specific suggestions for the police, local authorities, domestic abuse services or health services.

New South Wales (NSW), Australia, reports biennially to the NSW parliament, with more general recommendations derived from both individual case analysis and wider quantitative data analysis. The recommendations cover legislation, policies, practices and services, organised under themes identified in the data. Each report is followed by a response from the NSW government addressing these recommendations.

In New Zealand the committee reports regularly, with each report reflecting on the themes discussed in previous ones and adding to the body of knowledge. Early reports focused on how individual agencies or components of the system responded to cases, while more recent reports reflect on wider systemic processes or structures that work to reinforce the violence experienced. Each report provides recommendations to improve system and agency responses, but these are not directed at individual agencies and are seen as applying to all agencies.

The academic literature highlights the need for a common dataset, especially when DHRs are undertaken locally, to ensure the systematic collection of data and learning. This allows data, as well as recommendations from multiple case reviews, to be aggregated. Rowlands (2020a) points out that the absence of national standardised data collection can lead to a lack of understanding of wider (country-wide) trends, as well as a lack of suggestions, or engagement with recommendations, that should be implemented at a national level. This national collection could include a central repository to hold all reports, as Wales, for example, has implemented, as well as an overarching body that can collate reviews and synthesise and disseminate learning (Robinson et al. 2018).

All reports of the 17 jurisdictions covered in this briefing list recommendations; some are more general (drawn from multiple cases), while others address specific cases and specific agencies. Haines-Delmont et al. (2022) point out that it is important to formulate actions that are 'specific, tangible, achievable and realistic'. Jones et al. (2022) add that recommendations are often targeted at the individual agency level. They argue that a focus on relationships between agencies, and recommendations to address these, could be a helpful inclusion. Additionally, their study highlighted that recommendations targeting a community response are currently underutilised (Jones et al. 2022). These types of recommendations would be valuable, especially as analysis has shown that "the room for error seemed to increase when boundaries are 'crossed' or where there is a transition between one type of service user to another, from one service to another, or from one geographic area to another" (Robinson et al. 2018 p. 5).

The literature also addresses the need for consistency in reporting. Studying the Welsh DHR process, Robinson et al. (2018) noted that the quality and scope of reports often differed markedly, with some reports "of far better quality in terms of their level of detail and analysis than others and writers of reviews would benefit from guidelines, training and a consistent standard and benchmarking" (Robinson et al. 2018 p. 12).

5.2 Implementing review recommendations

Key findings:

In some jurisdictions, annual reports address recommendations from previous reports and how they were followed up. Often, however, it is not clear who is responsible for implementing recommendations, or how they are addressed and used in practice.

Policy development is likely to remain a challenge when there is no clear mechanism for monitoring the implementation of recommendations.

There is a lack of information on whether and/or how recommendations are implemented and evaluated (Rowlands 2020a, Jones et al. 2022). Bugeja et al. (2015) show in their review that only about a third of the domestic homicide reviews examined reported that changes had occurred in service systems as a result of the recommendations made in the review process. Moreover, they showed that of 35 DHR models that made recommendations, only seven had specific mechanisms for monitoring actions taken and outcomes achieved. These mechanisms included: recommendations being assigned to an appropriate member of the DHR team, who takes the recommendation to the agency capable of responding; a mandatory response regime in which recommendations are tracked by DHR team members for completion; a focus on following up recommendations made in previous years when no new DHRs were conducted that year; and a symposium to synthesise and prioritise previous recommendations and develop a strategic plan for implementation (Bugeja et al. 2015).

In some jurisdictions, annual reports address recommendations from previous reports and how they were followed up. In New South Wales, the government publishes a response to the recommendations made in each biennial report. Often, however, it is not clear who is responsible for implementing recommendations, or how they are addressed and used in practice.

Policy development is likely to remain a challenge when there is no accountability or mandate to respond to recommendations or to develop a mechanism for monitoring their implementation (Bugeja et al. 2015). The literature suggests that it is important to "make the shift from prioritization to implementation of recommendations" (Jones et al. 2022 p. 11), and that organisations might need incentives, including support and training, to implement recommendations. In England and Wales, for example, there is a statutory requirement to carry out a DHR, but no statutory requirement to report on whether recommendations have been implemented (or what the barriers to implementation are) (Jones et al. 2022).

5.3 Evaluation and evidencing impact

Key findings:

It is difficult to evidence impact and attribute changes to DHRs alone. Moreover, little information was found in the literature on evaluation processes in the 17 jurisdictions covered in this briefing.

The literature highlights several common themes in the recommendations made in DHR reports, and these themes are often repeated in consecutive reports. This raises the question of what impact DHR recommendations have in practice.

Suggestions from the literature for delivering impact are to clearly articulate the purpose, aims and processes of a review and to establish consistency between reviews. A DHR should be viewed as a continuously evolving practice that includes auditing, monitoring and evaluating recommendations and overall impact.

There is a consistent challenge in evidencing the impact of DHRs. It is difficult to make causal links between actions taken as a result of a DHR and homicide figures (Bugeja et al. 2015, Jones et al. 2022). It may be more useful to focus on whether reviews lead to organisational change (Rowlands 2020). Websdale (2020) suggests that one common outcome of review work is "an increase in social networking, communication, coordination, and collaboration among those agencies and stakeholders handling domestic violence cases" (p. 15).

It is, however, difficult to attribute changes to DHR work alone, as reviews take place in a context with many other possible influences. Evaluating a coordinated community response is difficult "due to the complex and localised nature, as well as different understanding of what constitutes 'success'" (Jones et al. 2022 p. 11). Haines-Delmont et al. (2022) argue that, to ensure impact can be understood and reported upon, the purpose, aims and processes of a review should be clearly articulated and consistency between reviews established. One way to do this is by developing shared concepts and clearly defining the theories and terms that underpin the review process. Moreover, in Haines-Delmont et al.'s (2022) study, professionals suggested that a DHR should be a continual process of 'evolving practice' and include auditing, monitoring and evaluating recommendations.

In terms of evaluations, there often does not seem to be a systematic evaluation process in place once a DHR model is established. In their study, Bugeja et al. (2017) did not find any independent evaluations of the effectiveness of a DHR process. They point out that, in the absence of any evaluation of best practice, there is a lack of useful guidance to support newly forming DHR teams.

It is unclear from the reports and the information available online whether the 17 jurisdictions covered in this briefing have evaluative processes in place to review their approach to DHRs. An exception is Wales, which has conducted an evaluation to support the development of a more holistic review process (see Robinson et al. 2018). Moreover, there are some academic research projects (in some cases commissioned by governments or DHR review boards) that have formulated suggestions for improvement for specific countries (see Rowlands (2020) for a review of the approach in England). It may be that evaluations are carried out internally or are ongoing, and no documentation is (yet) publicly available. For example, the Home Office guidance, which has been in place for 11 years, is currently under review, with the revised guidance anticipated to be published in 2023.

The literature highlights several common themes in the recommendations made in DHR reports[13]. Often these themes are repeated in consecutive reports. For example, the recommendation to provide professional training for staff of services coming into contact with victims of domestic abuse has been common for over a decade (Jones et al. 2022). Moreover, there is a substantial body of literature that has identified risk factors and interventions to prevent domestic homicides (for an overview, see Kim and Merlo 2021). This raises the questions of what impact recommendations from DHRs have, whether further DHRs will provide any new insight, and whether substantial systemic change is needed (Haines-Delmont et al. 2022).

The value of a DHR, however, lies not only in the recommendations it provides. It also creates the opportunity for practitioners to make new connections with (local) partners and to directly address any flagged issues in their organisation. In Haines-Delmont et al.'s (2022) research, professionals in Wales described the learning event (where lessons learnt are shared) as powerful in improving practice.

Contact

Email: Justice_Analysts@gov.scot
