Tackling child poverty pathfinders: evaluability assessment

An evaluability assessment of the Child Poverty Pathfinders in Glasgow and Dundee to inform the development of an evaluation plan for the Pathfinder approach. Includes an evaluability assessment report and accompanying theories of change and initial monitoring framework to support evaluation.


What are the methodological options for evaluating the Pathfinders?

Following from the aims above, we now provide recommendations for the best methodologies to address each in turn.

Methods for evaluating the impact of the Pathfinders

Understanding impact evaluation

Howard White (2006) defines impact evaluation as an assessment of the impact of an intervention on final welfare outcomes. White notes, however, that impact evaluation has taken on several different meanings over the last decades. Two common understandings of impact evaluation include:

An evaluation which looks at the impact of an intervention on final welfare outcomes, rather than only at project outputs, or a process evaluation which focuses on implementation;

An evaluation concerned with establishing a counterfactual, i.e. the difference the project made (how indicators behaved with the project), compared to how they would have been without it.[12]

These definitions broadly guide the different types of methodologies used to assess impact, which will be discussed below. However, they are not necessarily mutually exclusive.

The following section will discuss the methodologies in more detail. It is worth noting that impact assessment has grown as a field and methodologies have emerged and become more refined over time as debates in the field become more nuanced.

Given the underlying aim of the Pathfinders to reduce child poverty, measuring the impact on families and child poverty is a crucial part of the evaluation. The most obvious way of doing this would be to look directly at child poverty rates in the Pathfinder areas. However, as highlighted in the literature review, there are many complications around both measuring and understanding the causes of change in child poverty levels. Therefore, we see there being three options for measuring the impact of the Pathfinders on families and child poverty:

Analyse population-level child poverty data (i.e. measures of absolute or relative poverty) and determine whether the Pathfinders have had a positive impact on poverty rates.

Measure the changes in child poverty levels and the factors which directly drive this (i.e. income and costs of living) for the families who have been supported by the Pathfinder.

Look at the other positive outcomes experienced by the families which are likely to lead to changes in poverty (such as improved employment, health or educational attendance).

Under option 1, impact would be measured based on the number of children in the relevant geographical area[13] living in a household with an equivalised net disposable household income below 60% of the UK median. This measurement may be desirable given the wider context in which the Pathfinders sit – i.e. reducing overall child poverty levels in Scotland as part of the Best Start, Bright Futures delivery plan. However, we do not recommend relying on this measure alone. This is because child poverty is a complex issue which depends on a vast number of different factors and only meaningfully shifts in the long term.[14] This is compounded by the relatively small scale of the Pathfinders, which will make it even more challenging to measure changes in child poverty that can be attributed to this specific programme, even if the child poverty data assessed is restricted to the Pathfinders' specific regions. For example, the relatively light-touch intervention of the Glasgow Pathfinder is unlikely to have an identifiable impact on the child poverty rate across the entire city. This is corroborated by findings from the Welsh Child Poverty Strategy Evaluation, which found that, for the population size involved, changes in rates of child poverty had to be large – at least 3% every year for 3 years – to be statistically significant.[15] Although that study related to poverty at a national level, and so operates on a different scale to the Pathfinders, the interventions it assessed were also much larger in scale. If both the poverty data analysed and the intervention are scaled down to the level of the Pathfinders, the same conclusion may therefore still apply.
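To make the headline measure under option 1 concrete, the sketch below flags whether a household falls below 60% of the median equivalised income. The figures are invented and the modified OECD equivalence scale is used as a simplifying assumption; the official Households Below Average Income (HBAI) methodology differs in detail.

```python
# Illustrative sketch of the relative poverty measure: equivalised net
# disposable household income below 60% of the national median.
# Equivalence factors here follow the modified OECD scale (1.0 for the
# first adult, 0.5 per additional person aged 14+, 0.3 per child under 14);
# this is a simplification of the official HBAI methodology.

def equivalised_income(net_income, adults, children):
    """Net disposable household income divided by the equivalence factor."""
    factor = 1.0 + 0.5 * (adults - 1) + 0.3 * children
    return net_income / factor

def in_relative_poverty(net_income, adults, children, median_equiv_income):
    """True if equivalised income is below 60% of the national median."""
    return equivalised_income(net_income, adults, children) < 0.6 * median_equiv_income

# Hypothetical example: two adults, two children, £24,000 net income,
# against an assumed national median equivalised income of £25,000.
print(in_relative_poverty(24_000, adults=2, children=2, median_equiv_income=25_000))
# → True (24,000 / 2.1 ≈ £11,429, below the £15,000 threshold)
```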

By contrast, options two and three are more feasible to measure, but with the trade-off that they are a degree of abstraction away from the specific child poverty targets. While they cannot provide evidence on the bigger picture, these options do offer a more detailed view on the exact ways in which families benefit from the Pathfinders and, if not directly measuring poverty, offer a useful proxy measure. This was the approach adopted by the Welsh Child Poverty Strategy Evaluation, which – for the reason above – opted to focus on factors other than child poverty itself.

As noted in our evaluation aims, alongside the impact on families and child poverty, the extent to which the Pathfinders have driven systems change should also form part of the impact evaluation. There is a significant amount of literature on what constitutes systems change and how to evaluate it. Approaches best suited to the evaluation of complex public services should consider the following key implications of taking a complexity-informed understanding of system behaviour:

  • Interventions must be understood within their context, and contextual factors must be taken into account when assessing impact
  • Programmes must make their assumptions explicit and spend some time framing issues to enable an evaluation to reflect the system
  • Evaluation approaches may need to change to be more problem-orientated, collaborative and inclusive of multiple kinds of evidence

Evaluation may best be embedded into an intervention and/or agency as a developmental process. When evaluation becomes a feedback loop in the system, it can be used to adjust and refine interventions as they develop. In this context, the role of the evaluator shifts from solely providing information to facilitating change and adaptive management.[16]

Previous studies also show that in order to overcome the complexities around measuring and evaluating systems change, the evaluation must firstly establish what constitutes success in relation to systems change, as well as defining 'the system' and its boundaries. Key evaluation methods previously used to assess systems change have been contribution analysis and mixed methods research, while 'action research' has also been highlighted as an effective way of making the evaluation adaptable and flexible to the complex systems.[17], [18]

The literature suggests that there are three main areas where systems change can be evidenced: strategic learning; changes in the drivers, behaviours, or actors of the system; and changes in the outcomes of the system. Possible indicators of systems change include changes in the scale, quality, and comprehensiveness of pathways, or changes in the way these pathways link different steps or are co-ordinated. Methods of determining whether there have been changes in the drivers or behaviours of a system include social network analysis, outcome mapping, and outcome harvesting.

On the basis of our literature review (appendix 2) in relation both to impacts on families / child poverty and systems change, it appears that understanding impacts of this sort of intervention requires mixed method approaches in order to overcome the complexities of the topic. Therefore, quantitative analysis, such as measuring changes in income, cost of living and other indicative outcomes, should also be accompanied by qualitative data to allow for a deeper understanding of the impacts. Consistent with Magenta Book guidance, we will discuss two methods for evaluating impact in the context of the Pathfinders:[19]

Quasi-experimental approaches – these aim to provide statistically robust evidence on impact, and can be used as part of a mixed method study.

Impact assessment through a theory-based evaluation – this is a more flexible approach which can incorporate a variety of evidence-gathering techniques, but may yield less clear results.

Within both of these there are several different methodological options. In the following subsections we discuss their merits and suitability to evaluating the Pathfinders. In addition, we present two alternatives to these options which involve embedding ongoing evaluation and learning through a learning partner, or enhancing the quasi-experimental approach with case studies.

Quasi-experimental approaches

Quasi-experimental approaches use a counterfactual that is not created by randomisation (as compared to a randomised controlled trial) to evaluate the effect of an intervention. These approaches create a comparison group that is as similar as possible to the group who received the intervention, based on prior characteristics, meaning that changes in observed outcomes can be attributed to the intervention.[20] Therefore, quasi-experimental studies are useful in instances where individuals cannot be randomly assigned to treatment or control groups, such as when this would be unethical or logistically impractical.[21]

A quasi-experimental approach, if implemented properly, would be an effective way of simultaneously measuring the scale of effects of the Pathfinders on supported families, and determining to what extent these effects are directly attributable to the Pathfinders' support. In general terms, a quasi-experiment would measure how outcomes for families who received Pathfinder support (the treatment group) changed before and after the intervention, and compare the same outcomes at the same points in time for a comparable group of families who did not access the Pathfinders (the control group). If a statistically significant difference between the two groups is found, then this could be attributed to the impact of the Pathfinders.

Here, we set out the main methodological options for quasi-experimental approaches, and make an assessment of whether they would be suitable for evaluating either the Glasgow or Dundee Pathfinders, based on the requirements for each approach and the features of the Pathfinder models. We focus on four common quasi-experimental methods: propensity score matching (PSM), difference-in-differences (DiD), regression discontinuity analysis (RDA), and interrupted time series analysis (ITSA). We do not explore the possibility of using an instrumental variable regression, as the existence of a suitable instrumental variable cannot be planned for.

Overall, our assessment is that neither Pathfinder model is an ideal subject for any of the main quasi-experimental approaches. Notwithstanding this, for the Glasgow model an ITSA approach appears to be the closest fit, whereas for Dundee a DiD approach would be most suitable. Given the uncertainty of using these approaches to evaluate the Pathfinders, before rolling them out in full it may be valuable to conduct a smaller-scale feasibility study. Following our detailed discussion of the methodological options, therefore, we have set out the parameters for what a feasibility study might look like and what it would achieve.

Propensity score matching (PSM)

PSM allows researchers to statistically construct a counterfactual group in order to evaluate the impact of a programme. A propensity score is the likelihood that an individual received the intervention, calculated from observable characteristics which are believed to affect participation.[22] Individuals from the intervention and comparison groups are matched or weighted on their propensity score, and the differences in outcomes between these groups can then be calculated.[23]

PSM is beneficial because it can control for pre-programme characteristics of the sample for both intervention and comparison groups, and can estimate the impact of a programme.[24] Additionally, PSM has been noted as suitable for evaluations of anti-poverty programmes because it allows researchers to examine how a programme's impact differs according to pre-programme characteristics.[25]

However, PSM is most suited to evaluations where large datasets, such as administrative or local authority-level data, are available that include demographic and outcome data on both participants and non-participants.[26] It is important that all of this data, on both the intervention and the comparison groups, comes from the same source or is directly comparable.[27] Moreover, the matching must be done based on pre-intervention characteristics.[28]

Additionally, it is important to note that the intervention and comparison groups are only able to be matched based on observable characteristics that are in the dataset.[29] This means that if the outcome measured is also affected by unobserved variables, then the estimate of the impact of the programme may be biased.[30]
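To illustrate the mechanics described above, the sketch below applies PSM to synthetic data: propensity scores are estimated with a logistic regression, each treated unit is matched to the control with the nearest score, and the average matched difference estimates the programme effect. All data and parameter values are invented for illustration, and a real application would add balance checks and sensitivity analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: one confounder x drives both programme take-up and the
# outcome, so a naive treated-vs-control comparison would be biased.
n = 2000
x = rng.normal(size=n)
p_true = 1 / (1 + np.exp(-0.8 * x))             # true take-up probability
treated = rng.random(n) < p_true
effect = 2.0                                     # true programme effect
y = 1.5 * x + effect * treated + rng.normal(size=n)

# Step 1: estimate propensity scores via logistic regression (Newton's method).
X = np.column_stack([np.ones(n), x])
beta = np.zeros(2)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    W = p * (1 - p)
    beta += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (treated - p))
ps = 1 / (1 + np.exp(-X @ beta))

# Step 2: match each treated unit to the control with the closest score
# (nearest-neighbour matching with replacement).
controls = np.where(~treated)[0]
matches = controls[np.abs(ps[controls][None, :] - ps[treated][:, None]).argmin(axis=1)]

# Step 3: the mean treated-minus-matched-control outcome estimates the
# average treatment effect on the treated (ATT).
att = (y[treated] - y[matches]).mean()
print(f"estimated effect: {att:.2f} (true effect: {effect})")
```

Because the matching controls for the confounder through the propensity score, the estimate recovers something close to the true effect, whereas a raw comparison of group means would overstate it.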

Applicability to the Pathfinders

We now examine the potential of using PSM to evaluate each of the Child Poverty Pathfinders. The main challenge to using PSM to evaluate either of the Pathfinders is the availability of suitable data. For the Glasgow Pathfinder, due to its method of service delivery, where participants voluntarily access the service through the phoneline, the main challenge would be the collection of suitable data on non-participants. Obtaining a comparison group of a suitable size to conduct PSM will present several challenges in terms of research ethics and resource use. It may be possible to explore natural variation in who does and does not accept the offer of holistic assessment to construct exposed and non-exposed groups. For example, this could compare people who call and receive immediate support, such as a food parcel, with those who accept the offer of holistic support. If the data collected by the Glasgow Pathfinder were expanded to include outcomes and the information needed to create propensity scores for these two groups, that would allow some examination of the benefits of receiving holistic support versus not receiving it. That said, this would not be an evaluation of the Pathfinder as a whole, but rather of one element of it.

The Dundee Pathfinder collects the 'Client Spreadsheet' data, which includes data on clients such as the number of families, children, and barriers to employment. In addition, through Housing Benefit and Council Tax data, it also has access to information on income before participants enrolled onto the Pathfinder, as well as data on non-participants. This may make it possible to conduct PSM and assess changes in the income of households from before the intervention. The families who are included in this data, but who did not take part in the programme, could make a suitable comparison group. However, these datasets would also need to include information on several pre-intervention characteristics to make the matching robust. Further, consistent outcome data for both participants and non-participants would be needed in order to measure the impact of the Pathfinder. If the Housing Benefit or Council Tax data did not include all of the relevant data points on non-participants then, with the data currently held by the Pathfinders, PSM would not be feasible. In this case, the Pathfinders would need to make an application to access other administrative datasets to conduct PSM. However, it is presently uncertain exactly what data and variables would be required for the evaluation of the Pathfinders, and whether this data is available through other administrative sources. Overall, this makes it unclear, at present, whether PSM is possible. It is also worth noting that PSM is best suited to evaluations where large datasets are available. The Dundee Pathfinder, due to the relatively small number of families it supports, would not have a large matched comparison group, which may limit PSM's ability to provide statistically robust results.

Difference-in-differences (DiD)

DiD allows evaluators to estimate the effect of a programme by comparing the before-and-after change in outcomes for the intervention group with the same change for the comparison group.[31] The difference between these two changes is attributed to the programme. However, DiD rests on the assumption that, had the programme not happened, outcomes for the treatment and comparison groups would have followed parallel trends.[32]

To conduct a DiD analysis, data on both the intervention and comparison groups before and after the intervention is required.[33] This can come from panel data or repeated cross-sectional samples.[34] By subtracting the before-after differences, DiD helps to control for non-observable factors which influence the outcomes.[35] However, to conduct this analysis, a substantial amount of quality data must be collected across a sufficient sample size.[36]
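Once the data exist, the DiD estimator itself is simple to compute. The sketch below uses invented figures for two groups measured before and after an intervention; note that the groups may start at different baseline levels, since DiD only assumes their trends would have been parallel.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: both groups share a common trend; the treated group
# additionally gains the programme effect after the intervention.
# All values are invented for illustration.
n = 500
trend, effect = 1.0, 3.0
pre_ctrl = 10 + rng.normal(size=n)
post_ctrl = 10 + trend + rng.normal(size=n)
pre_treat = 12 + rng.normal(size=n)              # different baseline is fine
post_treat = 12 + trend + effect + rng.normal(size=n)

# DiD estimator: (treated change over time) minus (control change over time).
# The control group's change absorbs the common trend, isolating the effect.
did = (post_treat.mean() - pre_treat.mean()) - (post_ctrl.mean() - pre_ctrl.mean())
print(f"estimated effect: {did:.2f} (true effect: {effect})")
```

A simple before-after comparison of the treated group alone would conflate the programme effect with the background trend; subtracting the control group's change removes that trend.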

Applicability to the Pathfinders

Similar to PSM, the main barrier to conducting DiD to evaluate the Pathfinders is the availability of suitable data on both participants and non-participants. Given the data sources that the Glasgow and Dundee Pathfinders currently hold, which were discussed in the Data Audits, it is unlikely that either Pathfinder holds enough pre- and post-intervention data on both participants and non-participants to conduct a robust analysis. Even if a data sharing agreement allowed full access to administrative data, it is uncertain whether this would contain all of the information needed to conduct a robust DiD. Additionally, now that both Pathfinders are working with clients, the opportunity to collect this data has been missed.

However, as previously mentioned, the Dundee Pathfinder has access to Housing Benefit and Council Tax data, which helps them determine which families to engage with. This data is collected before the intervention and thus could be used to measure the outcomes for families who were not chosen and those who were. In order to conduct the DiD the evaluators would need to continue to collect this data on an ongoing basis so the before and after comparison can be made. However, this also requires determining a cut-off point to create the 'after intervention' comparison data. Due to the operating model of the Dundee Pathfinder, where intensive support is given to families, and the duration of this support is dictated by the needs of the family, it may be difficult for researchers to establish these clear cut-off points needed for this method. One possibility would be to make this cut-off the average length of support or average number of support sessions.

Regression Discontinuity Analysis (RDA)

RDA is suitable for when membership to the comparison or treatment group is determined by a singular cut-off on a continuous scale (e.g. living below a certain income).[37] This threshold creates a discontinuity, and allows researchers to draw a comparison between those just above and just below the discontinuity.[38] RDA estimates the impact of the programme by looking at the difference in outcomes between those just above and just below the threshold.[39]

This method is useful for scenarios when membership in the treatment or comparison groups is not random.[40] However, a significant limitation of this method is that it only allows an estimation of the effect of the programme for those close to the threshold, as the effect may differ for individuals farther away from this cut-off point.[41]
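The following sketch illustrates the mechanics of RDA on synthetic data: separate linear regressions are fitted just below and just above an invented income cut-off, and the gap between their predictions at the threshold estimates the programme effect. The cut-off, bandwidth, and all figures are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic data: support is given only to households below an income
# cut-off, and the outcome jumps by `effect` at the threshold.
# All values are invented for illustration.
n = 4000
income = rng.uniform(0, 40_000, n)
cutoff, effect = 20_000, 2.0
treated = income < cutoff
y = 0.0002 * income + effect * treated + rng.normal(size=n)

# Fit separate linear regressions on either side of the threshold,
# restricted to a local bandwidth, then compare predictions at the cutoff.
bandwidth = 5_000

def fit_at_cutoff(mask):
    slope_intercept = np.polyfit(income[mask], y[mask], 1)
    return np.polyval(slope_intercept, cutoff)

below = treated & (income > cutoff - bandwidth)
above = ~treated & (income < cutoff + bandwidth)
rd_estimate = fit_at_cutoff(below) - fit_at_cutoff(above)
print(f"estimated effect at threshold: {rd_estimate:.2f} (true effect: {effect})")
```

As the prose notes, this estimate is local: it only speaks to households near the cut-off, and the bandwidth choice trades bias against variance.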

Applicability to the Pathfinders

The requirement for a defined eligibility cut-off makes this method inappropriate for the evaluation of either of the Pathfinders, as neither has defined eligibility criteria for support along a continuous scale. Additionally, one assumption of RDA is that the cut-off cannot be shared with other programmes, which may pose complications for the evaluation of a Child Poverty Pathfinder, as eligibility for benefits or other support services may be similar (e.g. based on the same postcodes) and could therefore confound the results.[42]

Interrupted time series analysis (ITSA)

In contrast to the previous approaches, which require a comparison group, there are some quasi-experimental approaches that do not; one such method is ITSA. This method utilises time series data to test whether there is a change in outcomes after an intervention is in place, and is especially useful for evaluating population-level interventions.[43] ITSA uses the pre-intervention trend as the control period, which is then compared to the period after the introduction of the service. ITSA designs can be strengthened by the inclusion of a comparison group, as this can exclude time-varying confounders, such as other events occurring during the period of the intervention, which can be difficult to identify based on modelling pre-intervention trends alone.
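One common way to implement ITSA is segmented regression, which models the pre-intervention level and trend and estimates any step change (and change in slope) at the point the service is introduced. The sketch below applies this to an invented monthly indicator series; the series length, intervention date, and effect size are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic monthly series: a steady pre-intervention trend, then a level
# drop (e.g. a fall in a poverty indicator) once the service is introduced.
# All values are invented for illustration.
months = np.arange(48)
intervention_month = 24
post = (months >= intervention_month).astype(float)
level_change = -4.0
y = 50 + 0.2 * months + level_change * post + rng.normal(scale=1.0, size=48)

# Segmented regression: outcome ~ intercept + underlying trend
#                                 + step at intervention + post-intervention
#                                 change in slope.
time_since = np.where(post == 1, months - intervention_month, 0)
X = np.column_stack([np.ones(48), months, post, time_since])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"estimated level change at intervention: {coef[2]:.2f} (true: {level_change})")
```

Here the pre-intervention trend acts as the counterfactual, which is why the method is vulnerable to other events coinciding with the intervention unless a comparison series is added.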

Applicability to the Pathfinders

To conduct ITSA, data needs to be collected for a considerable period both before and after the intervention. Because neither of the Pathfinders collected pre-intervention data, this would have to come from administrative or national-level survey data.[44] A benefit of ITSA is that, when using administrative data, it can be conducted retrospectively, even if no data was collected specifically around the intervention, as is the case with both Pathfinders.[45] However, as mentioned above, there is a risk that the specific data required for the analysis is not available in these datasets. It must also be considered whether it is reasonable to assume the Pathfinders would meaningfully impact child poverty outcomes within the time frame permitted by the evaluation funding.

The fact that ITSA does not require a comparison group means that, on the face of it, it has some appeal for use on the Glasgow Pathfinder, where obtaining a comparison group would be challenging. However, running ITSA without a comparison group relies on the assumption that no other intervention affects the sample during the intervention period. Because the Glasgow Pathfinder operates across the city, we would expect there to be overlap between the Pathfinder and other child/family poverty interventions, meaning this assumption is not met. In cases such as this, ITSA can still be conducted, but with the use of a comparison group. As previously noted, there are significant challenges associated with obtaining a comparison group for the Glasgow Pathfinder, meaning that although ITSA has potential merits, it still has significant limitations for use in Glasgow.

For the Dundee Pathfinder, conducting an ITSA may be more feasible. The Dundee Pathfinder collects the 'Client Spreadsheet' and Housing Benefit and Council Tax data. This data could be analysed to determine whether there have been any statistically significant changes in these indicators in the postcodes where the Pathfinder is active. Otherwise, administrative data, including household surveys, could be used to conduct ITSA, although this would only allow researchers to examine impacts on the key indicators available in that data. Moreover, the geographically concentrated nature of the service, and the relatively small number of families supported by the Dundee Pathfinder, may result in the sample size being insufficient to produce statistically significant results, even if suitable administrative data is available.

Summary

In summary, the Glasgow and Dundee Pathfinders have different delivery models and data collection tools, which make them suited to different evaluation methods. For Glasgow, the difficulties in obtaining a suitable comparison group make ITSA the most suitable quasi-experimental method, since it can rely on administrative or national-level datasets and does not require a comparison group. However, if a suitable comparison group could be found, the causal inferences evaluators could make would be much stronger.

As shown previously, the Dundee Pathfinder's delivery model makes it easier to collect data on a comparison group, but it also results in the Pathfinder working with fewer families, which diminishes the potential sample size. Therefore, PSM may not be the most suitable option, as it is best suited to evaluations where large datasets are available. Instead, the Dundee Pathfinder could use DiD analysis to compare the outcomes of participants and non-participants. This could involve using either the Housing Benefit and Council Tax data, or other administrative data collected through a sharing agreement, and comparing how the outcomes of the two groups differ over time since the start of the programme.

Alternatively, if it was decided during a feasibility study that the evaluators could not collect enough data on non-participants to form a suitable comparison group, then ITSA could also be used for the evaluation of the Dundee Pathfinder. The process of obtaining a comparison group in each Pathfinder is further elaborated later in this section.

It is important to bear in mind that although suggestions are given for the most suitable quasi-experimental approach for each Pathfinder, this does not mean that a quasi-experiment in general is the most applicable method for evaluating the Pathfinders. Some limitations common to all quasi-experimental approaches in the context of the Pathfinders are discussed later in this section, and include challenges in accessing suitable data and the consideration that holistic support offered by the Pathfinders may not be effectively captured in quantitative datasets that were not collected for this purpose. Ethical issues for this approach are also discussed.

Feasibility study

As is clear from the discussion above, there are a number of challenges associated with quasi-experimental approaches, particularly in the setting of the Pathfinders. This gives rise to a high degree of uncertainty around the efficacy of this method should it be adopted. One way of mitigating this uncertainty is to run a feasibility study of the quasi-experimental approach before full roll-out. The UK Medical Research Council states that feasibility studies "should be designed to assess predefined progression criteria that relate to the evaluation design (eg, reducing uncertainty around recruitment, data collection, retention, outcomes, and analysis) or the intervention itself (eg, around optimal content and delivery, acceptability, adherence, likelihood of cost effectiveness, or capacity of providers to deliver the intervention)."[46] In light of this, it appears that if a quasi-experimental approach is pursued following this evaluability assessment, a feasibility study would be an appropriate first step.

A feasibility study would involve running a pilot, or test, of the evaluation on a small scale to assess whether each of the components necessary to carry out a quasi-experimental approach is in place. It would not seek to yield statistically significant results or establish whether and how the Pathfinders had an impact; rather, it is designed to troubleshoot all the different elements. If there are obstacles to completing the feasibility study, then these should be closely considered before continuing with a full study.

To illustrate what a feasibility study would look like for either the Glasgow or Dundee model, we have used the CONSORT checklist extension for pilot studies.[47] This checklist is based on a set of criteria originally produced by CONSORT in 2010, which is an evidence-based, standardised, minimum set of recommendations for reporting randomised trials.[48] The extension for pilot trials provides a list of 26 items which need to be explained when reporting on a feasibility study of a randomised control trial, and which determine the key parameters for the trial. Below, we set out the parameters for a quasi-experimental evaluation of the Glasgow and Dundee Pathfinders (respectively) based on the items in this checklist. Although it is stated that the checklist "does not directly apply to internal pilot studies built into the design of a main trial, non-randomised pilot and feasibility studies, or phase II studies", studies of this sort – which we consider to include quasi-experimental approaches – "have some similarities to randomised pilot and feasibility studies and so many of the principles might also apply".[49] As such, we have adapted the checklist to exclude points which are specific to randomised controlled trials, and consider it now to be applicable to quasi-experimental approaches. In addition, there are some items on the checklist which relate to points which should be included in a report on a trial (including general reporting information and results), rather than design features of the trial itself, which we have excluded.

The adapted checklist and corresponding design options for the Dundee and Glasgow Pathfinders are set out below. These serve primarily as suggestions for what a feasibility study could look like and what points it could address, in order to help inform whether a feasibility study should take place. The checklist is not exhaustive and would need further development at the point at which a feasibility study is conducted. Any pilot of a natural experiment should seek to establish the minimum data fields in any primary data collection for outcomes and matching/control variables, or the availability of these data and the permissions and governance requirements for accessing them if using administrative datasets.

One particular point that a feasibility study would pin down is the primary and secondary outcomes that will be measured. Due to the current uncertainties about what data will be available and what can be measured, it is not possible to set the outcomes measured at this point. Nonetheless, we suggest that the primary outcome should be household income, as this is the main determinant of poverty. If possible, household income relative to the national median could be used as a method of indicating whether a household is categorised as being in poverty (either absolute or relative). The secondary outcomes should measure the other factors that the Pathfinders seek to improve, such as health, housing, and education.

Quasi-experimental feasibility study design suggestions checklist

Specific objectives or research questions for pilot trial:

Dundee

  • Could DWP data be used to construct a comparison group?
  • Is the data collected on the treatment group comprehensive enough?
  • What timescales should be used to collect the data?
  • Collate data to inform the sample and comparison group size for a full evaluation.

Glasgow

  • Is the data collected on Pathfinder users comprehensive enough? Including the type of data and the length of time series.
  • Is it feasible to use population-level (i.e. whole of Glasgow) data and is it reasonable to expect the impact of the pathfinder to be detected?
  • Do other child poverty / family support interventions pose a barrier to running ITSA?

Description of pilot trial design:

Dundee

  • Difference-in-difference

Glasgow

  • Interrupted time series analysis

Settings and locations where the data were collected:

Dundee

  • DWP data used to identify target households
  • Housing benefit and council tax data
  • Dundee Pathfinder monitoring spreadsheet
  • Social Security awards from Pathfinders
  • Client spreadsheet
  • Exit interviews

Glasgow

  • Scottish Government child poverty data
  • Glasgow Helps Monitoring Data
  • Exit interviews

How participants in the research were identified and consented:

Dundee

  • Identified through DWP data

Glasgow

  • Open to all Glasgow residents

The interventions for each group:

Dundee

  • Intervention group
    • Key worker visits targeted households and directs them to support at Brooksbank centre, where they are offered a range of tailored services.
  • Comparison group
    • Has access to the same statutorily available services, but is not actively directed to them.

Glasgow

  • Intervention group
    • Primary intervention is a telephone helpline which conducts a needs assessment and then provides immediate interventions and / or referrals to other services.
  • Comparison group
    • Has access to the same statutorily available services, but is not actively directed to them.

Completely defined prespecified assessments or measurements to address each pilot trial objective specified, including how and when they were assessed:

Dundee

  • Proportion of households with a child living in poverty (either absolute or relative)
  • Household income of those who used the service
  • Number of people claiming benefits
  • Number of people in employment
  • Number of people in housing arrears
  • Material deprivation
  • Educational outcomes
  • Housing situation
  • Health indicators

Glasgow

  • Proportion of households with a child living in poverty (either absolute or relative)
  • Household income of those who used the service
  • Number of people claiming benefits
  • Educational outcomes
  • Housing situation
  • Health indicators

If applicable, prespecified criteria used to judge whether, or how, to proceed with future full-scale evaluation:

Dundee

  • Can a comparison group be constructed?
  • What is the pre-intervention balance between groups?
  • What data and governance requirements are needed?
  • Can a sufficient sample size be recruited / identified to detect a meaningful difference on child poverty / on what timescales would outcomes be observable?

Glasgow

  • Can a comparison group be constructed?
  • What is the pre-intervention balance between groups?
  • What data and governance requirements are needed?
  • Can a sufficient sample size be recruited / identified to detect a meaningful difference on child poverty / on what timescales would outcomes be observable

Rationale for numbers in the pilot trial:

Dundee

  • Number of participants should be maximised within what is possible for the pilot timescales

Glasgow

  • Number of participants should be maximised within what is possible for the pilot timescales

Method of obtaining comparison group:

Dundee

  • DWP data used to identify target households who chose not to participate in the Pathfinder or who are not in a Pathfinder delivery area.
  • Housing benefit and council tax data.

Glasgow

  • No comparison group required, unless it is found that there are other interventions affecting child poverty / family incomes. Determining this is an aim of the feasibility study.

Methods used to address each pilot trial objective whether qualitative or quantitative:

Dundee

  • Changes in % of children living in poverty.
  • Changes in average household income.
  • Changes in sources of income.
  • Changes in the number of people claiming benefits.
  • Changes in number of people in housing arrears.
  • Changes in debt levels.
  • Changes in childcare costs.
  • Changes in fuel costs.
  • Changes in food costs.
  • Acceptability of the pathfinder to funders, delivery staff and recipients.

Glasgow

  • Changes in % of children living in poverty.
  • Changes in average household income.
  • Changes in sources of income.
  • Changes in debt levels.
  • Changes in childcare costs.
  • Changes in fuel costs.
  • Changes in food costs.
  • Changes in the number of people claiming benefits.
  • Feedback from participants on whether they are aware of / use any other child poverty support interventions.
  • Acceptability of the pathfinder to funders, delivery staff and recipients.
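The two pilot designs named in the checklist – difference-in-difference for Dundee and interrupted time series analysis for Glasgow – can be illustrated in miniature. The sketch below shows the core ITSA logic only: fit the pre-intervention trend, extrapolate it as the counterfactual, and measure the gap to the observed post-intervention series. The data are purely illustrative, and a full ITSA would use segmented regression with adjustments for seasonality and autocorrelation.

```python
def fit_trend(points):
    """Ordinary least squares for y = a + b*t over (t, y) pairs."""
    n = len(points)
    mt = sum(t for t, _ in points) / n
    my = sum(y for _, y in points) / n
    b = (sum((t - mt) * (y - my) for t, y in points)
         / sum((t - mt) ** 2 for t, _ in points))
    return my - b * mt, b  # intercept, slope

def level_change(series, interruption):
    """Average gap between observed post-interruption values and the
    counterfactual extrapolated from the pre-interruption trend."""
    pre = [(t, y) for t, y in series if t < interruption]
    post = [(t, y) for t, y in series if t >= interruption]
    a, b = fit_trend(pre)
    gaps = [y - (a + b * t) for t, y in post]
    return sum(gaps) / len(gaps)

# Illustrative monthly series for a poverty-related indicator, with the
# intervention beginning at t = 6.
series = [(0, 30), (1, 29), (2, 28), (3, 27), (4, 26), (5, 25),
          (6, 21), (7, 20), (8, 19), (9, 18)]
```

Here the pre-intervention trend is a fall of one unit per month; the post-intervention observations sit three units below its extrapolation, so `level_change(series, 6)` estimates a level shift of -3.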

Timing of a quasi-experimental approach

Undertaking a quasi-experimental evaluation would involve a much longer timeframe than the other approaches to evaluating impact discussed later in this section. Before going ahead with this approach, the timing should be taken into account, in particular whether it can be conducted within the current Phase 2 timeline (which ends in March 2025). To give a sense of the time required, an example timeline for a quasi-experiment using matched DiD comes from the evaluation of the 'Strengthening Families, Protecting Children' programme conducted by What Works for Children's Social Care. The aim of this programme is to reduce the number of children entering care. This evaluation will use data from the ONS' National Pupil Database and administrative data from Local Authorities. For this programme, the first Local Authority included in the evaluation implemented the plan starting in September 2020 and the final Local Authority started in April 2022. The observation period for participants will end in March 2024 and data will be collected from the ONS in March 2025. After this, the evaluators predict that the analysis will take place from 2025-2026, with a final report due in 2026.

This timeline suggests that – broadly – a quasi-experimental approach would require around two years post-observation period to be completed. This is based on the set up and design of such evaluations typically taking around six to twelve months, with a similar amount of time required for obtaining a comparison group. Sufficient time would also need to elapse in the observation period in order for enough data points to be gathered, and to allow time for the effects of the Pathfinders to flow through to the outcomes measured. In the example above, an observation period of two years was used, while other previous studies also support allowing 18 months to two years before conducting a quasi-experimental evaluation.[50],[51]

Overall, this implies that completing a quasi-experimental evaluation before the end of Phase 2 in March 2025 would be challenging – and if a feasibility study was adopted, completion of a subsequent full quasi-experiment would not be possible within this time frame. The implication is that a quasi-experimental evaluation would likely require a shifting of Phase 2 timelines, something which is set out in more detail below.

Obtaining a comparison group

As previously discussed, quasi-experiments are a useful method of evaluating impact when it is not acceptable to randomise.[52] However, several quasi-experimental methods, including DiD and PSM, still require a comparison group to measure the impact of an intervention. The pre-intervention comparison and treatment groups need to be as similar as possible for a robust evaluation.[53]
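The core DiD calculation is simple to state: the estimated effect is the change over time in the treatment group minus the change over time in the comparison group, which nets out any trend common to both. A minimal sketch, with purely hypothetical group means:

```python
def did_estimate(treat_pre, treat_post, comp_pre, comp_post):
    """Difference-in-differences: the change in the treatment group
    net of the change observed in the comparison group."""
    return (treat_post - treat_pre) - (comp_post - comp_pre)

# Hypothetical mean weekly household incomes (£) before and after support:
# treatment rises by 45, comparison by 15, giving an estimated effect of 30.
effect = did_estimate(treat_pre=310, treat_post=355, comp_pre=305, comp_post=320)
```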

The literature highlights several potential sources of a comparison group for a quasi-experiment. For example, a waiting list of potential participants can be used as the comparison group: those on the list are already familiar with the intervention, are willing to participate in information-gathering steps, and most likely have similar demographic characteristics to those in the intervention.[54]

Additionally, data from national administrative or survey datasets could potentially be used to obtain a comparison group.[55] Those in the comparison group would have to be matched to those participating in the intervention through PSM.[56] However, in order to use administrative data, there must be similar outcome data (e.g. household income), and data permitting the construction of a propensity score must be recorded for both groups to allow for a comparison.[57] In the evaluation of both the Glasgow and Dundee Pathfinders, obtaining a comparison group poses difficulties. Various methods specific to each Pathfinder are discussed below.
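To illustrate the matching step only – assuming propensity scores have already been estimated, for example from a logistic regression on the matching variables – each treated unit can be paired with the comparison unit whose score is closest, subject to a caliper. The sketch below uses invented identifiers, scores and caliper:

```python
def nearest_neighbour_match(treated, controls, caliper=0.1):
    """Match each treated unit to the closest control by propensity
    score, without replacement; pairs outside the caliper are skipped.

    `treated` and `controls` map unit id -> estimated propensity score.
    The caliper value here is illustrative only.
    """
    pool = dict(controls)
    matches = {}
    for unit, score in sorted(treated.items(), key=lambda kv: kv[1]):
        if not pool:
            break
        best = min(pool, key=lambda c: abs(pool[c] - score))
        if abs(pool[best] - score) <= caliper:
            matches[unit] = best
            del pool[best]  # matching without replacement
    return matches

# Hypothetical units: T2 (0.40) pairs with C2 (0.45), T1 (0.62) with C1 (0.61).
treated = {"T1": 0.62, "T2": 0.40}
controls = {"C1": 0.61, "C2": 0.45, "C3": 0.10}
matches = nearest_neighbour_match(treated, controls)
```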

Glasgow Pathfinder comparison group

As previously mentioned, the Glasgow Pathfinder operates with a 'No Wrong Door' approach and anyone in the city can access the services. This makes obtaining a comparison group which is similar to the intervention group more difficult and also introduces several ethical challenges. One possibility would be to look at the initial contact data and, for calls where no action resulted, obtain permission to collect data on these respondents to create a comparison group. However, obtaining a suitable number of both participants and non-participants to conduct this analysis will be a further challenge. In addition, this method runs the risk of selection bias: the calls which did not result in an action are unlikely to be random, and these callers are unlikely to face the same level and type of need as those who required action.

Alternatively, there is still the possibility of using administrative data, such as those from DWP or HMRC to derive a comparison group. The challenge in using administrative and national-level data for the evaluation of the Glasgow Pathfinder is that these data have not been collected for this purpose. Therefore, the evaluation would be limited to using only those variables which are included in these datasets, and these may not fit the needs of the evaluation. The variables available with the administrative data would be determined during the feasibility study / scoping phases of the evaluation, and the measurement of outcomes and indicators adjusted accordingly. Additionally, given that the Pathfinder operates throughout the city – and the only criterion to use the Pathfinder is to live in the city – a comparison group from administrative data would need to cover another entire city with highly comparable characteristics. This may be an unrealistic expectation given the complexities around measuring and attributing changes in child poverty or other indicators and the unique characteristics of Glasgow within Scotland.

Dundee Pathfinder comparison group

Due to the method of service delivery of the Dundee Pathfinder, it will be easier to gather data on non-participants and therefore to obtain a comparison group. As previously discussed, using a programme's waiting list is a common method of obtaining a comparison group, because those on the waiting list may closely resemble the participants and it is easier to obtain consent and information from this group. The Dundee Pathfinder identifies postcodes with high deprivation and then approaches potential participants in these areas. A comparison group could therefore be composed of families who were identified in these postcodes and decided not to participate, or alternatively of families in areas of high deprivation whose postcodes were not chosen.

This method, however, would be unable to account for the distinct characteristics of this area and why it was targeted for the Pathfinder over other postcodes, or for the characteristics of people who participate versus those who decline participation. Moreover, it is important to note that the Pathfinder is anticipated to expand to other postcodes, which may limit the extent to which it is possible to use them as a comparison group. Additionally, although the Dundee Pathfinder originally targeted the Linlathen area, an increasing number of people from outside this area have heard about the service and have received help from it. This creates an extra challenge for obtaining a suitable comparison group for the Dundee Pathfinder, as it may be harder to find an area with no exposure to the Pathfinder. Another key drawback is the sample size: given that the number of families receiving support from the Pathfinder as it is currently rolled out is relatively small – meaning the control group would not need to be large – a quasi-experiment may be unlikely to provide statistically significant results, although as noted above the Pathfinder is anticipated to expand. The options around sampling and the likelihood of obtaining statistically significant results could be scoped out as part of the feasibility study.

Costs of a quasi-experimental evaluation

We estimate that commissioning a quasi-experimental feasibility study would require a budget of £75,000, with the subsequent full roll-out costing £225,000. For the feasibility study, this is based on an estimated 90 days' work at an average of £825 per day according to current market rates (between £650 and £1,000) + VAT = £74,250 + VAT, rounded to £75,000 to allow some contingency. This would comprise 30 days for set-up and design, 30 days for data collection and obtaining a comparison group, 20 days to conduct the analysis, and 10 days for reporting, reviewing and presenting results. The full study would then require approximately three times this volume.
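The figures quoted above can be reproduced directly from the stated day allocations and day rate:

```python
# Feasibility-study costing as stated in the text: 90 days at an
# average of £825 per day, excluding VAT.
days = {
    "set-up and design": 30,
    "data collection and comparison group": 30,
    "analysis": 20,
    "reporting, review and presentation": 10,
}
day_rate = 825
subtotal = sum(days.values()) * day_rate   # 90 * 825 = £74,250 + VAT
budget = 75_000                            # rounded up to allow contingency
full_study = 3 * budget                    # full roll-out at ~3x the volume
```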

Overarching considerations before implementing quasi-experimental approaches

Several quasi-experimental methods and their relevance to the Pathfinders have been discussed. However, there are several challenges common across quasi-experimental approaches which need to be addressed. Primarily, to conduct a quasi-experiment, the outcomes being evaluated need to be clearly defined, tangible, and measurable quantitatively.[58] Thus, for evaluating the Pathfinders, the objective of reducing child poverty would need to be fully defined and measurable with quantitative data (e.g. household income). Additionally, several of the methods above rely on having defined intervention parameters. Therefore, the Pathfinder 'intervention' would also need a precise definition of who is included as a participant (especially in the case of Glasgow, which offers more light-touch support) and when the 'intervention' is complete. The point at which an exit interview is carried out may represent the end point of the intervention on an individual basis, but the support provided by the Pathfinders is designed to happen in multiple doses over time and to be adaptable to changing circumstances. As such, defining a clear end point is inherently difficult.

In addition, because of the lack of suitable data collected previously, a quasi-experimental approach may need to rely on administrative or national-level survey data. This data is not currently held by the Pathfinders, so separate applications to access it would need to be filed. However, until the initial phases of the evaluation have begun and a data sharing agreement is made, it is uncertain (a) what data will be needed to conduct the quasi-experimental approach, and (b) what data is actually available. As such, the evaluation would be limited to utilising only the variables collected in these datasets, and at present it is not known whether this will enable a robust evaluation to be conducted. A full review of what data is and is not available would form part of the feasibility study (or the early evaluation stages if a feasibility study is not adopted), after which point the measurement of outcome and indicator variables can be finalised.

A quasi-experimental approach also presents some ethical challenges around consent, data collection and usage. Obtaining the informed consent of everyone in a comparator group is likely to be challenging given that they are not engaged with a common service, and processing of personal data for research purposes would not be ethical without consent. Administrative data for both groups, however, could be compared provided that they were fully anonymised before sharing or linking, and that there is a clear ethical and data governance justification for this. Part of this agreement to share or link data should therefore include the key ethical considerations and justification for what data are accessed and how they are used.

Moreover, due to the data availability constraints previously mentioned and the need to have quantitative outcome measures, a quasi-experimental approach will only capture quantifiable outcomes on a small number of specified variables. A holistic service such as the Pathfinders, however, is expected to have a wider range of impacts on the participants, such as increased confidence or family wellbeing, which are not routinely measured in administrative data. Going forward, these measures could be added to the primary data collected by the Pathfinders. A mixed-methods assessment approach (discussed below) would be more flexible to incorporating these types of outcomes.

Last, the Pathfinders are complex, evolving, and loosely defined interventions. One of the fundamental principles behind the Pathfinders is that they provide bespoke, holistic support based on individuals' personal circumstances and needs. For example, the Dundee Pathfinder operates as a navigator model, which provides individual and holistic support. As a result, the Pathfinder intervention does not fit into a single box, nor should it be expected to achieve a single clear outcome for everyone. By contrast, quasi-experimental approaches rely on interventions with clear parameters which produce distinct, measurable outcomes. If a quasi-experimental approach were implemented and found no statistically significant impact of the Pathfinders on the selected outcomes, this may lead to a conclusion that they were not successful in achieving that outcome. However, such findings could overshadow potential successes in other key facets of the programme which cannot be captured or otherwise reflected in a quasi-experimental approach. Therefore, we consider there to be a tension between the underlying ethos of the Pathfinder programmes and that of quasi-experimental approaches.

Randomised controlled trials

If there was a need or desire to go beyond a quasi-experimental evaluation and conduct a full experiment on the Pathfinders, this could be done through a randomised controlled trial (RCT). Whereas quasi-experimental approaches identify a natural comparison group after the intervention has occurred, RCTs create a control group by randomly assigning people to either receive the treatment (in this case, Pathfinder support) or not. If feasible to implement, RCTs provide the highest standard of evidence of causal effects.

We are not aware of any RCTs on interventions in the UK that are closely similar to the Pathfinder programmes. However, there are some instances of RCTs being used to test child poverty interventions elsewhere. A recent study in Norway looked at the impact of a specific family intervention model compared to standard local family intervention practices.[59] This did not measure child poverty, but rather focussed on parental employment, financial situations, housing situations, and the social inclusion of children. This took place across 29 Labour and Welfare offices and analysed survey responses and administrative data on 862 parents over 12 months.

In Uganda, an RCT was used to calculate the effect on child poverty of savings incentives, mentorship and financial training on 1,383 orphaned children.[60] Like the Pathfinders, the interest of this programme was that it was a multi-faceted intervention, as opposed to a single treatment. The study noted that capturing child poverty directly was challenging, and so a representative measure of poverty was used which included health, assets, housing and behavioural risks. The impact of the intervention was measured after a period of four years.

There is also an RCT currently in operation to measure the impact on child poverty of the Healthier Wealthier Families programme in Sweden which provides financial advice to families who need it.[61] This will randomly assign parents to be immediately referred to local budget and debt counselling services, with the control group being referred after a delay of three months. It is expected to include 142 participants and to observe a two-year period with study completion an additional year thereafter. Again, this RCT will not measure absolute or relative poverty directly, but rather compares outcomes on child material and social deprivation, household income, and other self-rated outcomes such as mental health.

The level of existing research involving RCTs on child poverty interventions implies that an RCT could be used to evaluate the Pathfinders, but that a pilot trial would likely be required first, to test the various components and the extent to which a trial suits either Pathfinder. As outlined above in relation to a quasi-experimental feasibility study, the aim of a pilot trial would not be to provide results in relation to the Pathfinders' impacts, but rather to estimate the parameters that would be required in a full roll-out of a trial.

Should an RCT pilot study be pursued, the existing evidence outlined above indicates that the outcome(s) measured should not be direct child poverty measures (relative or absolute), but instead other key indicators such as household income, employment, housing circumstances, and children's school attainment / attendance. Similar to quasi-experimental methods, six months to a year would likely be required to complete a pilot trial, with a full RCT taking over two years to complete. As above, this would take the evaluation beyond the current timelines for Phase 2. The appropriate design of an RCT would likely differ between Glasgow and Dundee. We suggest that in Glasgow, randomisation is done on a time-delayed basis, whereby the treatment group receives the support and onward referrals from the phone service immediately, while the control group does not receive support right away, but does at a later point in time. For the Dundee model, it may be appropriate to use a cluster-randomised approach, which randomly allocates entire areas to be the treatment and control groups. A clustered approach would help to avoid complications around obtaining consent at an individual level to be randomly allocated to receiving or not receiving support. The optimal allocation approach would be finalised as part of the pilot phase.
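By way of illustration, a cluster-randomised allocation of the kind suggested for Dundee could look like the sketch below (the area names are invented, and a real trial would stratify clusters on deprivation and size before randomising):

```python
import random

def cluster_randomise(areas, seed=42):
    """Randomly split whole areas (clusters) into treatment and control
    arms; randomisation happens at area level, not individual level."""
    rng = random.Random(seed)  # fixed seed for a reproducible allocation
    shuffled = areas[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

# Hypothetical area names only.
areas = ["Area A", "Area B", "Area C", "Area D", "Area E", "Area F"]
treatment, control = cluster_randomise(areas)
```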

When deciding whether an RCT should be used to evaluate, the following advantages and disadvantages should be considered.

Strengths

RCTs provide the most robust results in terms of identifying the scale of impacts and attributing these to a cause.

RCTs would avoid many of the issues and the uncertainty associated with relying on data from administrative datasets in quasi-experimental methods, because data for both the treatment and control groups would be collected directly from the trial participants.

Limitations

RCTs frequently raise ethical issues. These concerns are particularly pronounced in the context of child poverty, where the difference between receiving the intervention and being in the control group could have life-changing effects on children and their families. There would be two ethical hurdles to overcome. The first is to provide justification for randomly allocating families to the Pathfinder. Because the Pathfinders are targeted at particularly marginalised groups, the impact of receiving support could be life-changing. Therefore, it may be challenging to justify whether this support can be allocated on a random basis. Second, consent from all participants in the trial would be required. Previous studies show that obtaining consent is possible, but there may be specific difficulties in doing so where potential participants do not perceive random allocation as fair or do not wish to be allocated to a control group.

RCTs require specific conditions to work properly, including a clearly defined intervention. As explained above under 'overarching considerations before implementing quasi-experimental approaches', the Pathfinder models are complex both in terms of what they do and what they aim to achieve. It may be that the Pathfinders (or one of them) cannot feasibly fit into a robust trial design. A pilot study would help to address or otherwise confirm this.

Because it reaches a relatively small number of families, the Dundee Pathfinder – by nature of its design – is somewhat limited to a small sample size for an RCT. A small sample size may limit the power of an RCT, and could mean that statistically significant results are only possible if a vast change in outcomes occurs. The extent to which this is a limiting factor could be explored further with a preliminary pilot trial before full roll-out.
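The sample-size concern can be made concrete with the standard normal-approximation formula for comparing two group means, n per arm ≈ 2(z₁₋α/₂ + z₁₋β)² / d², where d is the standardised effect size. A sketch assuming a two-sided 5% significance level and 80% power:

```python
import math

def n_per_group(effect_size, z_alpha=1.96, z_beta=0.84):
    """Approximate sample size per arm for a two-sided test of a
    difference in means (alpha = 0.05, power = 80% by default)."""
    return math.ceil(2 * (z_alpha + z_beta) ** 2 / effect_size ** 2)

small_effect = n_per_group(0.2)  # roughly 392 families per arm
large_effect = n_per_group(0.8)  # roughly 25 families per arm
```

This illustrates the point in the text: detecting anything but a large change in outcomes requires many more families per arm than the Pathfinder currently reaches.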

Conducting an RCT involves substantial financial resources and would likely require a larger budget than the quasi-experimental approaches described above.

On the basis of the above, our current recommendation is that an RCT would not be a suitable method of evaluating the Pathfinders. Notwithstanding this, if an evaluator were to pursue this approach, we recommend that they first conduct a pilot trial, and in doing so ensure that the disadvantages listed above can be overcome.

Impact assessment

The following section outlines additional options to evaluate the Pathfinders which do not use a counterfactual. These options have strengths and limitations which will be discussed below. The options include:

Option 1: To plan an impact assessment of the Pathfinders using theory-based evaluation in the second half of 2024/25. This is to enable enough time for impacts to accrue. This could run concurrently with any quasi-experimental trial commissioned. This would be particularly important to enable learning to be fed into the Third Tackling Child Poverty Delivery Plan.

Or

Option 2: To develop case studies to supplement a quasi-experimental trial in Phase 2.

and

Option 3: To provide support to the Pathfinders to improve their monitoring, evaluation and learning processes throughout 2023/24 and 2024/25 to ensure they are set up and delivered in the best possible way to maximise their impacts and the value of an evaluation.

As described in Best Start, Bright Futures the Pathfinders will trial different innovative approaches to support changes in the child poverty system. They will test, refine, and adapt these approaches as they learn how best to deliver holistic and person-centred support that meets the specific needs of families.

In this context, where the overarching aspiration of the Pathfinders is to contribute to whole system change in the long run, in the knowledge that the child poverty system is a complex system, this evaluability assessment finds there is value to conducting a longer-term theory-based impact assessment that enables Scottish Government and the Pathfinders to understand what has changed, why and how the changes have occurred.

This is also important to feed into learning for the continued implementation of the Pathfinder interventions beyond 2025 or for their future replication and scalability. We suggest Contribution Analysis as a feasible methodological approach (discussed below) to be used in an impact assessment. Conducting a theory-based impact assessment is not mutually exclusive with running a quasi-experimental trial (depending on the outcomes of the feasibility study); rather, it should supplement the trial with learning on what has or has not worked and why.

A second option suggested involves developing case studies to supplement a quasi-experimental trial to enable greater understanding of how the impacts may have occurred and to provide greater understanding of the dimensions of change in the finances, health and wellbeing of families involved in the Pathfinders.

Finally, in the context of the Pathfinders testing and exploring different approaches, we also recommend an evaluative approach that is embedded in delivery and enables efficient feedback loops, so that learning is captured and can support implementation and adaptation. We suggest a Learning Partner approach as a feasible methodological approach (discussed below).

Theory-based evaluation, a learning partner approach and / or case studies will provide useful understanding with regard to:

Family outcomes. A qualitative approach that involves engaging families that have been part of the Pathfinders to understand the range of outcomes for families as well as their overall experience and stories of change will be useful in understanding the mechanisms of change in the Pathfinder. This can be supplemented with quantitative assessment, relying on surveys or analysis of administrative data to understand pre and post effects.

Local context. Project level outcomes can be placed in a wider understanding of the context in which the Pathfinders are working. This includes the social, political and economic context within which local, regional and national partners are acting.

Understanding the experiences of priority family groups. These approaches can be used to gain a deeper understanding of the experiences of the people most at risk of poverty, who are often 'hard to reach' and 'seldom heard.'

Systems changes. A key aspect of the Pathfinders is to facilitate a process of systems change in their localities, where partners work better together and work to remove barriers. These approaches enable us to understand the drivers of change and the extent to which the Pathfinders' approach is replicable across other localities.

Option 1: Impact Assessment using Theory-Based Evaluation Approaches

We define impact assessment in this section of the report as the utilisation of non-experimental approaches to assess the extent to which the Pathfinders have contributed to outcomes or impacts and why and how they have been realised.

Strengths:

Theory-based impact assessment approaches are useful in allowing for greater understanding of how an intervention may be contributing to impacts in complex and dynamic settings where there may be numerous ways in which inputs and outputs interact and multiple pathways that can lead to the envisaged change.

They are also useful in contexts where direct attribution of an intervention to impacts can be difficult to ascertain due to the unpredictable ways in which outcomes are influenced by multiple internal and external factors.

They can also be used to assess impact at a programme level, as opposed to conducting an impact assessment for each Pathfinder.

They allow us to answer questions such as: "What was it about the intervention or the context that caused the results? Where the expected results were not observed, what was it about the intervention that didn't work? Was the underlying theory of the intervention wrong, or was the problem a case of poor implementation?"[62] These questions are useful for understanding whether the intervention can be replicated and how it can be improved.

They can help to focus data collection on the 'outcomes that matter', enabling an effective monitoring system to be set up.

Limitations:

These approaches do not enable us to understand how much difference is made by an intervention, that is, they cannot provide a quantitative measure of the amount of change that has occurred as a result of the intervention as compared to a non-treatment area (the amount by which absolute or relative child poverty has been reduced).

Attribution cannot be determined using these approaches. Assessment can only be made of the extent to which the Pathfinders may have contributed to the outcomes / impacts identified; for example, a family's wellbeing may be affected by numerous factors other than the support provided by the Pathfinder. Here, "the relevant evaluation question is: in light of the multiple factors influencing a result, has the intervention made a noticeable contribution to an observed outcome and in what way?"[63]

Impact assessments can be time intensive. However, data collection and analysis will be significantly aided if a good monitoring system is put in place.

Theory-based evaluation options

Box 1: What are Theories of Change?

Theories of Change are commonly used to understand the logic of social change interventions. They provide a description of the causal pathways that are expected to lead to desired outcomes, and are used to make explicit the many underlying assumptions about how change happens in a programme or project. Theories of Change can be flexible and can support innovation and improvement in programmes by allowing programme implementers to check, debate and test assumptions, making them well suited to understanding change in complex settings.

Developing a well-articulated and robust Theory of Change, co-designed with key stakeholders and underpinned by existing bodies of evidence, is a good starting point for assessing an intervention's contribution to impact, particularly in complex settings. (Vogel, 2012)

Theory-based evaluation approaches to impact assessment are non-experimental approaches that are concerned with understanding the 'theory' of an intervention. Theory-based approaches are method neutral. They are a 'conceptual analytical model'; "a way of structuring and undertaking analysis in evaluation".[64] As described above, theory-based approaches seek to understand the contribution of an intervention to observed outcomes and impacts through a detailed examination of the mechanisms/processes of change, assumptions and external factors, rather than by making comparisons with a counterfactual to determine causation.

There are broadly two different types that are relevant here: Realist Evaluation, and Contribution Analysis using ToCs.[65]

A third type of evaluative approach – Developmental evaluation – which involves embedded evaluation to improve the implementation of delivery (and its ability to respond and adapt to the context) is discussed in Option 3.

Option 1A Realist Evaluation

In realist evaluation, programmes are understood as theories: they embody hypotheses about how change will be brought about. Realist evaluation is guided by the belief that interventions only work under certain conditions and that their impacts will differ according to the stakeholders involved and the contexts within which they act, succinctly summed up as "mechanisms + context = outcomes".[66] In realist evaluation, findings demonstrate what worked, for whom, how and in what circumstances.[67] The focus in realist evaluation is thus rarely on accountability but on learning. "The extent to which a specific intervention has 'succeeded' or 'failed' … is of limited interest, given that it cannot be seen as providing reliable insights as to the outcome of future similar interventions."[68]

Strengths

  • Supports learning and understanding of how a specific intervention works through the testing of underlying theories
  • Can be used to inform experimental/semi-experimental approaches on the process of change
  • Method-neutral, supports a variety of analytical approaches[69]

Limitations

  • Is time consuming and resource intensive
  • Requires subject-matter expertise to undertake; and
  • It may not provide an average net effect of the intervention
  • It provides limited insights on the outcomes of future similar interventions[70]

Overall assessment:

Realist evaluation is not a suitable approach for assessing the impact of the Pathfinder programme or the Pathfinders themselves. A key objective of the evaluation is to determine not only what has worked in which contexts, but also whether the intervention has been successful and what lessons can be learned about its replicability and scalability.

Option 1B Contribution Analysis

Contribution analysis (CA), a complexity-informed approach, has been increasingly used for complex programme evaluations over the past decade. As the name suggests, contribution analysis does not claim outright "attribution" of impacts to interventions, but rather seeks to build a story on the available evidence that establishes how interventions may have influenced overall impacts. "Contribution Analysis is an approach to evaluation developed by Mayne (2001, 2008, and 2011) which aims to compare an intervention's postulated Theory of Change against the evidence, in order to come to robust conclusions about the contribution that it has made to observed outcomes." "Verifying the Theory of Change that the programme is based on, and paying attention to other factors that may influence the outcomes, provides reasonable evidence about the contribution being made by the programme." "Contribution analysis argues that if an evaluator can validate a ToC with empirical evidence and account for major external influencing factors, then it is reasonable to conclude that the intervention has made a difference… Causality is inferred from the following evidence:

  • The intervention is based on a reasoned ToC: the results chain and the underlying assumptions of why the intervention is expected to work are sound, plausible and agreed to by key players.
  • The activities of the intervention were implemented
  • The ToC is verified by evidence: The chain of expected results occurred, the assumptions held, and (final) outcomes were observed.
  • External factors (context) influencing the intervention were assessed and shown not to have made a significant contribution, or if they did, their relative contribution was recognized."

Strengths

  • Supports learning and understanding of how a specific intervention works through the testing of underlying theories
  • Method-neutral, supports a variety of analytical approaches
  • Can validate a ToC or support the adaptation of the ToC
  • It is a participatory approach, recognising the importance of broad stakeholder engagement to validation of evaluation measures[71]

Limitations

  • Attribution cannot be claimed, though causality can be inferred if other factors can be shown to have had minimal influence.
  • Relies on a robust, clearly articulated and well-evidenced ToC.
  • May not be suitable if there are significant changes to the ToC.

Overall assessment:

Contribution Analysis is a useful approach to assess how and why an intervention may have led to outcomes. Evaluators are able to examine available evidence to interrogate the programme's/ project's ToC and assumptions and use this to come to conclusions on whether the programme/ project is contributing to outcomes as outlined in the ToC.

Recommendation: Contribution Analysis is recommended as the approach for a formative or summative evaluation, undertaken either during or at the end of the Pathfinders' implementation period, to assess how the Pathfinders are contributing to observed outcomes and impacts.

Recommended: Option 1B: Plan for Implementation of Contribution Analysis Approach

Outlined below is a proposed approach for an impact assessment of the Pathfinders using contribution analysis towards the end of their implementation period, independently commissioned by the Scottish Government. This is in addition to any quasi-experimental trials that may be commissioned.

Aim

To assess the impact of the Pathfinders by examining to what extent, and how, the Pathfinders have been effective in providing holistic person-centred support that contributes to improved resilience, health and wellbeing, and incomes, and to reductions in the costs of living, as key drivers for the reduction of child poverty. This will provide learning on the processes that have led to observed changes and an understanding of how and whether they may be adapted and delivered elsewhere.

Box 2: The challenges of evaluating the impact of interventions seeking systems change.

Systems change, where systems are understood as "constructs used for engaging with and improving situations of real-world complexity," is a non-linear process (Reynolds and Holwell, 2010). It is characterised by "emergence (behaviours or other things that arise as a result of the interactions between parts of a complex system), co-evolution (parts of the system react and respond to one another's behaviour), and self-organisation (the tendency for systems to generate new structures and patterns based on internal dynamics)" (Reynolds and Holwell, 2010). Effecting change in a system requires a 'whole-systems perspective': working simultaneously at multiple levels, focusing not only on the individual units of adoption (organisations, sectors or personnel) but also on the interactions between the parts of the system.

Deborah Ghate notes that an important implication of working in this way, that is, dealing with systems change, is that attention needs to be placed "more clearly on causal pathways, and on leverage points for change that may exist at different levels" in implementation (Ghate, 2022). There are also implications for evaluation. A literature review by Morton (2019) exploring approaches best suited to the evaluation of complex public services outlines the following key implications of taking a complexity-informed understanding of system behaviour:

Interventions must be understood within their context, and contextual factors must be considered when assessing impact.

Programmes must make their assumptions explicit and spend some time framing issues to enable an evaluation to reflect the system.

Evaluation approaches may need to change to be more problem-orientated, collaborative and inclusive of multiple kinds of evidence.

Evaluation may best be embedded into an intervention and/or agency as a developmental process. When evaluation becomes a feedback loop in the system, it can be used to adjust and refine interventions as they develop. In this context, the role of the evaluator shifts from solely providing information to facilitating change and adaptive management.

Feasibility of conducting an evaluation of the Pathfinders using contribution analysis

As part of the commission, we have co-designed ToCs with assumptions and risks with the Pathfinders, and developed a detailed monitoring framework based on data audit exercises undertaken with them (see ToC and Monitoring report). The latter provides a framework to enable the collection of the relevant and necessary data to support a theory-based evaluation. However, the framework, though necessary, is not sufficient. The Pathfinders will also need to:

improve data collection tools and systems to collect the quantitative and qualitative data needed from staff and partners, as well as from the people and families who engage with the Pathfinder

improve their internal monitoring systems to store and analyse the data collected

develop a regular process to reflect on progress against systems change outcomes in the ToC and to report against the indicators in the monitoring framework to funders, partners and stakeholders.

Box 3: Contribution analysis process (Mayne, 2008)

Step 1: Set out the attribution problem to be assessed

Step 2: Develop a Theory of Change and risks to it

Step 3: Gather the existing evidence on the Theory of Change

Step 4: Assemble and assess the contribution story, and challenges to it

Step 5: Seek out additional evidence

Step 6: Revise and strengthen the contribution story

"[W]ithin contribution analysis, a plausible narrative is considered to have been developed when four different conditions are met (Mayne 2008).

1. The ...intervention is based on a sound Theory of Change, accompanied by agreed and plausible assumptions, that explains how the intervention sought to bring about any desired changes.

2. The activities of the … intervention were implemented properly.

3. There is adequate evidence showing that change occurred at each level of the Theory of Change.

4. The relative contribution of external factors or other development interventions can be dismissed or demonstrated." (INTRAC, 2017)

In addition, an independently commissioned external theory-based evaluation will not just rely on the data collected by the Pathfinders but will also collect additional evidence to assess the impacts on families beyond the timeframe of the Pathfinder. Overall, the evaluation will draw on the following forms of data:

Qualitative and quantitative data collated by the Pathfinders in their internal monitoring systems and reports

Evidence and data reported against the monitoring framework

Additional evidence, collected as part of the independently commissioned evaluation, for example, through independent interviews and surveys conducted by the evaluation team.

In sum, for an evaluation using contribution analysis to be done well, the following elements are needed:

A robust Theory of Change for the Pathfinders and/or the Pathfinder programme

  • A ToC has been co-produced with each of the Pathfinders as part of this evaluability assessment, and a ToC has also been developed for the programme. The finalised versions of the ToCs need to be shared and reflected upon with the Pathfinders; however, the current versions provide a good basis for undertaking a future evaluation using contribution analysis. It is recommended that these ToCs are revisited and updated prior to the commissioning of the impact assessment.
  • Robust data that has been collected by the Pathfinders, using appropriate monitoring frameworks that allow assessment against all levels of the ToC
    • The Pathfinders are currently collecting data in a variety of ways. The Glasgow Pathfinder has a monitoring system in place. The Dundee Pathfinder does not have a streamlined process for monitoring. As identified in the ToC and Monitoring report, the Pathfinders are not currently collecting all the data that is needed to make an assessment of progress against the ToCs. Suggestions for improvements to the Pathfinders' monitoring systems are made in the accompanying report. In summary, the Pathfinder data collection processes and systems will need to be improved and aligned with the monitoring framework.
  • Additional data and evidence is needed a) to triangulate evidence collected from the Pathfinders, and b) to gain insight into longer-term changes resulting from the Pathfinders/Programme and their sustainability
    • Additional evidence will need to be collected by the commissioned evaluation team. The expectation is that a mixed-methods approach will be used to collect the additional qualitative and quantitative data, for example, a review of documentary evidence from the Pathfinders, qualitative interviews with families, stakeholders, staff and partners, and a survey (building on the baseline and monitoring surveys developed by the Pathfinders as suggested in the ToC and Monitoring report). The evaluation team should gather evidence that can assess the persistence of effects (e.g. sustainability of employment) by interviewing or surveying families some time after their engagement with the Pathfinder has ended. Any proposal submitted should explain the methodology for undertaking a theory-based evaluation, for example, revising ToCs, identifying data gaps and undertaking interviews and surveys.

Recommended Approach

The recommended evaluation approach involves using contribution analysis principles to assess the progress of the Pathfinders against their ToCs and to test the assumptions. Contribution analysis, in this approach, involves developing evaluation questions specifically to test the assumptions in the ToCs and whether outcomes have been achieved. Evaluation questions can be developed against the Pathfinders' ToCs or against the Programme-level ToC. Below we focus on the Pathfinders, as the evaluation questions developed relate to the Pathfinders and a more extensive data audit exercise has taken place for them; a similar approach could, however, be developed at the Programme level.

Box 4: Mixed Methods

Theory-based evaluation approaches are method neutral. Contribution analysis, for example, can be used alongside an experimental design approach. "In short, while quantitative methods produce data that can be aggregated and analysed to describe and predict relationships, qualitative research can help to probe and explain those relationships and to explain contextual differences in the quality of those relationships." (Garbarino and Holland, 2009)

Quantitative methods

Quantitative data will be essential for determining the scale of the impacts: how many families have been supported, and how large a positive impact this has had. This includes data on the reach of the Pathfinders, financial circumstances (increases in income from employment or benefits), health and overall wellbeing. Crucially, to measure changes in these indicators, the same information collected at the start of a family's Pathfinder journey would need to be collected again at the end, providing a before-and-after comparison. For this reason, a survey is recommended in the ToC and Monitoring report. In the evaluation of the Welsh Government's Child Poverty Strategy, 23 indicators covering relative and in-work poverty, employment and worklessness, education, qualifications, housing, and health inequalities were used to measure the Strategy's impact. The evaluation looked at the change in these indicators, using 2005 as the baseline year and the most recent data available for each indicator in 2014 (Welsh Government, 2014). It determined whether these changes were statistically significant and compared the trends to those of the North of England, concluding that there was no evidence of the Welsh Strategy being more or less effective than the policies in the North of England.
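The before-and-after comparison described above can be sketched as a simple paired analysis of pre/post survey data. The sketch below uses a paired t statistic on entirely illustrative figures (hypothetical monthly family incomes, not Pathfinder data); an actual evaluation would use the indicators and survey instruments set out in the Monitoring Framework.

```python
import math
from statistics import mean, stdev

# Hypothetical monthly incomes (GBP) for the same families at the start
# and end of their Pathfinder journey -- illustrative numbers only.
before = [1200, 980, 1500, 1100, 1350, 1020, 1250, 1400]
after = [1350, 1100, 1520, 1250, 1400, 1150, 1300, 1550]

# Change for each family: the before/after comparison is paired, so we
# analyse the per-family differences rather than the two group means.
diffs = [a - b for a, b in zip(after, before)]
n = len(diffs)

# Paired t statistic: mean change divided by its standard error.
t = mean(diffs) / (stdev(diffs) / math.sqrt(n))

print(f"mean change: £{mean(diffs):.2f}, paired t = {t:.2f}")
```

A paired t statistic well above roughly 2 (for samples of this size) suggests the observed change is unlikely to be chance alone, though, as discussed above, it does not by itself attribute that change to the Pathfinder rather than to external factors.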

Qualitative methods

Qualitative methods will allow the evaluation to:

Understand why things happened the way they did, and what the drivers of change were.

Gather information on what worked well and what could be improved.

Take individuals' barriers and values into account.

Collect longitudinal data to provide ongoing learning at different points.

Incorporate the views of families and partner organisations.

Take a person-centred approach.

Qualitative data can also supplement the quantitative data discussed above, in order to capture process and outcome data. This is supported in the literature, which demonstrates that qualitative data is helpful for gaining a deeper understanding of impacts and outcomes, as administrative data used in large-scale evaluations can often be hard to disaggregate to the local level or to use to understand the impact on small priority groups (SG, TCPDP, Annex 2, 2022-2026). Qualitative feedback from families would also help to evidence the level of attribution that can be assigned to the Pathfinder, as it can indicate whether a positive change was a result of this service or of some other factor. Qualitative interviews have been used successfully in evaluations of similar programmes, such as the Local Authority Child Poverty Innovation Pilot (GHK, 2010) and the Welsh Government evaluation of the Housing Act (2014), which addressed homelessness. Both relied extensively on interviews with service users, service delivery staff, stakeholders, and partners. This method was noted as vital in cutting through the complexities of the issues at hand, providing comparative perspectives, and incorporating the views of people with lived experience of poverty / homelessness. The former of these two evaluations also used the findings from the qualitative fieldwork to inform the quantitative aspects of the research, including a Cost-Effectiveness Analysis.

Evaluation Questions

The following evaluation questions were co-produced with the Scottish Government (see the evaluation plan section). Using a contribution analysis approach, they will be used to examine and interrogate the links and outcomes in the ToCs of both the Dundee and Glasgow Pathfinders. We have included the data sources for each Pathfinder separately in the table below.

The ToC and Monitoring report describes in more detail what data is currently available in the Pathfinders, its suitability for the evaluation, and what needs to be done to improve data collection and monitoring to support an effective evaluation. In the table below, data is highlighted where it is available though the quality may be variable (see Monitoring Report for more detail). For example, case notes in the Dundee Pathfinder are available, though baseline information is not collected against the indicators in the Monitoring Framework in a systematic way for every family, which prevents effective comparison and analysis; childcare costs, for instance, are not calculated for every family at the time of engagement with the Pathfinder. Where data is accessible but not yet being used, or the data is not yet available (e.g. the pre/post survey), this is also noted in the table. The Monitoring Framework describes in more detail how data from these sources should be used. Where data is currently not accessible, and further investigation will need to occur to determine whether it is a viable data source, this is highlighted.

It should be noted here that if the Pathfinders improve their monitoring systems, most of the relevant data to address these questions will have been collected against the Monitoring Framework developed. However, some process questions related to the shorter-term outcomes in the ToC (e.g. the levels of 'How they feel' and 'What they learn and gain') will not have regular data collected against the Monitoring Framework. This is because, as explained in the Monitoring report, the Monitoring Framework was designed: a) to be a practical tool that could realistically be implemented by the Pathfinders; b) to focus primarily on the indicators that are most easily measured; and c) to focus on the longer-term outcomes and impact of the Pathfinder.

Consequently, the successful evaluation team will need to review, in addition to the Monitoring Framework data, further data, including qualitative documentation held by the Pathfinders in their monitoring systems, as well as undertake additional data collection to assess the persistence of outcomes and longer-term impacts. This data collection may include interviews with families, staff and partners at the local and national level involved in the Pathfinder, as well as potentially a survey. The survey will be most valuable if baseline data is available.

For this reason, it is a recommendation of this evaluability assessment that the Pathfinders are supported to develop monitoring systems that effectively collect a baseline and that are designed closely with the evaluation in mind. That is, the systems should a) be set up to collect baseline data, and b) continue to collect data on families after their direct involvement in the Pathfinder has ended (see ToC and Monitoring report).

Evaluation Questions against the levels in the Theories of Change

Level in the Theory of Change: What difference does it make

Evaluation Questions

What was the impact on the finances, employment, resilience, health, and wellbeing of families?

Where will data come from in the Dundee Pathfinder?

Data available though quality may be variable

  • Dundee Pathfinder Internal Monitoring System[72] extracting data from case notes
  • Pathfinder programme documentation (highlight reports etc), feedback from families
  • Income maximisation data held by DCC and SSS[73]

Data not currently accessible, and further investigation will need to occur to determine whether it is a viable data source

  • Data extracted from universal credit accounts held by DCC
  • Benefits data held by DWP
  • HMRC RTI Data from DWP
  • Customer Information System data from DWP
  • SSS benefits data

Data is accessible but not yet being used or the data is not available yet

  • Pre-Post survey of families
  • Housing Benefit and Council Tax Reduction Data
  • Common Housing Register
  • Discretionary Housing Payments Datasets
  • Scottish Welfare Fund Datasets
  • Additional data collection by evaluators from Interviews and Survey
  • Data reported against the Monitoring Framework

Where will data come from in the Glasgow Pathfinder?

Data available though quality may be variable

  • Glasgow Helps Internal Monitoring System[74] extracting data from Holistic Needs Assessments
  • Exit Interviews
  • Financial Gain estimates[75]
  • Pathfinder programme documentation (highlight reports etc), feedback from families

Data not currently accessible, and further investigation will need to occur to determine whether it is a viable data source

  • Data extracted from universal credit accounts held by DCC
  • Benefits data held by DWP
  • HMRC RTI Data from DWP
  • Customer Information System data from DWP
  • SSS benefits data

Data is accessible but not yet being used or the data is not available yet

  • Pre-Post survey of families
  • Housing Benefit and Council Tax Reduction Data
  • Discretionary Housing Payments Datasets
  • Scottish Welfare Fund Datasets
  • Additional data collection by evaluators from Interviews and Survey
  • Data reported against the Monitoring Framework

Level in the Theory of Change: What they do differently

Evaluation Questions

To what extent is the right support available for families – are there any gaps in resource, partners or services leading to unmet need?

Where will data come from in the Dundee Pathfinder?

Data available though quality may be variable

  • Dundee Pathfinder Internal Monitoring System extracting data from case notes
  • Pathfinder programme documentation (highlight reports etc)
  • Feedback from families
  • Income maximisation data held by DCC and SSS

Data is accessible but not yet being used or the data is not available yet

  • Additional data collection by evaluators from Interviews and Survey

Where will data come from in the Glasgow Pathfinder?

Data available though quality may be variable

  • Glasgow Helps Internal Monitoring System extracting data from Holistic Needs Assessments
  • Exit Interviews
  • Financial Gain estimates
  • Pathfinder programme documentation (highlight reports etc), feedback from families

Data is accessible but not yet being used or the data is not available yet

  • Additional data collection by evaluators from Interviews and Survey

Evaluation Questions

To what extent are families accessing support before crisis point – prevention? (Glasgow)

Where will data come from in the Dundee Pathfinder?

Data available though quality may be variable

  • Dundee Pathfinder Internal Monitoring System extracting data from case notes
  • Pathfinder programme documentation (highlight reports etc)
  • Feedback from families
  • Income maximisation data held by DCC and SSS

Data is accessible but not yet being used or the data is not available yet

  • Additional data collection by evaluators from Interviews and Survey
  • Data reported against the Monitoring Framework

Where will data come from in the Glasgow Pathfinder?

Data available though quality may be variable

  • Glasgow Helps Internal Monitoring System extracting data from Holistic Needs Assessments
  • Exit Interviews
  • Pathfinder programme documentation (highlight reports etc), feedback from families

Data is accessible but not yet being used or the data is not available yet

  • Additional data collection by evaluators from Interviews and Survey
  • Data reported against the Monitoring Framework

Evaluation Questions

To what extent are families obtaining and sustaining high quality and fair employment?

Where will data come from in the Dundee Pathfinder?

Data is accessible but not yet being used or the data is not available yet

  • Additional data collection by evaluators from Interviews and Survey
  • Data reported against the Monitoring Framework

Where will data come from in the Glasgow Pathfinder?

Data is accessible but not yet being used or the data is not available yet

  • Additional data collection by evaluators from Interviews and Survey
  • Data reported against the Monitoring Framework

Evaluation Questions

To what extent are agencies working in partnership beyond organisational boundaries?

Where will data come from in the Dundee Pathfinder?

Data available though quality may be variable

  • Dundee Pathfinder Internal Monitoring System extracting data from case notes
  • Pathfinder programme documentation (highlight reports etc)
  • Feedback from families
  • Partnership Data

Data is accessible but not yet being used or the data is not available yet

  • Additional data collection by evaluators from Interviews and Survey
  • Data reported against the Monitoring Framework

Where will data come from in the Glasgow Pathfinder?

Data available though quality may be variable

  • Glasgow Helps Internal Monitoring System extracting data from Holistic Needs Assessments
  • Exit Interviews, Pathfinder programme documentation (highlight reports etc)
  • Feedback from families
  • Partnership Data

Data is accessible but not yet being used or the data is not available yet

  • Additional data collection by evaluators from Interviews and Survey
  • Data reported against the Monitoring Framework

Evaluation Questions

To what extent are employers providing more fair, flexible work locally that is more accessible for families in poverty?

Where will data come from in the Dundee Pathfinder?

Data available though quality may be variable

  • Dundee Pathfinder Internal Monitoring System extracting data from case notes
  • Pathfinder programme documentation (highlight reports etc), feedback from families
  • Employer Portfolio Spreadsheet
  • Childcare Providers Data

Data not currently accessible, and further investigation will need to occur to determine whether it is a viable data source

  • Work Coach History Notes

Data is accessible but not yet being used or the data is not available yet

  • Additional data collection by evaluators from Interviews and Survey

Where will data come from in the Glasgow Pathfinder?

Data available though quality may be variable

  • Glasgow Helps Internal Monitoring System extracting data from
  • Holistic Needs Assessments
  • Exit Interviews
  • Pathfinder programme documentation (highlight reports etc), feedback from families

Data not currently accessible, and further investigation will need to occur to determine whether it is a viable data source

  • Work Coach History Notes

Data is accessible but not yet being used or the data is not available yet

  • Additional data collection by evaluators from Interviews and Survey

Evaluation Questions

To what extent is the required data, learning and insight shared between partners?

Where will data come from in the Dundee Pathfinder?

Data available though quality may be variable

  • Dundee Pathfinder Internal Monitoring System extracting data from case notes
  • Pathfinder programme documentation (highlight reports etc)
  • Feedback from families

Data is accessible but not yet being used or the data is not available yet

  • Additional data collection by evaluators from Interviews and Survey
  • Data reported against the Monitoring Framework

Where will data come from in the Glasgow Pathfinder?

Data available though quality may be variable

  • Glasgow Helps Internal Monitoring System extracting data from Holistic Needs Assessments
  • Exit Interviews
  • Pathfinder programme documentation (highlight reports etc), feedback from families

Data is accessible but not yet being used or the data is not available yet

  • Additional data collection by evaluators from Interviews and Survey
  • Data reported against the Monitoring Framework

Evaluation Questions

To what extent are public and third sector organisations working successfully together in partnership?

Where will data come from in the Dundee Pathfinder?

Data available though quality may be variable

  • Dundee Pathfinder Internal Monitoring System extracting data from case notes
  • Pathfinder programme documentation (highlight reports etc), feedback from families
  • Partnership Data

Data is accessible but not yet being used or the data is not available yet

  • Additional data collection by evaluators from Interviews and Survey
  • Data reported against the Monitoring Framework

Where will data come from in the Glasgow Pathfinder?

Data available though quality may be variable

  • Glasgow Helps Internal Monitoring System extracting data from Holistic Needs Assessments
  • Exit Interviews
  • Pathfinder programme documentation (highlight reports etc), feedback from families
  • Partnership Data

Data is accessible but not yet being used or the data is not available yet

  • Additional data collection by evaluators from Interviews and Survey
  • Data reported against the Monitoring Framework

Evaluation Questions

To what extent are resources between partners pooled successfully where needed?

Where will data come from in the Dundee Pathfinder?

Data available though quality may be variable

  • Dundee Pathfinder Internal Monitoring System extracting data from case notes
  • Pathfinder programme documentation (highlight reports etc)
  • Feedback from families
  • Partnership Data

Data is accessible but not yet being used or the data is not available yet

  • Additional data collection by evaluators from Interviews and Survey
  • Data reported against the Monitoring Framework

Where will data come from in the Glasgow Pathfinder?

Data available though quality may be variable

  • Glasgow Helps Internal Monitoring System extracting data from Holistic Needs Assessments
  • Exit Interviews
  • Pathfinder programme documentation (highlight reports etc)
  • Feedback from families
  • Partnership Data

Data is accessible but not yet being used or the data is not available yet

  • Additional data collection by evaluators from Interviews and Survey
  • Data reported against the Monitoring Framework

Evaluation Questions

To what extent has the Pathfinder helped resolve barriers at the local and national level?

Where will data come from in the Dundee Pathfinder?

Data available though quality may be variable

  • Dundee Pathfinder Internal Monitoring System extracting data from case notes
  • Pathfinder programme documentation (highlight reports etc)
  • Feedback from families
  • Partnership Data

Data is accessible but not yet being used or the data is not available yet

  • Additional data collection by evaluators from Interviews and Survey
  • Data reported against the Monitoring Framework

Where will data come from in the Glasgow Pathfinder?

Data available though quality may be variable

  • Glasgow Helps Internal Monitoring System extracting data from Holistic Needs Assessments
  • Exit Interviews
  • Pathfinder programme documentation (highlight reports etc)
  • Feedback from families
  • Partnership Data

Data is accessible but not yet being used or the data is not available yet

  • Additional data collection by evaluators from Interviews and Survey
  • Data reported against the Monitoring Framework

Level in the Theory of Change: What they learn and gain

Evaluation Questions

To what extent is support co-produced with families?

Where will data come from in the Dundee Pathfinder?

Data available though quality may be variable

  • Dundee Pathfinder Internal Monitoring System extracting data from case notes
  • Pathfinder programme documentation (highlight reports etc)
  • Feedback from families

Data is accessible but not yet being used or the data is not available yet

  • Additional data collection by evaluators from Interviews and Survey

Where will data come from in the Glasgow Pathfinder?

Data available though quality may be variable

  • Glasgow Helps Internal Monitoring System extracting data from Holistic Needs Assessments
  • Exit Interviews
  • Pathfinder programme documentation (highlight reports etc)
  • Feedback from families

Data is accessible but not yet being used or the data is not available yet

  • Additional data collection by evaluators from Interviews and Survey

Evaluation Questions

To what extent is the support provided to families more flexible, holistic, targeted to need, and accessible?

Where will data come from in the Dundee Pathfinder?

Data available though quality may be variable

  • Dundee Pathfinder Internal Monitoring System extracting data from case notes
  • Pathfinder programme documentation (highlight reports etc)
  • Feedback from families

Data is accessible but not yet being used or the data is not available yet

  • Additional data collection by evaluators from Interviews and Survey

Where will data come from in the Glasgow Pathfinder?

Data available though quality may be variable

  • Glasgow Helps Internal Monitoring System extracting data from Holistic Needs Assessments
  • Exit Interviews
  • Pathfinder programme documentation (highlight reports etc)
  • Feedback from families

Data is accessible but not yet being used or the data is not available yet

  • Additional data collection by evaluators from Interviews and Survey

Evaluation Questions

What was the impact on the families' confidence to manage future challenges?

Where will data come from in the Dundee Pathfinder?

Data available though quality may be variable

  • Dundee Pathfinder Internal Monitoring System extracting data from case notes
  • Pathfinder programme documentation (highlight reports etc)
  • Feedback from families

Data is accessible but not yet being used or the data is not available yet

  • Additional data collection by evaluators from Interviews and Survey

Where will data come from in the Glasgow Pathfinder?

Data available though quality may be variable

  • Glasgow Helps Internal Monitoring System extracting data from Holistic Needs Assessments
  • Exit Interviews
  • Pathfinder programme documentation (highlight reports etc)
  • Feedback from families

Data is accessible but not yet being used or the data is not available yet

  • Additional data collection by evaluators from Interviews and Survey

Level in the Theory of Change: Who with/How they feel

Evaluation Questions

Who received support from the Pathfinder – scale and demographics?

Where will data come from in the Dundee Pathfinder?

Data available though quality may be variable

  • Dundee Pathfinder Internal Monitoring System extracting data from case notes
  • Pathfinder programme documentation (highlight reports etc)
  • Feedback from families

Data is accessible but not yet being used or the data is not available yet

  • Data reported against the Monitoring Framework

Where will data come from in the Glasgow Pathfinder?

Data available though quality may be variable

  • Glasgow Helps Internal Monitoring System extracting data from Holistic Needs Assessments
  • Exit Interviews
  • Pathfinder programme documentation (highlight reports etc)
  • Feedback from families

Data is accessible but not yet being used or the data is not available yet

  • Data reported against the Monitoring Framework

Evaluation Questions

To what extent were families in need successfully identified, engaged and supported – are there areas of unmet need?

Where will data come from in the Dundee Pathfinder?

Data available though quality may be variable

  • Dundee Pathfinder Internal Monitoring System extracting data from case notes
  • Pathfinder programme documentation (highlight reports etc)
  • Feedback from families

Data is accessible but not yet being used or the data is not available yet

  • Additional data collection by evaluators from Interviews and Survey

Where will data come from in the Glasgow Pathfinder?

Data available though quality may be variable

  • Glasgow Helps Internal Monitoring System extracting data from Holistic Needs Assessments
  • Exit Interviews
  • Pathfinder programme documentation (highlight reports etc)
  • Feedback from families

Data is accessible but not yet being used or the data is not available yet

  • Additional data collection by evaluators from Interviews and Survey

Evaluation Questions

To what extent is the Pathfinder sharing learning and influencing stakeholders to work differently?

Where will data come from in the Dundee Pathfinder?

Data available though quality may be variable

  • Dundee Pathfinder Internal Monitoring System extracting data from Pathfinder programme documentation (highlight reports etc)
  • Partnership Data

Data is accessible but not yet being used or the data is not available yet

  • Additional data collection by evaluators from Interviews and Survey
  • Data reported against the Monitoring Framework

Where will data come from in the Glasgow Pathfinder?

Data available though quality may be variable

  • Glasgow Helps Internal Monitoring System extracting data from Pathfinder programme documentation (highlight reports etc)
  • Partnership Data

Data is accessible but not yet being used or the data is not available yet

  • Additional data collection by evaluators from Interviews and Survey
  • Data reported against the Monitoring Framework

Recommended Option 1B: Timelines and Costs

Timeline: Towards the end of 2024/25. This allows sufficient time for impacts to emerge: for people to enter and sustain employment, to gain benefits from reductions to the cost of living, and for evidence of systems change to be identified.

Costs: It is anticipated that an impact evaluation would take place over six months, at an estimated 80 days' work at an average of £825/day according to current market rates (£650-£1,000) + VAT = £66,000 + VAT. This comprises 15 days' planning, desk research and creation of data instruments, 40 days' fieldwork, 15 days' data analysis and 10 days' report preparation. The team should include senior-level expertise in contribution analysis for evaluation. The budget for the evaluation should also include costs for workshops with the Pathfinder core teams to revisit and revise the ToCs and reassess the data audit process. The overall budget is estimated at around £90,000 including VAT.
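The cost arithmetic above can be sanity-checked with a short sketch. The figures are taken from the text; the 20% UK VAT rate is an assumption, as the source does not state the rate used.

```python
# Sanity check of the Option 1B cost estimate (figures from the text;
# the 20% VAT rate is an assumption, not stated in the source).
day_rate = 825  # average of the quoted £650-£1,000 market range

days = {
    "planning, desk research, data instruments": 15,
    "fieldwork": 40,
    "data analysis": 15,
    "report preparation": 10,
}

total_days = sum(days.values())         # 80 days
fee_ex_vat = total_days * day_rate      # £66,000 + VAT, as quoted
fee_inc_vat = round(fee_ex_vat * 1.20)  # £79,200 at an assumed 20% VAT

print(total_days, fee_ex_vat, fee_inc_vat)
```

On these assumptions the fee comes to £79,200 including VAT; the gap to the overall estimate of around £90,000 is consistent with the additional allowance for the ToC workshops and data audit work.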

Option 2: Case Studies

A second option for assessing the impact of the Pathfinders involves developing case studies to supplement any quasi-experimental trial commissioned. The objective of the case studies is to provide a deeper understanding of the Pathfinders, how they work, and what processes may or may not have led to successes. Ultimately, the aim of the case studies is to identify what has worked, for whom and why and to provide learning to feed into future approaches for tackling child poverty.

Strengths

  • Provides rich and detailed qualitative understanding of how the Pathfinders have worked, from a process perspective
  • May be valuable in understanding the extent to which systems change has occurred and how
  • Relies on the analysis of multiple sources of evidence, supporting a deep and robust understanding of change
  • Provides valuable stories of change, context and the lived reality of the families involved in the Pathfinders
  • Can support learning to feed into future approaches to tackling child poverty, including the third Tackling Child Poverty Delivery Plan
  • Can enable us to test and validate the ToC
  • Can be used at any point in the timeline of the Pathfinders to support understanding (for example, of process or outcomes).

Limitations:

  • Case studies can present issues around external validity, or generalisability. Findings may not be generalisable as the case study focuses on the specifics of one case.
  • Case studies carry the risk of bias from the subjective interpretation of the evaluator/researcher.
  • While a deep understanding of the experience of families can be gained, without a control group it is difficult to determine robustly whether the changes would have occurred had the Pathfinder not existed.

Options for Case Studies

We outline two options for implementing a case study approach below.

Option 2A Qualitative Comparative Analysis

Qualitative Comparative Analysis (QCA) is a research methodology increasingly applied in monitoring and evaluation. It is a case-study-based approach used to analyse and compare multiple cases in complex settings. Through QCA, patterns can be identified across multiple cases to understand what makes change happen in some cases and not in others. QCA depends on the projects (or 'cases') having a ToC. This is important because the ToCs enable the evaluators to a) understand the change the project is seeking to bring about and b) identify the key factors needed to bring about that change. In QCA, the factors are scored in each case study. Criteria are developed by the evaluator to assess whether a factor is considered present or absent in the case being examined. This can be done in a binary way, scoring 0 (factor is absent) or 1 (factor is present), or on a graded scale (for example 0, 0.33, 0.66 or 1). Once the factors are scored in each case, patterns can be identified, i.e. change happens in circumstances where factors X, Y and Z are present. Where comparison occurs across many cases, computer software is often used to support the scoring and analysis.[76]
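To illustrate the scoring logic described above, a minimal crisp-set (binary) version can be sketched in code. This is purely illustrative: the cases, factor names and scores below are hypothetical and are not drawn from the Pathfinders.

```python
# Illustrative crisp-set QCA-style scoring sketch.
# The cases, factor names and scores below are hypothetical examples only.

# Each case scores every factor 1 (present) or 0 (absent), using
# criteria the evaluator defines, plus the observed outcome.
cases = {
    "case_a": {"co_location": 1, "data_sharing": 1, "key_worker": 1, "outcome": 1},
    "case_b": {"co_location": 1, "data_sharing": 0, "key_worker": 1, "outcome": 1},
    "case_c": {"co_location": 0, "data_sharing": 1, "key_worker": 0, "outcome": 0},
    "case_d": {"co_location": 1, "data_sharing": 0, "key_worker": 0, "outcome": 0},
}

def consistent_factors(cases):
    """Return factors whose presence/absence matches the outcome in
    every case -- candidate conditions to examine more closely."""
    factors = [f for f in next(iter(cases.values())) if f != "outcome"]
    return [f for f in factors
            if all(c[f] == c["outcome"] for c in cases.values())]

print(consistent_factors(cases))  # -> ['key_worker']
```

In a real application the scoring criteria, the number of cases and combinations of conditions (rather than single factors) would all need careful design; with only two Pathfinder cases this kind of cross-case pattern analysis is not yet feasible.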

Strengths

  • QCA can help address why change is occurring in some cases and not in others
  • It can support learning in change processes, supporting understanding of what works and what doesn't and what may be replicated.
  • It can help validate a ToC.

Limitations

  • QCA generally requires a minimum number of cases, around 10, to be most useful in drawing out patterns.
  • It involves comparison between the cases implemented by the programme, i.e. between the Pathfinders, rather than comparison with a control group.
  • Information must be available on all factors to determine to what extent the factor is present or absent.
  • Setting up criteria to determine to what extent a factor can be considered present or absent can be complex, and unless done rigorously may become too subjective.
  • It is difficult to judge when it is best to do QCA. If done too early it may produce misleading results.[77]

Overall assessment:

While QCA is an interesting methodology and, in principle, appropriate for the context of the Pathfinder programme, the programme is not at a stage where implementing this approach would be useful: there are only two cases (Dundee and Glasgow). If there were more Pathfinders, with more time to develop and produce outcomes, this could be a useful approach to implement.

Option 2B Single Case Study Design

This involves researching a case as a 'singular interpretable entity', focusing on providing an in-depth, empirically rich, explorative, descriptive and analytic understanding of that case. Data is examined to determine how participants may have been affected by the project or intervention, drawing on context and evidence from before, during and after the intervention. Case studies will still require analysis of multiple sources and within-case triangulation of sources and methods.

Strengths

  • They can provide in-depth, empirically rich and holistic understanding of a project and the underlying processes of change.
  • Can support learning to support experimentation, adaptation and improvement.
  • Can be used to obtain the direct experiences and the stories of change from the people involved in the Pathfinder

Limitations

  • Causality cannot be inferred.
  • Case studies can present issues around external validity, or generalisability. They focus on the details of one particular case, and there is less firm evidence on whether the findings from that case are generalisable.

Overall assessment:

Case studies can be very valuable in giving us an in-depth understanding of a particular project and its processes of change, and in providing stories of change from the individuals involved. However, they lack the rigour of a comparative case study approach or a randomised controlled trial.

Recommendation: A case study approach can be combined with a quasi-experimental trial if commissioned to provide the level of detail and understanding of the Pathfinders required to extract learning to feed into the next Tackling Child Poverty Plan.

Recommended Option 2B: Plan for implementation of a Single Case Study Approach

We outline below a proposed approach for implementing a case study approach to supplement any quasi-experimental trial commissioned.

Aim

To better understand how the Pathfinders are working, what has changed for families and what factors have made a difference.

Feasibility of conducting a case study

A case study relies primarily on qualitative data gathered by the evaluator/researcher during the implementation period of the Pathfinders. The main factor that will determine the success of the case study approach, then, is the degree to which the evaluators can interview families and staff who have been involved in the Pathfinder. The longer the period since families last accessed the Pathfinder's services, the more difficult it will be to engage them in the case study.

Recommended approach

If a case study approach is adopted, it is recommended that a single case study design is used, with a case study developed for each Pathfinder, allowing a deep dive into the context and experience of each Pathfinder and scope for cross-comparison (for example, identification of common themes and patterns). Though there are limits to the external validity of these findings, the case studies will provide a potentially useful starting point for, and learning for, other Pathfinders, particularly with respect to the strategies and approaches adopted to influence systems change. They will also enable the evaluators to capture the lived reality of families participating in the Pathfinders.

For each Pathfinder, the case studies will involve qualitative interviews with a sample of families, project staff and national or local partners. It will also involve site visits to the Pathfinders to observe processes and activities and to conduct some of the interviews with project staff face-to-face.

Recommended Option 2B: Timelines and Costs

Timeline: Case studies with a focus on impact should be planned towards the end of the implementation period of the Pathfinders to allow time for impacts to accrue. Case studies with a focus on learning can be developed earlier in the timeline of the Pathfinders. It is anticipated the case studies will be developed over a four-month period, with the site visits occurring within the first two months.

Costs: It is estimated at 15 days' preparation and fieldwork and 10 days' analysis and write-up per Pathfinder, a total of 50 days' work at an average of £825/day + VAT = £41,250 + VAT. Subsistence and travel for the site visits will also need to be costed into the budget.
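The day totals quoted above can be checked with a short sketch (all figures taken from the text):

```python
# Check of the Option 2B (case study) cost estimate (figures from the text).
pathfinders = 2       # Dundee and Glasgow
days_each = 15 + 10   # preparation/fieldwork + analysis/write-up
day_rate = 825        # average of the £650-£1,000 market range

total_days = pathfinders * days_each  # 50 days
fee_ex_vat = total_days * day_rate    # £41,250 + VAT, as quoted

print(total_days, fee_ex_vat)
```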

Option 3: Embedding ongoing monitoring, evaluation and learning by the Pathfinders

This option involves providing added support to the Pathfinders to enable them to set up monitoring, evaluation and learning processes that will support an effective evaluation (quasi or theory-based). It will also enable the Pathfinders to maximise their impact on a local level by responding and adapting implementation as they learn, as well as enabling learning to influence policy and practice at a national level. Learning is also needed to support understanding of what works and how the approaches implemented by the Pathfinders might be replicated or scaled.

Strengths:

  • Improving monitoring processes ensures data is collected effectively and is relevant to the indicators being tracked in the Monitoring Framework. These systems will support evaluation by capturing baseline information.
  • Self-evaluation embedded in programme delivery or 'reflection' occurring at regular intervals ensures learning is continuously fed back into implementation, allowing a programme to improve, adapt and respond, a critical aspect of the Pathfinders.
  • The process of regular reflection and learning is critical for achieving systems change, as it can enable implementers to gain a better understanding of leverage points for influence. For example, regular reflection can identify key stakeholders that need to be engaged, key policies that can be influenced, and other entry points for change. The focus would be on understanding the mechanisms of change, regularly reflecting on progress against outcomes, and identifying key leverage points for influence and action.

Limitations:

  • Robust monitoring systems need to be in place, aligned with monitoring frameworks and ToCs. This may require evaluation expertise in the early phase to assist the Pathfinders to set up effective data collection and monitoring processes to ensure that Pathfinders will have baseline information and will be able to track the longer term impacts on families after their engagement with the Pathfinder.
  • Self-evaluation and learning processes embedded in delivery are not a substitute for the planned impact evaluation. They would be an add-on to support the Pathfinders to be ready for evaluation and to enhance their ability to have impact.
  • Self-evaluation and learning can be time and data intensive for individuals delivering a project. Additional resources in the form of a learning partner may be needed to ensure the best use is made of data and feedback in a timely way.

Options for ensuring learning is embedded in programme delivery

There are a number of different options to ensure learning is embedded in project delivery. These could include informal processes developed by the Pathfinder to regularly extract data and learning from its monitoring system and to create protected time to regularly reflect on progress with staff and stakeholders. Two other more formal approaches are outlined below.

Option 3A Developmental Evaluation

As described by Patton (2011), developmental evaluation occurs when evaluation becomes embedded in programme/project development and implementation. It is the ongoing process of "facilitat[ing] systematic data-based reflection and decision making in the developmental process" and is well suited to "guide adaptation of projects and programmes to emergent and dynamic realities in complex environments." Within developmental approaches, ongoing learning can be embedded in the evaluation, and performance indicators adapted as the programme develops. Developmental evaluation relies on an evaluator, or someone with evaluation experience, being embedded in the delivery team to reflect on data on an ongoing basis, as opposed to an external evaluator conducting an evaluation at a specific point in time, as with formative or summative evaluation.

"CA [Contribution Analysis] is a useful framework for ongoing learning and development, given its flexibility in being used for strategic planning, ongoing monitoring by managers, in addition to formative, summative and developmental evaluations (Wimbush et al 2012)." This developmental approach to evaluation allows timely adjustments to be made in order to ensure improved results: "The process of planning, evaluating and acting makes this evaluation approach more dynamic and able to accommodate some of the complexity of interactions, allowing for the creation of feedback loops within the system, creating a more likely chance of successful outcomes. Similar adjustments can be made if external factors change or have unanticipated consequences" (Morton et al 2019, p.11).

Strengths

  • Supports learning and understanding of how a specific intervention works through the continuous process of reflection on data, supporting refinement and adaptation of the ToC
  • Enables an ongoing process of improvement in programme implementation and evaluation
  • Allows an intervention to be adaptive and responsive
  • Well-suited to supporting systems change in complex settings[78]

Limitations

  • Can be data intensive
  • Attribution cannot be claimed
  • The evaluator is part of the delivery team, though data and evidence from developmental evaluation may feed into other formative and summative evaluations, which would need to be commissioned separately.

Overall assessment:

Developmental evaluation holds value as an approach to support the Pathfinders to learn and improve their delivery processes. It fundamentally depends on robust monitoring frameworks being developed, enabling data and evidence to be reviewed at regular intervals allowing project teams to make judgements on how delivery should change. Data and evidence from developmental evaluation can feed into an independently commissioned impact assessment.

Recommendation: The principles of developmental evaluation are recommended to support the Pathfinders' ongoing ability to learn, adapt and improve. The approach can be adapted as a 'learning partner approach,' where the latter not only provides support to the Pathfinders to extract, share and communicate learning (see below) but also provides the upfront evaluation support required to enable the Pathfinders to develop the monitoring systems they need to be effective. A learning partner (see below), with evaluation experience, providing support in both aspects (setting up monitoring system and extracting learning) is therefore recommended.

Option 3B Learning Partner Approach

Learning partnerships are becoming increasingly common to support change projects or programmes operating in complex environments. A learning partnership takes the form of a relationship developed with a practitioner, consultant or academic (or organisation) that seeks to facilitate a process of gathering and analysing data, embedding reflection and reflexivity, and supporting experimentation, adaptation and improvement. This "is intended to help people, organisations reflect on their work and build understanding about themselves, the organisation, system, context and process (Lowe & French, 2019)."[79] A learning partnership will vary depending on the organisation, project or programme.

However, the work of a learning partner can be described as involving three aspects: convening, conversing and curating. 'Convening' involves creating spaces for sense-making, co-creation and reflection. 'Conversing' involves developing relationships with project staff, partners and stakeholders to understand, facilitate, encourage and influence. 'Curating' involves collating and analysing data to support reflection, learning and change processes. Activities can include coaching, workshops, facilitation, convening communities of practice, or the analysis of data and patterns and reporting to identify key learning to support change processes.

Strengths

  • Supports learning and understanding of how a specific intervention works through the testing, refinement, and adaptation.
  • Enables an ongoing process of improvement in programme implementation and evaluation.
  • Allows an intervention to be adaptive and responsive.
  • Well-suited to supporting systems change in complex settings.

Limitations

  • Can be data intensive.
  • The learning partner can be influenced by their own biases.[80]
  • There is a risk that the learning partner is perceived as an 'outsider' and is not successful in accessing the relevant data or the relevant people to be effective in their role.

Overall assessment:

A learning partnership would be extremely valuable to the Pathfinders. There is a danger, however, that given current data sharing challenges between partners, the learning partner will fail to be as effective as it could be in a 'curating' role.

Recommendation: A learning partner is recommended where the role is strongly focused on supporting the Pathfinders with their evaluation processes, and on supporting the Pathfinders to undertake developmental evaluation, without needing to have an evaluator on their team. The role is conceived as focusing on the aspects of 'convening' and 'conversing' in order to support and empower the Pathfinders to engage in reflexive practice and to regularly engage with the data they are collecting to identify key leverage points, key stakeholders and make meaningful decisions that can help them to adapt and improve their processes. Ideally the learning partner would also support the Pathfinders to improve their monitoring processes and how they collect data.

Recommendation: Given the learning partner's advantage in understanding the Pathfinders and their work, it is also recommended that the learning partner supports the evaluation with the collection and analysis of data for the proposed case studies.

Recommended Option 3A + 3B combined: Plan to implement a learning approach

Aim

To implement a monitoring system, enhanced by a learning partnership, which enables the Pathfinders to collate and analyse relevant data that not only supports effective evaluation but also facilitates developmental evaluation – on-going reflection, learning and improvement. This will improve the impact of the Pathfinders. It will also improve SG's understanding of the Pathfinders, what works and how the approach might be replicated or scaled.

Instead of an evaluator embedded in the team supporting the Pathfinder to reflect and analyse data, we recommend the Pathfinders are supported in two functions: (1) to develop effective monitoring systems and (2) to make space for sense-making and reflection through a learning partnership. The developmental evaluation approach is adapted as a 'learning partner approach,' where the latter not only provides support to the Pathfinders to extract, share and communicate learning (see below) but also provides the upfront evaluation support required to enable the Pathfinders to develop the monitoring systems they need to be effective. A learning partner, with evaluation experience, providing support in both aspects (setting up monitoring system and extracting learning) is therefore recommended.

Feasibility of a learning approach

Given a key aim of the Pathfinders is to gather learning and evidence of what is working to support national efforts to reduce child poverty at scale, an approach that places emphasis on learning and facilitates it is well suited. However, the Pathfinders' monitoring systems do not currently support regular reflection and learning, nor do they currently gather all the data needed to track progress against the Monitoring Framework or ToC.

Therefore, in order for a learning approach to be implemented well, the following elements are required:

  • Upfront support setting up the internal monitoring systems
    • The Pathfinders require additional up-front support to set up their internal monitoring systems. This may involve supporting Pathfinders to understand how to develop a monitoring system aligned to the ToC, and to understand how to develop the required data collection tools needed to effectively capture data.
    • In the ToC and Monitoring report we set out the improvements needed to the Pathfinders' monitoring systems to ensure they are strongly aligned to the co-developed ToCs and to facilitate developmental evaluation/ impact evaluation.
  • Learning Partner
    • A learning partner can support the Pathfinders to engage in reflexive dialogue and to empower them to regularly engage with the data they are collecting to identify key leverage points and key stakeholders, and to make meaningful decisions that can help them to adapt and improve their processes. This is especially important with respect to the systems change component of the projects. More detail on the Learning Partner approach is provided below.
    • A learning partner is recommended to support the Pathfinders in this process of setting up their monitoring systems to collect, analyse and reflect on data.

Recommended Approach

1) Set up monitoring systems to enable data to be collected against the ToC. Gaps in the current systems are outlined in the ToC and Monitoring report.

2) Consider setting up a learning partnership for the Pathfinders to facilitate experimentation, adaptation and improvement to support a process of systems change.

Learning Partner approach

The aim of the learning partner is to support monitoring and evaluation in the Pathfinder and to convene forums to enable the Pathfinders to engage in sense-making, reflection and analysis of their data with each other.

Minimum requirements

  • The learning partner will facilitate Pathfinder partner meetings to support a collective approach to assessing progress against the ToC. The learning partner will also facilitate group sense-making, reflection and analysis of data. The learning partner may be responsible for survey development and inputting data into additional data collection tools aimed at tracking progress against short-term and process outcomes in the ToCs (for example, the impact and systems change logs suggested in the ToC and Monitoring report).
  • The learning partner will support the Pathfinders to report against the monitoring framework.

The learning partner may also:

  • Support the Pathfinders to set up their monitoring systems to ensure they are aligned to the Monitoring Framework and the evaluation.
  • Support the evaluation (quasi-experimental trials) by collecting primary data and analysing data to develop case studies as described in the section above.

Recommended Option 3A + 3B combined: Timelines and Costs

Timeline: It is advised that a learning partner be engaged as soon as possible, particularly if the learning partnership is developed to support the Pathfinders to improve their monitoring systems.

Costs: A budget for the learning partnership should be agreed depending on the boundaries of the work.

Recommendations

A theory-based evaluation based on contribution analysis is recommended and should be planned towards the end of 2024/25 to enable lessons to be fed into the Third Tackling Child Poverty Delivery Plan. This is not mutually exclusive to a quasi-experimental trial, if one is commissioned.

A case study approach should be adopted to supplement a quasi-experimental trial if recommendation (1) cannot be implemented.

Pathfinders need to improve their monitoring systems and data collection processes and ensure they are aligned to the indicators in the monitoring framework and to their ToCs to facilitate effective evaluation and learning. A learning partner with evaluation experience can support the Pathfinders to implement this.

A learning partnership should be considered to support the Pathfinders to engage in sense-making, reflection and analysis of data in order to help them to adapt and improve implementation and extract learning that may be useful to feed into future approaches to tackling child poverty.

Methods for evaluating value for money

Understanding value for money

Getting value for money from a policy intervention means making optimal use of resources in order to achieve desired outcomes. It does not mean that the best approach is the cheapest one, rather that the best approaches achieve a high level of impact for a given amount of input. The Department for International Development (now the Foreign, Commonwealth & Development Office) defined value for money as having four elements:[81]

Economy – inputs should be of suitable quality and quantity while not being unduly costly.

Efficiency – the delivery mechanism should produce the optimal level of output for the given inputs.

Effectiveness – the outputs should give rise to the intended outcomes.

Equity – the extent to which the beneficial impacts of a programme are distributed fairly.

In the context of Pathfinders, 'economy' relates to the costs of the respective Dundee and Glasgow programmes and the number of staff. In the case of the Dundee model, this may include the amount of time staff spend with individuals in their caseload; for the Glasgow model it also includes costs of direct interventions such as food parcels.

The efficiency of the Pathfinders would take into account the rate at which the services deliver immediate benefits for their users, such as entering a job, accessing financial support, or receiving housing support.

Effectiveness is then the extent to which these direct impacts lead to the overarching aims of increasing income and reducing child poverty.

Last, equity would consider whether the support the Pathfinders provide, and the associated positive outcomes, reach those who need them most.

Value for money can be assessed through economic evaluation. In broad terms, economic evaluations capture the four parts of value for money by comparing the value of the economic benefits and public savings from an intervention against the costs. Evaluations of previous child poverty interventions, such as the Local Authority Child Poverty Innovation Pilot and the Troubled Families Evaluation, have used economic evaluations to evidence their impact.

For Pathfinders, there are likely to be two broad sources of benefits to account for and value:

Increased economic and social value from reductions in poverty. For example, a family may have increased their income or gained a qualification through training or education. This increase in value can also include non-financial factors such as improved communities.

Fiscal savings from reduced demand. If implemented successfully, Pathfinders should lead to reduced strain on education services, DWP benefits, the NHS and so on.

Within both of these wide categories there will be a number of more specific benefits. Below, and in Appendix 5, we set out more detail on the scope of who experiences benefits and what these are.

Approaches to evaluating value for money

Economic evaluations revolve around comparing the costs of an intervention with its benefits. There are several ways of doing this, and the first stage in an economic evaluation is to decide which approach is most suitable. The primary options are:[82]

Cost effectiveness analysis (CEA): Where there are multiple intervention options with the same end goal, CEA compares the cost per unit of each option. The unit in this case is a non-monetary measure of output. For example, in relation to Child Poverty Pathfinders, CEA may assess the cost per family removed from poverty. This approach is desirable in cases where there is a clear measurable output from the intervention, and where it is not possible to calculate the monetary value of benefits. CEA is most useful when comparing across multiple different options – a value of cost per output means little on its own, but when put in context against the same figure for comparable options, it can be used to rank the value for money in each case. Whereas cost benefit analysis (discussed below) requires assumptions and estimation about the value of the benefits from a programme, CEA is more transparent and relies solely on known measurements of costs and output. However, this simplicity comes with the trade-off of CEA providing little detail in terms of wider benefits outside of the single output measure, and how the outputs vary over time and at different scales.

Social cost benefit analysis (CBA): CBA assesses the total costs and benefits of an intervention to compare whether the benefits are greater than the costs. If the benefits outweigh the costs, then the intervention can be said to provide value for money. CBA includes not just the direct financial costs and benefits, but incorporates all relevant economic, social and environmental costs and benefits associated with the intervention. Unlike CEA, this approach is based entirely in monetary terms, and so a key challenge is to determine the pound value of benefits which are often non-financial (such as improved wellbeing or health) as well as estimating the value of financial benefits for which there is no direct data (for example, the value of reduced rent arrears). Also unlike CEA, it is not just one output that is measured, but the value of all impacts associated with the intervention. For instance, whereas CEA may focus only on the number of families removed from poverty, CBA would seek to quantify the value of this reduction in poverty, as well as other intermediary and tangential outcomes. Outputs from CBA are the net present value (benefits minus costs) as well as the benefit-cost ratio (benefits divided by costs) of the intervention. In situations where there are multiple identifiable, measurable benefits which can be assigned a monetary value, CBA may be the best approach to use. CBA is an effective way of comparing the value for money of different options. However, it can also be used to assess a single intervention in isolation, as the results will indicate whether the net financial impact of the policy is positive or negative.
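To make the two headline CBA outputs concrete, the net present value and benefit-cost ratio can be computed directly from total discounted costs and benefits. The sketch below is purely illustrative; the figures are hypothetical and are not Pathfinder data:

```python
# Illustrative sketch of the two headline CBA outputs.
# All figures are hypothetical, not Pathfinder data.

def npv(total_benefits: float, total_costs: float) -> float:
    """Net present value: benefits minus costs (both already
    discounted to present values)."""
    return total_benefits - total_costs

def bcr(total_benefits: float, total_costs: float) -> float:
    """Benefit-cost ratio: benefits divided by costs."""
    return total_benefits / total_costs

# Hypothetical discounted totals in pounds
benefits = 2_400_000.0
costs = 1_500_000.0

print(f"NPV: £{npv(benefits, costs):,.0f}")  # NPV: £900,000
print(f"BCR: {bcr(benefits, costs):.2f}")    # BCR: 1.60
```

A positive NPV (or, equivalently, a BCR above 1) indicates the intervention delivers more value than it costs.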

We do not recommend using CEA for the economic evaluation of Child Poverty Pathfinders; this is for two reasons. First, the Pathfinders' aims are to address a number of different issues which vary between families, meaning that there is not a single measure of output.[83] CEA takes a narrow view of value for money, which does not reflect the complexity of the Pathfinders. Second, CEA requires a comparison between different options in order to assess whether the cost per output is good or bad. While there are two Pathfinder models, the purpose of the evaluation is not to compare them and establish whether one should be retained and the other discontinued, but rather to understand their merits in their own rights and how they best serve the local community.

CEA has been used to evaluate similar programmes to the Pathfinders, such as the Local Authority Child Poverty Innovation Pilot. However, a lack of data on outputs and differences in data collection between each pilot resulted in the final report only being able to provide partial estimates on the cost-effectiveness of the programme.[84]

By contrast, a social CBA approach would be an effective way of evaluating the value for money of the Pathfinders. Because CBA is flexible in the number and types of benefits that are included, it is well-suited to assess the value for money of a programme such as the Pathfinders, where (i) the main output – child poverty – is difficult to measure, and (ii) there are many different sources and types of benefits which should be accounted for in order to reflect the true value for money. In setting out the type and size of the different benefits that arise, CBA would also provide a deeper understanding of how the respective Glasgow and Dundee models give rise to financial value, which can help to assess whether the models deliver what is expected and required. The detailed breakdown of costs and benefits in CBA would also make it possible to see the relative impact of different aspects of the Pathfinders, and assess the key contributors to costs / benefits and whether these can be focussed on more.

CBA has previously been employed successfully in relation to children and families research and evaluation. One recent study estimated the costs and benefits of an additional $1,000 allowance for families in the US.[85] This included benefits such as increased future earnings of children, improved health, and reduced crime; with the costs being increased education expenditure, administrative costs and increased tax burden. The impacts were quantified by triangulating existing research into the effect of a change in income on the different areas of benefits. The study showed that the annual benefits of the additional allowance would be nearly 10 times the cost. Another example of CBA being used in a child poverty context is in a study by Impact on Urban Health looking at the benefits of providing free school meals.[86] This similarly estimated benefits using external research and secondary data analysis. Benefits were identified using a ToC, and categorised into education and employment, health and nutrition, and school food economy impact pathways. It compared two different policy scenarios to illustrate the cost-benefit ratios of different levels of intervention.

CBA was also conducted as part of the Troubled Families Programme evaluation.[87] This used data from the 124,000 families who joined the programme in 2017/18. The CBA was based on outputs from the quasi-experimental aspect of the evaluation, which included the following outcomes: looked after children, children in need, adult convictions, child convictions, claimant status, and adult employment status. Only the outcomes that were statistically significant between the treatment and control groups in these models were included in the CBA. This highlights a key challenge to overcome in a CBA: how to determine the level of impact that can be attributed to the intervention. As the example here shows, one method would be to make use of the findings from other quantitative aspects of the evaluation which identify the statistical relationships. Other methods of establishing attribution include gathering information through surveys, direct qualitative feedback from families (as was done in the evaluation of the Local Authority Child Poverty Innovation Pilot) and collating various evidence sources through literature reviews.

Other limitations of CBA that have been highlighted include whether the quantitative data is robust enough to assign monetary value to the impacts. In its evaluation of the Child Poverty Strategy, the Welsh Government found that a CBA is impractical for evaluations where monetised impact data is not available.[88] This emphasises the importance of gathering robust financial proxies and applying the appropriate level of optimism bias, to ensure that the monetary value of benefits is not overstated. Equally, when using proxies it is also important to ensure that there is a clear justification for why the proxy reflects the impact being measured. While financial proxies are inherently an imperfect measure of value, we do not see this as a reason to avoid doing CBA, as when implemented properly they can still serve as an accurate representation of economic value. CEA, by contrast, would avoid much of this difficulty, as it does not require estimating the monetary value of all benefits.

Using a social CBA method to evaluate the Child Poverty Pathfinders would also be consistent with the approach taken in research by the Joseph Rowntree Foundation, which estimated the social cost of child poverty.[89] This is rooted in the idea that child poverty creates challenges in children's lives which in turn create government costs to intervene, and economic costs for children who cannot reach their full potential.

Given this approach, the next decision is to set out what costs and benefits are within the scope of the evaluation and should be accounted for. A general principle for determining this is to only include first-hand – or direct – impacts which arise due to the programme that is being evaluated. For example, in relation to the Pathfinders, a first-hand benefit that is within scope is the increase in income for families using the service; a second-hand benefit outwith the scope of evaluation would be the increase in spending in local areas that arises due to the aforementioned income rise.

In our evaluability assessment workshops, we gathered a longlist of beneficiaries from the programme, and what benefits each party experiences. This longlist was subsequently refined to establish which benefits are in or out of scope for the analysis. The workshops were also used to identify the main costs that need to be captured. These were categorised into costs coming from the direct funding of the Pathfinders, and in-kind costs. This categorisation of costs and benefits is provided in Appendix 5 at the end of this report. Because the Glasgow and Dundee models help people in slightly different ways, the exact benefits included may differ between the two, although there will also be some overlap. This will depend on what evidence is found and the available data in relation to impacts.

Quantifying the value of benefits in a CBA requires two main pieces of information: the number of people who experience the benefit, and the financial value of the benefit per person. The financial value generally relies on financial proxies from external sources (which are discussed more below). By contrast, measuring the number of beneficiaries requires internal data directly from the Pathfinder. The way in which this is measured, and the types of benefits that are ultimately included in the CBA, could vary depending on the methodology used in the impact evaluation. We have proposed two main options for evaluating the impact of the Pathfinders on families and child poverty: a quasi-experimental approach, or an impact assessment using quantitative and qualitative data mapped against the ToC. There are then two possible ways of measuring the number of people to whom each CBA benefit applies – the first being an available option in both of the above impact evaluation methods, and the second applying only if a quasi-experimental approach is adopted.

The first approach would be to use the individual Pathfinder ToC to identify impact pathways, which illustrate what types of benefits accrue to whom. Next, each of the impact points identified is matched to a data source that can be used to measure the number of people who experience the benefit. The data audit carried out as part of the Monitoring and Evaluation framework development provides a key source for checking if there is existing data already collected by the Pathfinder. If there is no existing data source in relation to an identified benefit, then collecting this information can form part of the impact evaluation. If taking the impact assessment approach to the impact evaluation, this data collection / identification would likely be happening anyway as part of the impact assessment. That is, the purpose of the impact assessment is to collate evidence through data to demonstrate the impact pathways identified in the ToC – many of the benefits included in the CBA would already be included in this process, so additional data would only be required for specific CBA benefits not captured in the impact assessment. If a quasi-experimental approach was adopted, then the data collection for the CBA would entail more of an additional step beyond the main impact evaluation.

For example, the ToCs (see appendix 1) identify that families will have increased income from employment – something which applies to both the Dundee and Glasgow services. This is an impact which has quantifiable benefits and so should be included in the CBA – it is also a benefit which was highlighted in our economic evaluation workshop longlist of benefits (see appendix 5). The number of people who experience this benefit will be the number who – as a result of the Pathfinder intervention – enter employment after previously being unemployed. The data audit does not identify a specific data source in the Dundee Pathfinder that would provide this figure. It is possible that the not-yet-developed Dundee Pathfinder monitoring Data Excel Spreadsheet or exit interviews would provide this; otherwise it would be a requirement of the impact evaluation to gather data on the numbers of people who entered employment. For Glasgow, the current main source of this information appears to be the exit interviews. However, the reliability of these depends on what proportion of service users take part in an exit interview. If only a small number are conducted, then a more focussed collection of data on job entries (and other similar impacts that are included in the CBA) should form part of the impact evaluation. For the full CBA, this process of impact identification and data matching should be repeated to reach a comprehensive set of benefits. In appendix 5, we have set out suggestions for how the number of beneficiaries for each of the longlist of benefits could be measured, and data sources for each of the Pathfinders that may contain this.

In order to include a benefit in a CBA there needs to be sufficient evidence to show that the intervention being assessed did in fact give rise to that benefit – i.e. the benefit needs to be demonstrably attributed to the intervention. If not using a quasi-experiment in the impact aspect of the evaluation, this attribution will not automatically come out of the impact evaluation. There are a handful of options to establish whether a benefit can reliably be attributed to the Pathfinders and be included in the CBA, which can be used in tandem for maximum robustness.

First, drawing on the findings from contribution analysis can show whether the Pathfinders contributed to the impact – this may not be 100% attribution, but can indicate the areas where the Pathfinders clearly contributed to impacts.

Second, drawing on qualitative evidence can be an effective way of bolstering quantitative data. For example, if the exit interviews, informal impact records, or case notes show that families have stated that they have experienced a particular benefit, then this can act as evidence to support inclusion of that benefit. Using qualitative information in this way can help to unpack what numbers alone cannot tell us, and to establish impacts which are not obvious from the quantitative data.

Third, combining the CBA with evidence from previous studies can help to support attribution of impacts. Reviewing CBAs (or more general impact evaluations) from other similar policies can demonstrate what benefits are proven to arise from interventions like the Pathfinders, and so this can support inclusion in the Pathfinders CBA.

Last, uncertainty in the benefits included can be mitigated by applying attribution and deadweight discounts in the CBA calculations. The attribution discount relates to the proportion of an impact that can be said to be caused by the Pathfinders. Deadweight is an adjustment to account for the amount of impact that would have occurred even if the Pathfinder had not been present. The level of attribution and deadweight discounts that are applied are usually based on an informed assumption. If there is little evidence to support this assumption, then it is appropriate to apply a larger discount to ensure benefits are not overstated. For the Pathfinders, these discounts could be informed by data sources such as the Dundee client spreadsheet or case notes, and the Glasgow customer service advisor forms, holistic needs assessment, or Glasgow Helps monitoring spreadsheet, which would indicate the starting point of users when they access the services, which in turn can reflect the extent to which any impact is due to the Pathfinder specifically.
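Putting the pieces above together, the value of a single benefit stream can be sketched as the number of beneficiaries multiplied by a financial proxy, scaled down by the attribution and deadweight discounts. All names and figures in the sketch below are hypothetical assumptions for illustration, not actual Pathfinder parameters:

```python
# Hypothetical sketch of a single CBA benefit calculation with
# attribution and deadweight adjustments. All figures are illustrative,
# not Pathfinder data.

def adjusted_benefit(n_beneficiaries: int,
                     proxy_value: float,
                     attribution: float,
                     deadweight: float) -> float:
    """Gross benefit (people x proxy value per person), discounted for
    attribution (the share of the impact caused by the Pathfinder) and
    deadweight (the share that would have occurred anyway)."""
    gross = n_beneficiaries * proxy_value
    return gross * attribution * (1.0 - deadweight)

# Hypothetical example: 50 job entries valued at £10,000 each,
# 70% attributed to the Pathfinder, 20% deadweight.
value = adjusted_benefit(50, 10_000.0, attribution=0.7, deadweight=0.2)
print(f"£{value:,.0f}")  # £280,000
```

Where evidence for the discount assumptions is weak, larger discounts should be applied so that benefits are not overstated.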

The second approach to measuring the number of beneficiaries in a CBA would be possible only if a quasi-experimental approach was used in the impact evaluation. In this case, the quasi-experimental approach would need to test the impact on a range of different relevant outcomes. The results of this would then be applied to the CBA, with outcomes only being included if a statistically significant impact was found. For those which were statistically significant, the results of the quasi-experiment can then be used to calculate the number of people who experienced the benefit. This method was used in the CBA of the Troubled Families Programme. Here, propensity score matching was used to test for impacts in a number of outcomes, with the following found to be statistically significant and included in the CBA:

  • Number of adults claiming JSA
  • Number of looked after children
  • Number of juvenile crimes
  • Adult prison years
  • Juvenile prison years
  • Number of children on a child protection plan.

These would likely be relevant outcomes to explore should a quasi-experimental approach be used for the Pathfinders evaluation. Based on our benefits scoping so far, the following would also be applicable:

  • Employment
  • Income from employment
  • Income from social security benefits
  • Childcare costs
  • Qualifications and skills level
  • Mental wellbeing
  • Housing situation / arrears
  • Child poverty levels.

The advantages and disadvantages of the two approaches for measuring the number of beneficiaries (collecting quantitative data mapped against the ToC impact pathways versus adapting results from a quasi-experimental approach) are clear. The former method allows for a wider scope of benefits, being more flexible around what can be included. This would likely also allow for more benefits which are not directly related to families (e.g. wider social and public sector impacts). The downside to this option is that it offers less certainty over whether the impact on that exact number of people can truly be attributed to the Pathfinder. By contrast, building on a quasi-experimental approach would have more limited scope as to what is included, because any benefits need to both (a) be measurable through the quasi-experimental method, and (b) show a statistically significant impact. The advantage here is that any benefits that are included are robust, and have demonstrably been brought about by the Pathfinder itself.

As noted above, as well as the number of beneficiaries, the other key data point for calculating benefits is the financial value of each benefit. In some specific cases, it may be possible to get direct financial data. For example, if quantifying the value of increased income from benefits / social security, data could be collected on what benefits families register for, and therefore the financial value of this. In the case of the Glasgow Pathfinder, this may be collected through the holistic needs assessment or exit interview data; and for Dundee the housing benefit and council tax reduction data may include this.

However, in the majority of cases, it is likely that actual financial data is not accessible and so instead a financial proxy is used. Proxy values should be used in cases where: (a) the benefit is financial, but data is not available; or (b) the benefit is non-financial, such as social, health, or environmental benefits. Financial proxies for social benefits have been estimated in previous studies – for example, the Greater Manchester Combined Authority CBA tool provides proxies for a range of social benefits which could be applied here.[90]

This source was used extensively in the CBA as part of the Troubled Families Programme evaluation. Other useful sources for financial proxies include HACT[91] and DWP's Social Cost Benefit Analysis Framework[92]. In addition, individual proxy values may have been estimated in academic research and so evidence reviews can be used to identify proxies not provided in the above databases. When using this approach, the CBA can be made more robust by identifying multiple sources which estimate the relevant financial value, and making a judgement over which is the most applicable / reliable, or alternatively taking an average. In appendix 6, we have provided some suggested financial proxies for use in a CBA of the Pathfinders; this is not exhaustive, and the actual proxies required will vary depending on which benefits are included in the final CBA. This contains proxies from the three sources mentioned above, as well as from Scottish Government colleagues' own bank of proxies which is under development.

Methods evaluating the processes of the Pathfinders

Understanding process evaluation

The process evaluation is distinct from the impact of the Pathfinders as it focusses on their internal operations, as opposed to the broader changes that occurred as a result. Key factors that this aspect of the evaluation will look at are:

  • The extent to which the programme is implemented as it was intended.
  • Whether the target groups have been reached.
  • The extent to which families found the service accessible.
  • The views of Pathfinder staff on the programme's effectiveness.

The process evaluation is closely related to the systems change element of the impact evaluation but has two key differences. First, the impact on systems change will assess the state of the system before and after the programme to establish the degree to which it has evolved from its baseline position. The process evaluation is based on what happens during the programme, and how smoothly and effectively it ran. Second, the focus of the process evaluation will be the internal operations within the Pathfinders themselves. On the other hand, systems change refers to both the Pathfinders and the broader family support system, looking at how Pathfinders have shifted the approach to tackling poverty at a macro scale.

Process evaluations seek to understand 'how' an output was achieved. However, looking at this in isolation – that is, without simultaneously considering 'what' was achieved – can be challenging. For this reason, process evaluations are often commissioned in conjunction with an impact evaluation. Therefore, we recommend that the evaluation of Pathfinder processes links closely to the impact evaluation aspects discussed above.

The Pathfinders target priority groups who are particularly vulnerable and often face multiple barriers and inequalities. For many, it is challenging even to access the services offered by the Pathfinders (as has been evidenced in the Dundee model, where many individuals do not even know such support services exist). Therefore, an important aspect of the process evaluation should be to investigate how well the Pathfinders overcome these barriers and reduce inequalities by making the services as accessible as possible to everyone. For instance, the Dundee approach involves directly targeting people with specific needs, and working with them closely on an individual basis to ensure they get the support they need. In some cases this can require repeated visits from the key worker before they attend the Brooksbank centre. The process evaluation could look into what this method does and does not provide for the supported families, and what the implications around fairness are for those who are not targeted. In the case of Glasgow, key considerations around equality include:

whether the level of information sharing and engagement is sufficient to ensure that all those who need the service are aware of it

whether the criteria to use the Pathfinder (i.e. simply having a Glasgow postcode) cuts out people on the fringes of the area, or crowds out people within it.

Approaches to process evaluation

The process evaluation asks different questions from the impact evaluation; however, it will ultimately draw on the same ToCs and the quantitative and qualitative methods discussed above. We expect that the nature of assessing processes means this will primarily be qualitative research; however, drawing on the quantitative evidence which informs the ToC will also be important.

The benefits and methods involved in qualitative research are discussed in more detail above in relation to impact assessment. Using a combination of these methods would be the best way of gathering different types of evidence on the Pathfinder processes from a range of different people involved. While the process evaluation should focus on working with staff, stakeholders and individuals involved from Scottish Government, there would also be value in gathering evidence from the families who use the Pathfinders. A well-functioning process should ultimately lead to a more effective service, and so views from families in relation to their experience of the service, and how useful it has been, can provide an alternative – and perhaps more objective – insight into how well the processes have worked.

How can learning be assessed and integrated?

The final aim of the evaluation ("learning") is designed to combine and articulate the findings from the previous four. Learning will help the Scottish Government to make future decisions about continuing and expanding the Pathfinders, with the added benefit of providing greater cohesion to the different aspects of the evaluation. Arguably, this aim is not standalone, but forms an implicit part of the others. However, we believe that it is of sufficient importance and magnitude that it warrants being separated out as an additional aim.

Within the evaluation learning, there will be two critical strands: learning on the features of the Pathfinder models and learning on how the Pathfinder approach can drive systems change. Learning around the Pathfinder model features will relate to what it is about the Dundee approach of intense key worker support, or the Glasgow no-wrong door approach, that works and what doesn't. This can inform what aspects of these models could or should be utilised elsewhere in the country or at a larger scale.

Distinct from this, it will also be important to take stock of learning around what it is about the general Pathfinder approach which can lead to systems change. This looks at the Pathfinders at a higher level and considers what it is about the principles underlying the Pathfinders that can drive systems change. For example, it may look at how the approach can create cultural shifts, or how different aspects of the system can be joined up. As well as learning that can be reflected upon in a summative evaluation, learning can be integrated into the Pathfinders through a learning partner approach (in combination with a developmental evaluation) as discussed above. This embeds learning at the core of the evaluation and allows for flexibility in the indicators used to measure success, making it a well-suited approach to assessing and integrating learning.

What is the best way of procuring an evaluation?

Timing

The table overleaf sets out our suggestions on the approximate timings of the evaluation. The timings depend both on the type of data required and how it is collected (for example, baseline data should be collected straight away), and on the nature of what is being evaluated.

Proposed timing of evaluation

Evaluation aim: Impact on families, child poverty, and the system that supports them

Element of methodology

Quasi-experiment

  • Difference-in-difference for Dundee
  • Interrupted time series analysis for Glasgow
  • Supporting case studies

Suggested timing

Foundations for experiment should be laid out as early as possible.

For a feasibility study, this may mean beginning in mid-2023. Set-up and data collection / sharing agreements would take approximately six months, so the feasibility study would require six months to a year for completion (i.e. completion in early to mid-2024).

The full study would then begin in 2024. Based on previous studies, an observation period of 18 months to two years would be needed before the impact is evaluated.

It follows that, in totality, a feasibility study followed by full roll-out would take approximately three years. Assuming the earliest this could be commissioned is late 2023, then this approach could be completed by mid-2026.
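To illustrate the logic behind the difference-in-difference (DiD) element recommended for Dundee, the sketch below works through a minimal two-period estimate. All figures are invented for illustration and are not Pathfinder data; in practice the estimate would come from a regression on area-level microdata with controls, not raw group means.

```python
# Minimal two-period difference-in-difference (DiD) sketch.
# The change in a matched comparison area proxies the counterfactual:
# how the indicator would have moved without the Pathfinder.

def did_estimate(treat_pre, treat_post, comp_pre, comp_post):
    """DiD = change in the treated area minus change in the comparison area."""
    return (treat_post - treat_pre) - (comp_post - comp_pre)

# Hypothetical child poverty rates (%): the Pathfinder area falls from
# 24 to 20, while a matched comparison area falls from 23 to 22.
effect = did_estimate(24.0, 20.0, 23.0, 22.0)
print(effect)  # -3.0: an estimated 3 percentage point reduction
```

The key assumption, which a feasibility study would need to test, is "parallel trends": that without the intervention, the Pathfinder area would have followed the same trajectory as the comparison area.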

Element of methodology

Impact assessment using contribution analysis

  • Management and performance data analysis
  • Engagement with families, stakeholders, partners and Pathfinder staff
  • Longitudinal survey with Pathfinder staff

Suggested timing

A slight delay in the start of the contribution analysis would allow time for outcomes to be realised. Therefore, we recommend beginning in early 2024, with completion in 2025. This would align with the first year of the quasi-experimental studies, and provide results before that study is completed.

Engagement through interviews or focus groups should happen at the start as a way of setting a baseline, with follow-up engagement 6 months to a year after.

A longitudinal staff survey could capture data at three points in time, each 6 months apart.

Element of methodology

Embedding ongoing evaluation with a learning partner

  • Developmental evaluation

Suggested timing

It is advised that a learning partner be engaged as soon as possible, particularly if the partnership is designed to support the Pathfinders to improve their monitoring systems.

Evaluation aim: Value for money

Element of methodology

Social cost benefit analysis

Suggested timing

To allow benefits to be realised and relevant data to be collected through the impact evaluation, the social cost benefit analysis (CBA) should take place at least one year after the Pathfinders began. If relying on results from a quasi-experimental study to identify the scale and scope of benefits, the CBA should take place after that study is finalised.
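A social cost benefit analysis would discount future costs and benefits to present values before comparing them. The sketch below shows the basic mechanics; all cash flows are invented for illustration, and the 3.5% discount rate follows the standard HM Treasury Green Book convention for UK public sector appraisal.

```python
# Illustrative social cost benefit calculation: discount annual flows
# to present value, then compute the benefit-cost ratio (BCR).

def present_value(flows, rate=0.035):
    """Discount a list of annual flows (year 0 first) at the given rate."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

costs = [500_000, 300_000, 300_000]   # hypothetical delivery costs by year
benefits = [0, 400_000, 600_000]      # hypothetical benefits, realised later

pv_costs = present_value(costs)
pv_benefits = present_value(benefits)
bcr = pv_benefits / pv_costs
print(round(bcr, 2))  # ≈ 0.88 in this illustration: benefits below costs so far
```

The example deliberately shows a BCR below 1: because Pathfinder benefits accrue with a lag, a CBA run too early will understate the return, which is why the timing above matters.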

Element of methodology

Longitudinal surveys with Pathfinder staff, stakeholders, and partners

Suggested timing

Initial roll-out as early as possible with follow-ups at regular intervals: we recommend every 6 months.

Evaluation aim: Process evaluation

Element of methodology

Interviews and focus groups with stakeholders and partners

Suggested timing

Initial engagement as early as possible with follow-ups at regular intervals: we recommend every 6 months and to coincide with the survey.

Some flexibility will be required as different aspects of the process will only be able to be evaluated at certain times.

Element of methodology

Combine findings from previous stages

Suggested timing

Once the above stages have produced enough evidence to combine and draw learning from.

Evaluation aim: Learning

Element of methodology

Hold group discussions or workshops with stakeholders and Scottish Government staff

Suggested timing

After the bulk of the evaluation has taken place or the initial phase of the Pathfinders is completed.

Commissioning

In terms of commissioning Phase 2 of the evaluation, it is recommended that:

The procurement process should be held in two to three stages: an expression of interest phase, including an information / market-warming event for interested or invited potential contractors; a formal written response stage; and possibly a third, interview stage.

The procurement should be led by the Scottish Government with input from both Pathfinders.

The Glasgow and Dundee Pathfinders should be evaluated separately, but by the same organisation. For the evaluation of process and value for money, the same broad methodology could be used for both Pathfinder models. The impact evaluation methodology will need to account for the differences between the two models. For example, we have recommended that a quasi-experimental evaluation use the DiD approach for Dundee and ITSA for Glasgow, and it may be the case that Dundee is suitable for a quasi-experiment whereas Glasgow is not. If using the impact assessment approach, the specifics of how this is implemented would be defined within the respective Dundee and Glasgow ToCs.

There should be a single lot for the whole evaluation, but consortium / partnership offers should be welcomed, as these would help to address the wide scope and multiple facets of the programme and evaluation.

We estimate budgets for the different aspects of the evaluation as follows:

  • Quasi-experimental feasibility studies: £75,000
  • Full quasi-experimental study: £225,000
  • Theory-based evaluation using contribution analysis: £90,000
  • Learning partner: to be determined depending on the boundaries of the work.
  • Supplementary case studies: £41,250
  • Social cost benefit analysis: £25,000-£35,000
  • Process evaluation: £35,000-£40,000

Contact

Email: socialresearch@gov.scot
