Designing and Evaluating Behaviour Change Interventions

Easy-to-use guidance on designing and evaluating any behaviour change intervention using the 5-step approach

An updated version of this guidance is available at http://www.gov.scot/Publications/2016/05/1967


Guidance for service providers, funders and commissioners (Summary version)

Step 5: Evaluate the logic model

Analyse your data to evaluate the project

Once you’ve collected some or all of your data, you can use it to analyse whether your model is working as predicted. Analysis is not just a case of describing your data. You need to address the following questions:

  1. What does the data tell you?
  2. Why are you seeing these results (they could be due to your activities or to external factors)?
  3. What are you going to do about this? How can you improve the outcomes?

NB: Although you should definitely carry out this process at the end of your project, earlier interim analysis and evaluation is also highly valuable, helping you to identify problems and improve your service on an ongoing basis.

Testing the logic model - What does the data tell you?

Did the project work as it should have? The data you’ve collected will help you to see whether your model worked as predicted at each stage. The following are examples of questions you might now be able to answer.


Inputs

  • Which aspects of the service were/were not evidence-based?
  • How much money was spent on activities? Was it sufficient?
  • How many staff were employed and at what cost?
  • What was the staff-to-user ratio?
  • What did the staff do?
  • How many staff were trained?
  • What was the training?
  • Were there enough staff to deliver the activities as planned?
  • What other resources were required?


Outputs

  • Who were the target group and was the intended target group reached?
  • What was the size of the target group and what were their characteristics?
  • What were the activities/content?
  • How many participants were recruited? How successful were recruitment procedures?
  • How many of the target group participated, how many completed and how many dropped out?
  • How many sessions were held?
  • How long was an average session?
  • Did staff have the right skillset to deliver the content?

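If you hold this monitoring data electronically, even a short script can answer several of these questions at once. The sketch below is purely illustrative Python: it assumes a hypothetical CSV file (participants.csv) with made-up columns (sessions_attended, completed), and tabulates recruitment, completion and drop-out figures under those assumptions.

```python
# Illustrative sketch only: tabulating output measures from a hypothetical
# participants.csv with columns sessions_attended and completed
# (the file name and column names are examples, not a prescribed format).
import csv

with open("participants.csv", newline="") as f:
    rows = list(csv.DictReader(f))

recruited = len(rows)
completed = sum(1 for r in rows if r["completed"].lower() == "yes")
dropped_out = recruited - completed
avg_sessions = sum(int(r["sessions_attended"]) for r in rows) / recruited

print(f"Recruited:   {recruited}")
print(f"Completed:   {completed} ({completed / recruited:.0%})")
print(f"Dropped out: {dropped_out}")
print(f"Average sessions attended: {avg_sessions:.1f}")
```
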

Outcomes

  • How many users improved or made progress, and how many did not?
  • What were the characteristics of the users who made progress?
  • What were the characteristics of the users who did not make progress?
  • What type of progress was made (e.g. skills, learning)?
  • Did users achieving short-term outcomes go on to achieve longer-term outcomes?
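
Where outcome data is recorded systematically, questions like these can be answered with a few lines of analysis. The following Python sketch uses entirely made-up records and hypothetical field names (made_progress, short_term_met, long_term_met); it counts how many users made progress, and how often short-term outcomes were followed by longer-term ones.

```python
# Illustrative sketch with made-up records: each dict is one user.
# Field names (made_progress, short_term_met, long_term_met) are hypothetical.
users = [
    {"made_progress": True,  "short_term_met": True,  "long_term_met": True},
    {"made_progress": True,  "short_term_met": True,  "long_term_met": False},
    {"made_progress": False, "short_term_met": False, "long_term_met": False},
]

progressed = sum(u["made_progress"] for u in users)
print(f"Made progress: {progressed} of {len(users)} ({progressed / len(users):.0%})")

# Of those who achieved short-term outcomes, how many went on to
# achieve longer-term outcomes?
short = [u for u in users if u["short_term_met"]]
longer = sum(u["long_term_met"] for u in short)
print(f"Short-term achievers who also met longer-term outcomes: {longer} of {len(short)}")
```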

Explaining outcomes - Assessing contribution

Given the complexity of the social world, it is very unlikely that any single project can change people’s behaviour on its own. Where change is evidenced in users (whether positive or negative), there are likely to be multiple causes, and your project will be only one of them.

Without a randomised controlled trial (which, as we have said, is often impractical), it is very difficult to measure precisely the contribution of a single project. However, you can get a broad sense of the project’s relative importance and how it might have contributed to change alongside other influences.

There are two key ways of doing this:
1. Subjective views on contribution
2. Identifying potential outside influences

Subjective views on contribution

Users, staff and other stakeholders are valuable sources of evidence for assessing the contribution of your project to observed changes in users, relative to other influences. You can:

1) Ask users whether they received other forms of support or experienced other influences on their behaviour.
2) Ask users to rate the extent to which each form of help contributed to their success: for example, was it the project, their family, friends, another intervention or their own desire to succeed?
3) Ask others who know the users (e.g. family, teachers, social workers) to rate the relative influence of the project on observed changes.

Limitation!
Asking users and staff to judge the influence of a project runs the risk of ‘self-serving bias’. This is the well-established tendency for people to take the credit for success and underplay external factors. One way to limit this tendency is to tell staff, users and other participants that you will be asking others to also assess the contribution of the project. Be honest about this limitation in your evaluation reports.

Identifying potential outside influences

By thinking about other potential influences outside your project which might also have affected behaviour change, you can put your own evidence into context.

Having identified potential influences, you may then be able to rule them out, or acknowledge that they did in fact affect your own users.

For example, in relation to the project on young women’s physical activity, potential influences you might consider are:

  • The weather – Unusually good or poor weather might have encouraged participation in the project and/or other kinds of physical activity.
  • Local facilities – The opening or closure of sports and leisure facilities might have encouraged or discouraged physical activity.
  • Economic conditions – Changes in employment or income levels for families could affect participation in the project and in other forms of physical activity (even where these are free, travel costs may be a barrier).

What can you do to improve?

The crucial next step in the evaluation process is to use your explanations of the outcomes to improve your model.

  • Can you address any issues at the input stage (e.g. issues with staff training or resources)?
  • Should you extend activities which appear to have been successful?
  • Is it best to stop or redesign activities which the data suggests are ineffective?
  • Can you improve the model to target groups with negative outcomes more effectively?
  • Can you do anything to address external factors which have had a negative impact (e.g. provide transport)?

Contact

Email: Catherine Bisset
