Evaluation for policy makers - A straightforward guide


Chapter 3: How do I use evaluation effectively?

Finding an analyst

In the Scottish Government, there are three analytical professions: researchers, statisticians and economists, all of whom can help policy makers to use evaluation. Although researchers tend to lead on evaluation, it is not their sole domain, so we refer to 'analysts' throughout the remainder of this guidance.

Important principles for policy makers

Although analysts are here to help, the effective use of evaluation relies on important principles that policy officials should adhere to - these are set out below.

Value the use of evidence in policy making

  • Evidence-based policies are more likely to work. Use strong and consistent evidence from published evaluations to inform policy ideas and policy development.
  • Start with the assumption that you don't know if a new policy will be effective until you have evidence.
  • Conduct evaluations that help to improve policies, as well as evaluations that hold us accountable.
  • Use the evaluation results to improve policy rather than keeping the report on a shelf.
  • Appreciate the importance of investing in continuous data collection (monitoring) in order to underpin subsequent evaluations. In most cases it will not be possible to conduct an outcome or impact evaluation without monitoring or baseline data.
  • Good evaluations are not only about numbers, especially if sample sizes are too small for numbers to make sense. Sometimes case studies can provide valuable insights into how policies are working, and can be just as useful as, or more useful than, costly and onerous data-gathering exercises.

Work closely with your analysts and stakeholders from the policy development stage

  • Work with analytical colleagues (especially researchers) to make sure evaluation is considered at the earliest stage in policy development. The way the policy is structured and delivered and the way the data is collected by front line staff will make a huge difference to the quality of evaluation that can be done later, so it's always a good idea to check in with analysts from the very beginning of an idea.
  • Have a clear understanding of what a policy is supposed to achieve, and share this with analysts.
  • Decide on the questions you need to ask to determine if a policy is working, and consult with analysts about which evaluation methods will answer your questions.
  • Support stakeholders to do self-evaluation, or ensure there is reliable data available for internal or contractor evaluations. Ideally provide adequate funds and/or support so stakeholders can conduct robust evaluations on our behalf.
  • If you have to rely on stakeholders to recruit people into a pilot so it can be evaluated, then secure buy-in early on as the pilot recruitment process can take a considerable amount of time.

Balance spend with the need for robust evidence

  • Good evaluations need resources to carry out. Think about whether evaluation costs can be built into budgets, and consider any financial implications for stakeholders if they are required to collect data at local level.
  • Ensure that the scale of, and spend on, evaluations are proportionate to the scale of, and spend on, the policy. Analysts can help to give you an idea of the costs of different approaches, and to judge what is worth investing in based on the size and importance of the project.
  • Build in more rigorous evaluations when the risk of policy failure has serious social and economic consequences.
  • Using data already available (e.g. national surveys or administrative data), or supporting front-line staff to collect priority monitoring data on a continuous basis, can reduce the money you have to spend on full 'snapshot' evaluations.

Top Tip: A good policy is evidence-based from the start

Evidence-based policies are more likely to work, so use credible evidence to inform policy ideas and formulation. Ask analysts to look at the evidence base or conduct a literature review. You can also use literature search tools to look for evidence on the topic of interest. The collection of robust published evaluations on your area of interest is sometimes called the 'what works' evidence base.

Issues to discuss with analysts

It is crucial to discuss evaluation with analysts during the early stages of policy development. Talking through the following points will help to get the planning process started:

1. Is there published, credible evidence that will inform policy ideas and the development of the policy?

2. Why is an evaluation required, who is it for, what do you want to know and when is it needed?

3. What are your evaluation requirements at policy development stage (when policy objectives are being set)?

4. The rationale for the policy - what current issues or problems should it address? What impact is it supposed to have and on whom?

5. A logic model, so it is clear how policy objectives should be achieved, and how your policy should contribute to longer term outcomes and wider policy goals.

6. The key questions that you want answered about the policy.

7. The data that might be required to answer the questions and whether it is available.

8. The funds available for evaluation.

9. An evaluation approach (e.g. commissioning, internal, self-evaluation).

10. When you need the results (the timetable).

11. How will the messages from the evaluation be shared or disseminated, and with whom?

Don't worry if you don't know the answers to all of these questions right away, but giving them as much thought as possible before meeting analysts will make your initial discussion more productive.

Top Tip: A good evaluation considers the interests of its audience

Be clear who the evaluation is for and what their particular interests are. This will determine the focus of the evaluation and the way you communicate the findings.

For example, practitioners might be more interested in evaluations that look at good practice and how findings can help them improve. They may also prefer a practical paper that focusses on practice recommendations rather than a long 'academic' report.

On the other hand, Ministers might be more interested in outcomes to see if the policy has achieved its aim. Also, think about how findings will be presented and shared with different audiences, so they have maximum impact.


