Attainment Scotland Fund evaluation: headteacher survey report 2019

This report presents findings from a recent survey of headteachers of schools in receipt of support from the Attainment Scotland Fund (ASF). This is the fourth survey of headteachers, previous surveys having been conducted in 2016, 2017 and 2018.

4. Use of data and evaluation

4.1. This section summarises survey findings on schools' use of data and evaluation in relation to ASF supported approaches to closing the poverty-related attainment gap. The survey asked about the extent to which headteachers feel confident using data in this way, use of evidence in evaluating approaches, and the extent to which receipt of ASF support has impacted on their skills.

4.2. The great majority of headteachers (93%) felt confident using data and evidence to inform the development of their approaches, a nine-point increase from 84% in 2017 (90% in 2018).

Using data to develop approaches
Bar chart showing headteacher responses (agree/disagree scale) to statement ‘I feel confident using evidence to inform the development of approaches’

4.3. A large majority (90%) also indicated they always use evidence to measure the impact of their interventions and approaches, consistent with findings of the 2018 survey. Headteachers were less positive about their confidence in selecting the most appropriate measures to evidence the impact of their approaches: 77% agreed that they felt confident doing this, with 15% agreeing 'strongly'. It was notable that Schools Programme respondents were more positive than others across both measures.

Using data to measure impact
Bar chart showing headteacher responses (agree/disagree scale) to statement ‘I always use available evidence to measure the extent to which interventions are having a desired impact’

4.4. Most respondents felt that their skills and knowledge in using data for planning, evaluation and improvement had significantly improved through the Fund. Around 2 in 3 (66%) respondents indicated this, and fewer than 1 in 10 disagreed. The proportion of headteachers indicating this has fluctuated from year to year with no consistent upward or downward trend (60% in 2018 and 69% in 2017). As was found in relation to headteachers' confidence in using evidence, Schools Programme respondents were typically more positive than others on the extent to which their skills and knowledge had improved.

Impact of Fund on ability to use data for planning, evaluation and improvement
Bar chart showing headteacher responses (agree/disagree scale) to statement ‘Through the fund I feel my skills/knowledge of how to use data for teaching, planning, evaluation and improvement at a school level have been significantly improved’

4.5. A great majority of schools (95%) had an evaluation plan in place to measure the impact of ASF-supported approaches. This finding was consistent across key respondent groups.

4.6. The small minority of respondents (around 1 in 20) indicating that their school did not have an evaluation plan in place were asked about the reasons for this. These schools referred to changes of approach and indicators that required a new evaluation plan; changes of leadership or staffing constraints that had delayed its production; difficulty identifying success measures for the approaches being used; and, in schools with a relatively small PEF allocation, reliance on qualitative evaluation instead.

4.7. Follow-up qualitative feedback also highlighted aspects of schools' practice in use of data to evaluate and measure impact.[9] This included use of data tools such as Insight in developing approaches and assessing impact for specific pupil groups, alongside broader measures looking at the BGE stages and regular collation of other evidence such as participation rates, attendance, and progress through specific ASF programmes.

4.8. Follow-up participants also described this increased use of data as part of a wider change of culture and approach for schools. ASF was seen as having provided schools with an opportunity to reflect on and change their practice, building a robust evidence base to support these changes and measure their impact.

"The National Improvement Framework has changed the approach in schools, in focusing on evidence-based decision making and created a culture where we can challenge traditional approaches. Increased use of data has given us the confidence to innovate, knowing that there is an evidence base for the changes we are making."

4.9. Qualitative feedback also highlighted challenges around the evaluation of impact for specific interventions. Schools noted the difficulty in evidencing a direct causal link between a specific intervention and positive impacts for pupils, given the wide range of factors that can influence attainment, attendance and other indicators of progress. This included examples of the specific challenges faced by small schools, where evidencing individual stories and progress was described as a more appropriate approach to evaluation.

"Evaluating impact on attainment is quite hard as a whole range of factors affect attainment…it is hard to isolate one intervention as the reason for success. Health and wellbeing is also difficult to evaluate, beyond numbers engaging with programmes and 'soft' indicators via surveys."

4.10. Despite these challenges in evaluating impact, qualitative feedback provided examples of schools having improved their understanding and skills in use of data to inform approaches and measure impact.

"Over time we have expanded our data gathering to include participation rates, attendance at support classes, attainment from literacy programmes. Support in use of Insight has been significant in developing a deeper understanding of how this sort of data can inform our approach. Reflecting on the range of evidence has resulted in our stopping or adapting interventions which we had previously felt would be valuable."
