Using the SNSA Data
“We need protected time in schools to go through the SNSA results and discuss them.”
“It’s new so I think it’ll probably take a while to get to know what it can really do. We’re getting better at data-use in primary schools but we’re not there yet.”
“It was useful for us in identifying areas where there wasn’t a depth of knowledge across the whole class or there were significant gaps in an individual’s learning.”
“The data training focused on generic principles and case studies but it didn’t help me with looking at my own class.”
“When we saw how the SNSA items linked to the benchmarks and outcomes, that’s when I saw how I could use it in my planning and with individual children. It was a ‘light-bulb moment’ for me.”
“I’d like a chance to talk some more, and in more detail, about some of the results for my children.”
“In our school we discussed our SNSA in our normal progression and planning meetings. Often it confirmed things I knew but there were a few surprises, which I’m checking-out in other ways.”
Forum members recognise that the SNSA is a tool, and that learning to make good and efficient use of the data it generates will take time and require staff development. The current staff development strategy has insufficient ‘reach’ across the profession. Its content emphasises generic data-use and assessment principles; it needs to be more specific and tailored, supporting teachers in exploring their own class data and senior leaders in exploring school-level data. It should offer specific examples and advice on using the data to inform classroom planning and practice. A mix of ‘bite-size’ online staff development videos and bookable opportunities for teachers to discuss with experts any questions raised by their data would be highly supportive.
The Forum members agree that the SNSA question items are attuned to key Curriculum for Excellence benchmarks and outcomes. However, the match between these and specific SNSA items is not clearly signalled in the reporting and recording formats for SNSA. Supplementing the reporting and recording formats to make this link explicit would enable teachers to verify SNSA data with their own observational and classwork data, and thus inform professional conversations about the curriculum, children’s achievement and teacher planning.
Forum members recognise that the SNSAs do not, and cannot, offer data for every benchmark, experience or outcome, and that some (for example, Problem Solving and Engagement) are assessed in other ways. Recommendations for ways to triangulate data should be trialled to ensure that any formats and processes suggested are time-efficient and sustainable for busy staff.
Forum members recognise that young children’s learning is complex. Close-up, it is characterised by plateaus and leaps in understanding rather than by a continuous line of improvement. Young children bring different experiences to school and follow different paths to a common outcome. An inexperienced learner is neither a ‘low ability’ nor a ‘low capacity’ learner. Early Level teachers must be especially flexible and work towards a broad horizon by creating rich and tailored learning environments. They have multiple goals for a single session, offer responsive teaching, precise explanations and interesting, contextualised activities, and link learning to children’s out-of-school lives. They are well-planned but also able to ‘seize the teachable moment’. Information from the SNSA needs to be woven into this learning tapestry, rather than sit outside it.
Given this, the SNSA offers a useful ‘snapshot’ of aspects of numeracy and literacy learning, but its conditions of use need to be clearly articulated to prevent data misinterpretation or misuse. To ensure that the SNSA data is as robust as it can be, we need to know the kinds of support, materials and conditions provided, and the extent to which they make a difference to children’s outcomes. This would enable teachers and school leadership teams to have a common understanding of what the scores mean.

The SNSA provides a facility to track children’s progress over time, but its predictive capacity is unknown and should not be assumed. A child with a low outcome in P1 may do very well in P4, as long as they are offered rich learning opportunities rather than entered into a ‘simplified’ or ‘skills-driven’ curriculum that diminishes their curiosity and agency. In most schools, sample sizes are too small to reliably compare children across classes or schools. All educators (teachers, schools, local authorities, Scottish Government advisors and Education Scotland) have a professional responsibility to ensure that their systems do not overplay the reliability or predictive capacity of SNSA, or any other, data. A negotiated and voluntary ‘Code of Practice’, with clear processes to ensure that educators at all levels understand the power and the limitations of data and enact good data-use practices, would underline this. Such understanding could help to prevent SNSA data becoming high-stakes.
Whilst some class teachers had productive conversations with other teachers and their school’s senior leadership team to analyse the SNSA data for their class, others were still awaiting an opportunity to do this. All class teachers need multiple opportunities to discuss the kinds of insights they gain from SNSA and other data if they are to become skilled at verifying, integrating, interpreting and acting on data derived from many different sources. Schools should ensure time within working time agreements, development days and/or regular scheduled progression and planning meetings for this.
The Scottish Government should ensure:
4a Training opportunities are available to help practitioners explore the nature of SNSA data, its links to Curriculum for Excellence and the information it can generate about the learning of both individual children and groups of children. [Responsibility: Scottish Government]
4b Clear signals in the SNSA reporting and recording formats to show which items contribute to professional judgements about which experiences, outcomes and benchmarks. This will widen understanding of what SNSA does and doesn’t do. [Responsibility: Scottish Government, Education Scotland]
4c The publication of case studies and exemplar material showing how educators use the SNSA data and triangulate it with other assessment evidence to make robust, holistic judgements and detailed analyses of a child’s learning. [Responsibility: Scottish Government, Education Scotland]
4d Online staff development materials showing effective and time-efficient use of SNSA data for analytical and responsive teaching, class planning, school management and professional evaluation purposes. [Responsibility: Scottish Government, Education Scotland]
4e The publication of technical reports on SNSA data and on teacher judgement data to assist educators, researchers, HMIE and local authority staff in making educationally sound decisions about how to respond to children’s needs, and not overplay the reliability or predictive capacity of the standardised assessment data. [Responsibility: Scottish Government]
4f Detailed measures are in place to avoid the SNSA becoming ‘high stakes’ for children or for educators. This might include creating a ‘Code of Practice’ for data-use that outlines responsible, ethical attainment discussions and decision-making processes, with systemic checks and balances that prevent misuse. [Responsibility: Scottish Government, Education Scotland]
4g That the teaching unions and school/local authority staff work to establish a forum for educators to debate ethical data-use and to discuss any concerns about data-use that may breach the Code of Practice, together with processes that allow them to raise concerns with those who can act on them. [Responsibility: Scottish Government, Education Scotland, Local Authorities, Schools, Practitioners, Unions]