Scottish National Standardised Assessments: national report

Summary of outcomes at a national level on the Scottish National Standardised Assessments (SNSA) in the 2018 to 2019 academic year.

1 Introduction

1.1 What is SNSA?

In January 2016, the Scottish Government published The National Improvement Framework for Scottish Education (hereafter 'the Framework'). The Framework set out the Scottish Government's vision and priorities for Scotland's children and young people. It was developed to support high-quality learning and teaching – the core principle of Curriculum for Excellence (CfE). Over time, it was intended that the Framework would provide a level of robust, consistent and transparent data across Scotland to extend the understanding of what works, and drive improvements across all parts of the system.

To meet the aims of supporting high-quality learning and teaching for Scottish children and young people, it was determined that gathering data on children's progress at key points in their education, including data on differences between those from the least and most deprived areas, was essential. Improved data of this kind would support the planning of further interventions to ensure that all learners achieve as well as they can. Part of this information would be provided by SNSA.

The assessments have been available for use in publicly funded schools in Scotland since August 2017. They are administered to children and young people in Primary 1, Primary 4, Primary 7 and Secondary 3 (P1, P4, P7 and S3) across Scotland, once in each school year and at any point during that year. Reports to schools and teachers are provided as soon as a learner completes an assessment. Additional reports are available for local authorities. This national report presents a description of what SNSA set out to measure, and some findings from the second year of the programme.

Outcomes from Scottish National Standardised Assessments provide one source of evidence as part of a range of evidence to support teachers' professional judgement of children's and young people's progress in learning. SNSA have been developed with the Australian Council for Educational Research (ACER). ACER's approach to learning assessments is that they should assist in:

  • clarifying starting points for action
  • investigating the details of learners' learning and performance
  • monitoring improvements and evaluating educational interventions
  • motivating effort and encouraging self-monitoring
  • providing feedback to guide future action.[1]

The user reports provided for SNSA support a number of these points by providing teachers, school leaders and local authorities with diagnostic information about learners' strengths and areas of challenge that can be used to plan next steps in learning. Alongside other assessment evidence, the information reported in SNSA can also be used to inform teachers' professional judgement on achievement of CfE levels. A central aim of SNSA is also to provide information on the outcomes of Scottish children and young people in literacy and numeracy over time.

1.2 Key features of SNSA

The SNSA programme has a range of important and innovative features:

  • it is delivered online

Children and young people complete the assessments using a digital device: a desktop computer, laptop or tablet. The assessments are delivered online and, because all items (questions) are scored automatically, teachers can access their learners' reports as soon as an assessment is completed.

SNSA are designed to be administered on a range of devices, including desktop PCs, laptops and tablets, and delivery on the most commonly available browsers is supported. This flexibility in mode of delivery is designed to support administration of SNSA across a range of different classroom settings, enabling schools to choose the method of presentation that best suits them. An online tool to assess technical readiness is available. This tool can be run in advance on any device that will be used for the assessments, to ensure that they function as expected. With this flexibility of delivery, the content of the assessments, within the adaptive design model, remains consistent.

  • it is adaptive

The questions presented to children and young people vary according to how well they are performing on the questions they have answered so far. All learners begin an assessment with a set of questions of middle-level difficulty. If a learner does well on these, the next set of questions presented will be more challenging. If a learner is not succeeding on early questions, the items become easier – and so on, through the assessment. The adaptive nature of SNSA means that the experience for each learner is modified so that the assessment is neither too hard nor too easy but appropriate for their current level.

The adaptive design also means there are increased opportunities to benefit from the diagnostic value of the assessment. An assessment is most useful as a formative tool when there are no 'floor' or 'ceiling' effects. A 'floor effect' occurs when an assessment is too hard, so it tells only what a learner cannot do. If this happens, it is impossible to see a starting point on which future learning can build. A 'ceiling effect' occurs if an assessment is too easy and a learner gets every question right. When this happens, it is impossible to judge the upper reach of their attainment and thus to help this learner make the next step.

The adaptive design, when working well, enhances the learner's experience of the assessment and serves to support identification of where children and young people are in their learning development.
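The staged adaptive logic described above can be illustrated with a purely schematic sketch. This is not the actual SNSA implementation: the pool structure, stage counts, thresholds and function names below are all invented for illustration.

```python
# Illustrative sketch of staged adaptive question selection.
# NOT the actual SNSA algorithm: pools, thresholds and stage
# counts here are invented for illustration only.

def run_adaptive_assessment(pools, answer, stages=3, questions_per_stage=10):
    """pools maps a difficulty level (0 = easiest) to a list of questions;
    answer(question) returns True if the learner answers correctly."""
    level = len(pools) // 2          # begin at middle-level difficulty
    total_correct = 0
    for _ in range(stages):
        questions = pools[level][:questions_per_stage]
        correct = sum(answer(q) for q in questions)
        total_correct += correct
        # Move to harder questions if the learner did well, easier if not.
        if correct >= 0.7 * len(questions):
            level = min(level + 1, len(pools) - 1)
        elif correct <= 0.3 * len(questions):
            level = max(level - 1, 0)
    return total_correct, level
```

Under this kind of design, a learner who answers every question correctly is routed steadily to harder material, while a struggling learner is routed to easier material, which is what avoids the floor and ceiling effects described above.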

  • it has a carefully judged number of questions per assessment

Each assessment has from 30 to 36 scored questions, with the number of questions increasing from Primary 1 to Secondary 3. These numbers of questions allow coverage of different aspects of each of the assessed subject areas without requiring excessive time from any learner. On average, in the 2018 to 2019 academic year, children and young people completed each of the assessments within 30 to 40 minutes (less than 30 minutes for Primary 1 children). However, there is no time limit for completing SNSA and, where a teacher judges it necessary, a child or young person may take a break and come back to pick up the assessment where he or she left off. Note that, with effect from the 2020 to 2021 academic year, there will be 27 scored questions in the P1 literacy assessment.

Each question in the assessments has been tested empirically to make sure it 'works'. Before being included in SNSA, every question has been presented to several hundred learners of a similar age and stage to the ultimate respondents, to ensure that it has sound measurement characteristics and will yield statistically consistent outcomes. In addition, every question has been reviewed and signed off by a panel of experts from within Education Scotland.

  • responses are scored objectively

The majority of questions in SNSA are in 'selected response' format, mostly multiple choice. The advantages of this format are both educational and technical. First and foremost, it is an advantage that all responses are marked consistently, so there is no question about the reliability and standardisation of the outcomes at the question level. A second advantage, in terms of curriculum, is that, because learners can complete questions relatively quickly, a wider range of curriculum content can be covered in a limited time than would be possible if children had to generate their responses. A third advantage is that the assessments can be marked instantly, allowing the allocation of assessment items of an appropriate difficulty level within the adaptive design. Additionally, reports can be accessed as soon as an assessment is completed, so teachers can use the formative feedback immediately. There is, of course, much to be learned about children's understanding and skills from other modes of assessment, from short written responses to essays or projects and performances. However, assessments using selected response formats serve the purposes of SNSA well in its role as one element in the wider array of assessments that teachers will use to evaluate children's and young people's learning.

Other features of the SNSA programme are specific to the Scottish education context.

  • it covers agreed elements of Curriculum for Excellence

The assessments have been constructed to align with CfE. For the academic year 2018 to 2019, the final version of the Benchmarks (published in June 2017) is used as the reference point for the assessments, along with CfE Experiences and Outcomes. The content areas covered are described in more detail in the sections of this report dedicated to numeracy, reading and writing.

  • it has a flexible delivery model

The flexible delivery model is intended to allow children and young people to be assessed at any time in the school year that is judged suitable for the school, class and individual learner. A consequence of the flexible timing is that, when interpreting the outcomes of the assessment at individual, class, school, local authority or national level, the point in the school year that the assessment was taken needs to be taken into account.

There is clear evidence from the norming studies conducted during the academic year 2017 to 2018, in November and March, and from the whole year's attainment levels per stage, that children's and young people's outcomes – their literacy and numeracy skills, knowledge and understanding – develop progressively, on average, over the 10 (effective) months of an academic year. Amongst the stages presenting for SNSA, children in Primary 1 showed a marked increase in outcomes in both literacy and numeracy: this can be seen when comparing outcomes from 2017 (August to December) with those from 2018 (January onwards). The same pattern was observed for P4, P7 and S3, across all subject areas, but with diminishing increases in performance in 2018 for each successive stage. Within each stage, the rate of improvement between the first half and second half of the 2018 to 2019 academic year was similar, regardless of subject area.

While the findings described above might be as expected, they also constitute a positive outcome, confirmed empirically with SNSA data. However, given the possibility of administering SNSA throughout the school year, outcomes from all learners should be interpreted with some caution when making any comparative judgements about individuals or groups. Each learner sits each assessment only once and, because the timing of SNSA was determined locally (except for the norming studies that took place in 2017 to 2018), it cannot be assumed that the profile of children and young people who presented in the first half of the school year was the same as that of those who presented in the second half. For example, it is possible that teachers chose which learners should sit the assessment based on their judgement of their learning progress.

  • it is designed to be accessible to all learners

To support learners when completing the assessments, the system is designed to be compatible with a range of assistive devices, so that learners can use the devices with which they are familiar from everyday use in the classroom, including software and devices such as text readers, screen readers and switches. In the case of screen readers, the assessments have been developed to include alternative text descriptions of images, charts and graphs that are integral to answering a question. Detailed guidance is available for teachers in relation to additional support needs (ASN) and English as an additional language (EAL). The information gathered from across the school year, on which the analysis within this national report is based, includes data from learners with ASN and EAL.

1.3 Reporting SNSA outcomes

In the academic year 2018 to 2019, six capacity bands were used in reporting the outcomes of SNSA to schools and local authorities, and they are also used in this report.

1.3.1 Reporting on learner outcomes

The reports available to schools and local authorities for SNSA 2018 to 2019 provided diagnostic information about each question presented to an individual or group of children or young people. This diagnostic information showed, for each question, the organiser to which it belonged, the skills, knowledge and understanding assessed and the question's difficulty, as well as the individual's or group's outcomes against the question. This diagnostic information provides one piece of evidence to help the education profession identify areas of strength or challenge at the individual learner level or for groups.

Another key feature of the reports for schools and local authorities was information about learners' overall outcomes. Each stage's outcomes were reported in six bands. The outcomes of learners who achieved only a small degree of success on the assessment were reported in the lower bands. Conversely, the outcomes of learners who achieved a substantial degree of success on the assessment were reported in the top bands. These bands were related to regions of learner outcomes on the assessment that were specific for each subject area and stage, and each region for each of the eleven SNSA had a corresponding description unique to that assessment. These descriptions were based on a summary of the skills, knowledge and understanding assessed in the questions included in that assessment in the academic year 2018 to 2019, which, in turn, were aligned with the Benchmarks. The region descriptions for each assessment and stage are shown in Appendix 5: Band descriptions from the 2018 to 2019 individual reports.

On an individual report, the learner's outcomes are located against these descriptions to show the kinds of skills, knowledge and understanding he or she demonstrated in the particular assessment.

The bands have a specific and different meaning for each of the assessments, according to subject area and stage. Accordingly, the dot on an individual's report, locating the learner's outcomes, shows what kinds of skills, knowledge and understanding he or she demonstrated in the particular assessment, rather than any fixed judgement about the learner's aptitude.

For the 2018 to 2019 academic year, the newly established bands corresponding to the SNSA long scale allow comparisons across stages in terms of proportions of learners with outcomes at each band.

The outcomes on the assessment of an individual, a class or a school are intended as one piece of evidence – a fair and objective piece of evidence – in an evaluation of learners' progress in literacy and numeracy. The holistic outcome on the assessment is intended to be used by teachers to corroborate or, sometimes, to raise questions about, other reference points in their overall assessment of a learner's progress.

Figure 1 shows an example extract from an individual report (for a fictitious learner) for the Primary 7 Numeracy assessment for 2018 to 2019. To the left is a scale labelled 6–11, accompanied by the band description text referred to above, for each of the bands. The easiest content is summarised in the paragraph at the bottom and the most difficult summarised in the paragraph at the top.

Figure 1: Example page from an Individual report


1.3.2 Reporting on question difficulty

Just as each learner's overall outcome was expressed as a capacity band, each question in the assessment was also categorised as belonging to a certain band of difficulty. A question categorised in the lower bands of difficulty was one that learners of this age and stage generally tended to answer correctly. A question in the middle bands was one that fewer learners were able to answer correctly, and a question in the top bands was one that relatively few learners were able to answer correctly. The ratings of question difficulty appeared next to a brief description of each question on reports to schools, to support a diagnostic interpretation of the challenge of the questions presented to the individual child or young person.
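Placing a score into one of six bands amounts to comparing it against a set of cut scores on the reporting scale. The sketch below is purely illustrative: the cut scores are invented, and the real SNSA cut scores differ by subject area and stage.

```python
import bisect

# Illustrative only: maps a scale score to one of six reporting bands.
# The cut scores below are invented; real SNSA cut scores differ by
# subject area and stage.
CUT_SCORES = [400, 480, 560, 640, 720]   # five cuts define six bands

def band_for(scale_score, cuts=CUT_SCORES):
    """Return the band number (1 = lowest, 6 = highest)."""
    return bisect.bisect_right(cuts, scale_score) + 1
```

The same mapping applies whether the score being banded is a learner's overall outcome or the calibrated difficulty of a question, which is why learner outcomes and question difficulty can be reported on the same set of bands.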

More detail about the content of the assessments is provided in the subject area sections of this report.

1.4 The second year of SNSA

The 2018 to 2019 academic year was the second year of implementation for SNSA, and 577,385 assessments were completed across Scotland over the course of the year. This number is equivalent to 93.4% of the maximum possible number of assessments available for children and young people in P1, P4, P7 and S3, which is a small decrease on the 2017 to 2018 academic year.

The metric on which SNSA was reported in the 2017 to 2018 academic year was not standardised for the Scottish population. A broad categorisation of attainment as high, medium or low was used in this first year, based on data from international contexts. A more refined scale for each subject was developed towards the end of the first year of assessments. The scale was standardised by drawing on the outcomes from two representative samples of Scottish children and young people presenting for SNSA, assessed in the first and second halves of the 2017 to 2018 academic year. An equating study was conducted with the groups P2, P3, P5, P6, S1 and S2, the stages in between those that form part of SNSA (P1, P4, P7 and S3), in order to equate all SNSA years onto a single scale. These new 'SNSA long scales' have been used for reporting from August 2018 onwards.

The Scottish Government's policy and practice of continuous improvement applies not only to educational attainment but also to the SNSA programme itself. Enhancements to content, reporting, the system delivering the assessments and the professional training courses accompanying the assessments were introduced during the 2018 to 2019 academic year, and further improvements will be implemented over the coming years. An example of this is the 'Outcomes by Academic Year' report (OBAY), which will be introduced during the 2019 to 2020 academic year. This is an additional tool that will support schools and local authorities in considering children and young people's outcomes by presenting results for different academic years alongside each other.


