3.1 The scope of the reading/literacy assessment
3.1.1 Reading and literacy for P1, P4, P7 and S3
For P4, P7 and S3, the assessments of reading and writing were delivered separately, while for P1, children were presented with a single assessment combining elements of reading and writing. This is referred to as the P1 literacy assessment. There were two reasons for combining reading and writing at P1. First, literacy skills tend to be quite integrated at this early stage of development (and may be referred to as ‘precursor’ or ‘component’ literacy skills). Secondly, a combined literacy assessment reduced the burden of the assessment, which was an important consideration for the very young children in this year group. The P1 literacy assessment was scaled with the reading assessments for higher stages, using the same set of curriculum organisers, and is therefore discussed in this section in conjunction with the results of the P4, P7 and S3 reading assessments.
The P1 literacy assessment comprised both stand-alone questions and units: groups of questions focusing on a single stimulus text. At P4, P7 and S3, all the reading questions were grouped into units of four or five questions, to economise on the reading load. Using this unit structure, questions of differing difficulty and covering different organisers could be asked with reference to the same text.
3.1.2 Alignment with Curriculum for Excellence
In the SNSA academic year 2017 to 2018, the assessments of reading and P1 literacy were based on elements of Curriculum for Excellence (CfE), as articulated in the literacy elements of the Benchmarks: Literacy and English, published as a draft in August 2016. It should be noted that for the academic year 2018 to 2019, the final version of the Benchmarks (published in June 2017) is used as the reference point for the assessments.
3.1.3 A note on texts used in SNSA reading assessments
For SNSA reading, a broad definition of texts was used, in line with the statement in Benchmarks: Literacy and English (draft August 2016): ‘Challenge in literacy … involves engaging with a wide range of increasingly complex texts which are suitable to the reading age of each learner.’ In SNSA, this range includes narrative fiction and non-fiction, description, exposition, argument and instructions. A further dimension to the definition of texts in SNSA reading relates to format, as described in Curriculum for Excellence: Literacy and English, Principles and Practice: ‘Texts can be in continuous form, including traditional formal prose, or non-continuous, for example charts and graphs.’
3.2 Coverage of the Curriculum for Excellence: benchmarks and organisers
SNSA are just one part of the range of assessments that teachers use in making their evaluations of children’s and young people’s learning. As a standardised assessment to be completed within a limited time, and using questions capable of being scored automatically, only some parts of the specified reading benchmarks could be addressed. In consultation with Scottish literacy experts, it was agreed that the reading and P1 literacy assessments should be based on the organisers Tools for reading (TFR), Finding and using information (FUI), and Understanding, analysing and evaluating (UAE). Each of the questions selected for inclusion in SNSA reading and literacy assessments for the academic year 2017 to 2018 was aligned with a benchmark statement from one of these organisers.
Although all three organisers are represented in the P1, P4, P7 and S3 reading assessments, there were different proportions across the year groups. In the reports provided to schools, teachers received information about organiser-level capacity if the learner was presented with at least five questions from the organiser. Similarly, in this report, results for organisers that were addressed by at least five questions in the year group’s full set are analysed. The organisers included in the reports are shown by stage, in Table 1.
Table 1: Reporting organisers for reading by stage, academic year 2017 to 2018
|Primary 1|Tools for reading; Understanding, analysing and evaluating|
|Primary 4|Finding and using information; Understanding, analysing and evaluating|
|Primary 7|Finding and using information; Understanding, analysing and evaluating|
|Secondary 3|Finding and using information; Understanding, analysing and evaluating|
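The five-question reporting threshold described above can be expressed as a minimal sketch. The function name, data structure and organiser counts below are illustrative only, not taken from the SNSA reporting system:

```python
# Minimal sketch of the reporting rule: an organiser is reported for a
# learner (or analysed for a year group) only if it was addressed by at
# least five questions. Names and example counts are hypothetical.
MIN_QUESTIONS = 5

def reportable_organisers(question_counts: dict) -> list:
    """Return the organisers that meet the five-question threshold."""
    return [org for org, n in question_counts.items() if n >= MIN_QUESTIONS]
```

For example, a learner presented with three Tools for reading questions but seven Finding and using information questions would receive organiser-level information for the latter only.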
The following sections provide more information on each of the reading organisers in SNSA 2017 to 2018, along with some example items. These items are not used in SNSA 2018 to 2019, and they will not be used in future Scottish National Standardised Assessments.
3.2.1 Tools for reading
In the P1 assessment, this organiser comprised questions related to phonological awareness, word decoding and word recognition; in the assessments for the higher year groups, assessment content mainly referred to learners’ use of strategies to work out the meaning of words. The P4, P7 and S3 assessments contained relatively small numbers of questions from this organiser in the academic year 2017 to 2018.
Figure 10 shows a typical question from the 2017 to 2018 academic year P1 literacy assessment, which reflects the organiser Tools for reading. It is designed to assess children’s knowledge of sounds (phonological awareness). Note that the ‘mouth’ icon indicates to the learner that there is a voiced component to the question, enabling the child to listen to the instruction and response options. In this case, clicking on the icon prompts a reading of the question text to the child, including the answer options.
Figure 10: Example of a P1 Tools for reading question, ‘Rhyming word – feather’
This question asks the child to identify aurally presented rhyming words, a skill which, in English, is a key precursor to mastery of reading. While children might have used decoding skills to read the words ‘weather’, ‘fairy’ and ‘baker’, they are not expected to do so; this is a question designed to assess whether children can hear and recognise rhyming sounds.
Hence, the word ‘feather’ is only presented aurally, with pictorial support. This means that the focus of the question is unambiguously on phonological awareness rather than decoding. This question was classified as of medium difficulty. It was presented to children who were progressing relatively successfully through the assessment, and of these learners, the majority was able to answer it correctly.
Figure 11 shows another example of a P1 Tools for reading question.
Figure 11: Example of a P1 Tools for reading question, ‘Select the word’
This question draws on a child’s word recognition skills, which is part of very early literacy development. It required learners to identify a word from a set of options including numbers and symbols. An audio prompt provided support to the onscreen instruction. In this case, learners were not required to decode the word, or demonstrate understanding of its meaning. This question was answered correctly by almost all the learners presented with it in the assessment.
3.2.2 Finding and using information
This organiser focuses on the critical literacy skills of locating information in a text and employing the information to meet a purpose. These skills are often applied in the context of non-fiction texts but can also be applied to fiction. In SNSA, questions for P1 and P4 learners that corresponded to this organiser generally focused on finding information that was literally stated, or required a low level of inference (for example, recognising synonyms linking the question with the text). More advanced questions addressing similar skills – for P7 and S3 – were likely to be applied to longer and more complex texts. At P7, the organiser Finding and using information also included questions requiring learners to sort information in a text into relevant categories. The S3 reading assessment presented Finding and using information questions that asked young people to find key information in one or more texts, or to make connections between the information they located, sometimes across more than one text.
Figure 12 shows a typical question from the P4 reading assessment for academic year 2017 to 2018, from the organiser Finding and using information. The stimulus for this question is a narrative fiction text of typical length within the context of the P4 reading assessment, and mainly uses relatively simple vocabulary and language structures. The question presented here required children to find information in the fiction text.
Figure 12: Example of a P4 reading text with a Finding and using information question, ‘Cawky Question 1’
This question asks the child to locate a paraphrased detail in the text. The information is located near the beginning of the text, and uses familiar synonymous language as support to help the child link the information in the text to the correct answer. The word ‘screeching’ relates to the answer option ‘sound’. The onomatopoeic word ‘Caw’ is shown to be a sound both through the introductory ‘screeching’ and the use of quotation marks. Finally, ‘Caw’ has a clear link to the name Cawky in the question stem. This question was classified as having low difficulty for P4 learners and was answered correctly by most learners who were presented with it.
A more difficult question from the Cawky unit is presented below, in the section on Understanding, analysing and evaluating.
Figure 13 shows a Finding and using information question from the P7 reading assessment for 2017 to 2018. The stimulus for this question is a descriptive, non-fiction text. While the text is still relatively short, in comparison with the P4 example text in Figure 12, it uses more complex vocabulary, including some technical terminology, and the sentences are longer and use more complex structures, providing children in P7 with greater challenge.
Figure 13: Example of a P7 reading stimulus text with a Finding and using information question, ‘Aphids Question 4’
The question presented here required children to select relevant information from a non-fiction text. Despite the very clear link between the key term ‘migration’ in the question prompt and ‘migrate’ in the text, which draws the learner to the location of the relevant information, in other ways this question is considerably more difficult than the P4 example provided in Figure 12. It is not only the text that is more complex, but also the question itself, which relies on more sophisticated skills and understanding than the simple synonymous matching in ‘Cawky Question 1’. Although the text quite clearly states that aphids only migrate if they find themselves in unfavourable conditions, learners must infer from this that they otherwise stay where they are. Alongside the need to interpret, a second challenge in this item comes from the need to negotiate strongly competing details, both within the same sentence and later in the text. The fourth option in particular proved a strong distractor for learners, possibly because the information about enemy attacks was very close to the correct answer in the text, and because of the repeated reference to ‘enemies’ at the end of the text.
This unit was only seen by children if they did relatively well in the first phase of the assessment and the question was classified as having high difficulty. Of the learners presented with this question, a large minority answered it correctly.
A less difficult question from this unit is presented in the discussion of the organiser Understanding, analysing and evaluating which follows.
3.2.3 Understanding, analysing and evaluating
The essence of this organiser is comprehension, beginning with word- and sentence-level texts (for learners at P1), with progressively longer and more complex passages of text providing greater challenge across all the reading assessments. While questions for the P4 assessment tended to focus on main or prominent ideas, learners at P7 and S3 were asked to answer a range of literal, inferential and evaluative questions that might, for example, require them to distinguish between fact and opinion, recognise persuasive language, use evidence from a text to support answers, or evaluate the reliability and credibility of texts.
Figure 14 is an example of a P1 question from the organiser Understanding, analysing and evaluating. It assesses reading comprehension at sentence level. In this kind of question, the child chooses an answer by clicking on a word in the sentence. This skill was modelled in the practice assessment.
Figure 14: Example of a P1 Understanding, analysing and evaluating question, ‘Sentence comprehension’
This item was rated as having high difficulty. While the instruction (‘Read the sentence below … ‘) can be read to the learner using the audio button, the sentence itself has no audio support. In contrast to the Tools for reading question ‘Rhyming word – feather’ presented in Figure 10, the child is required to read independently. As well as being able to decode the words, the child needs to track the pronoun reference ‘she’ and interpret the meaning of ‘but’. This question therefore relies on understanding information, rather than just finding and using it. The majority of P1 children presented with this question was able to complete it successfully.
It can be seen that the example question shown in Figure 14 requires the child both to decode the words (that is, to read independently) and to understand the meaning of the sentence. Another approach to assessing the development of reading comprehension at the earliest stages is to present written texts orally. This is because young children may have higher skills in comprehension than their decoding skills allow them to demonstrate.
Accordingly, at P1, a combination of written texts with audio support and without audio support was used to assess the skills, knowledge and understanding associated with the organiser Understanding, analysing and evaluating.
Figure 15 shows a P4 Understanding, analysing and evaluating question. This was the last of the five questions related to the text Cawky, which is included in Figure 12 above.
Figure 15: Example of a P4 Understanding, analysing and evaluating question, ‘Cawky Question 5’
This question asks children to identify the main idea in a narrative, one of the key skills included in Understanding, analysing and evaluating in the P4 reading assessment. This question, rated as having high difficulty, was the most challenging question asked about the Cawky text. The most commonly chosen incorrect answer was the second option, likely because of the explicit references to friendship at both the beginning and end of the passage. A large minority of the learners who saw this question answered it correctly, but the majority of those who demonstrated higher overall capacities on the P4 reading assessment responded successfully.
Figure 16 shows a P7 reading question that addresses the organiser Understanding, analysing and evaluating. It was the third question presented to children about the information text Aphids, presented in Figure 13 above.
Figure 16: Example of a P7 Understanding, analysing and evaluating question, ‘Aphids Question 3’
This question asks children to identify the nature of an unusual relationship in a scientific text. To answer this question successfully, children must understand and synthesise information contained across all sentences in the final paragraph of the text, before evaluating the relationship that is suggested. This question was rated as having medium difficulty in the assessment. It was presented to learners who had done relatively well in the first phase of the assessment and was answered correctly by the majority of these learners.
Reflecting the importance of the skills in this organiser within CfE, the majority of questions in the S3 reading assessment for 2017 to 2018 focused on Understanding, analysing and evaluating. Like the reading assessments for P4 and P7, the texts used for the S3 assessment covered a range of text types, contexts and topics, from narrative through to information or persuasive texts, and fiction through to scientific texts or blogs. As would be expected, the texts for S3 were generally longer and more complex than for the lower stages. The text in Figure 17 is an example of a typical text for S3.
Figure 17: Example of an S3 Understanding, analysing and evaluating text, ‘Shill Reviewing’
Shill Reviewing is an example of a non-continuous information text that includes prose, graphical information and a list, and which also includes persuasive language. Although the vocabulary is non-technical and the sentence structures are relatively simple, the different text formats and the overall argument must be integrated by the reader, providing a higher degree of challenge than the texts presented previously in this report.
Figure 18 shows a Shill Reviewing question reflecting the organiser Understanding, analysing and evaluating.
Figure 18: Example of an S3 Understanding, analysing and evaluating question, ‘Shill Reviewing Question 4’
Questions in this organiser at S3 asked young people to demonstrate skills such as interpreting main ideas or details of a text, comparing or contrasting elements within it, or reflecting on its audience or purpose. In Shill Reviewing Question 4, an inference must be drawn by comparing and contrasting the range of views expressed in the text. This question was rated as having high difficulty and a minority of those learners to whom it was presented answered it correctly.
In contrast, almost all learners presented with the question shown in Figure 19 below answered it correctly, including a majority of learners who demonstrated lower capacities in the S3 reading assessment overall. This item was also from the Understanding, analysing and evaluating organiser.
Figure 19: Example of an S3 Understanding, analysing and evaluating question, ‘Shill Reviewing Question 2’
3.3 National results for reading
3.3.1 Overall capacity
Chart 10 shows the overall capacity for reading across all four year groups (P1, P4, P7 and S3) and capacity in relation to the three reading/literacy organisers Tools for reading (TFR), Finding and using information (FUI) and Understanding, analysing and evaluating (UAE). Regions show high, medium and low capacity, in line with SNSA reports for the 2017 to 2018 academic year, which are specific to each year group.
Chart 10: Reading capacity by SNSA year
At P1, nearly half of the children showed high capacity, with most of the others showing medium capacity. Very few children showed low capacity. A majority of children in P1 showed high capacity with regard to the organiser Tools for reading, while for Understanding, analysing and evaluating, the highest proportion (somewhat less than half) of children showed medium capacity. Across the two organisers, few children were found in the low capacity groups.
At P4, a majority of the learners demonstrated high capacity, while lower proportions showed low and medium capacity. For the two organisers (Understanding, analysing and evaluating and Finding and using information), the majority of children demonstrated high capacity, with broadly similar proportions in the medium and low capacity categories.
The majority of learners at P7 demonstrated high capacity on their reading assessment, while only relatively small proportions demonstrated low capacity. There was a slightly larger proportion of learners with high capacity for the Understanding, analysing and evaluating organiser compared with proportions for Finding and using information.
At S3, overall, about half of the learners demonstrated high capacity. There was a somewhat lower proportion in the high capacity region when considering only the organiser Finding and using information, compared to the proportion of learners with high capacity in relation to the organiser Understanding, analysing and evaluating.
Evidence from educational research suggests that learning growth differs according to the stage of schooling. Bearing in mind that individuals experience growth spurts and plateaus in different ways, research indicates that, on average, younger learners tend to advance in their learning more rapidly than older ones, for whom progress typically continues but at a slower pace. The two SNSA norming studies carried out in Scotland in the academic year 2017 to 2018, in the first and second halves of the academic year, confirmed that these findings also hold for reading and literacy acquisition among Scottish learners.
Chart 11: Reading capacity across norming study time periods
Chart 11 shows that between November and March, in all four year groups, there were increases in the proportions of learners who showed high capacity. This increase was most prominent among learners at P1, with regard to their literacy development, while among learners at P4 there were relatively smaller but still noticeable increases in the proportion of learners with high capacity. These increases between the first and second norming study in high capacity learners were somewhat less marked at P7 and S3.
3.3.2 Gender
Chart 12 shows the distribution of literacy and reading capacity among boys and girls for all SNSA year groups, overall and when considering each of the two organisers.
Chart 12: Reading capacity distribution by gender and SNSA year
For all year groups, capacity among boys and girls was similar in that the largest proportions of learners were found in the high capacity category, both overall and when considering each of the organisers separately. However, there were differences worth noting between the two gender groups, with consistently larger proportions of girls than boys demonstrating high capacity. At P1, while for both organisers there were somewhat larger proportions of girls in the high capacity region than boys, these differences were smaller when considering the organiser Understanding, analysing and evaluating.
At P4, a notably larger proportion of boys than girls demonstrated low overall capacity and a parallel difference was evident in the high capacity region, where the proportion of girls was larger than boys. These results were also observed when considering separately the two organisers Finding and using information and Understanding, analysing and evaluating.
At P7, larger proportions of girls than boys demonstrated high capacity, although the differences were somewhat smaller than among learners at P4. This finding holds overall and when considering the two organisers separately.
At S3, there was also a notably larger proportion of boys than girls demonstrating low capacity overall, while larger proportions of girls than boys demonstrated a high capacity. This gender difference was more marked with regard to the organiser Understanding, analysing and evaluating. However, the proportions of girls and boys demonstrating high capacity were similar with regard to the organiser Finding and using information.
3.3.3 Scottish Index of Multiple Deprivation
Chart 13 shows the distribution of learners for all SNSA year groups across categories reflecting the Scottish Index of Multiple Deprivation (SIMD). To simplify the display of results and aid their interpretation, we used three categories to indicate levels of socioeconomic background, namely: 1–4, indicating the bottom socioeconomic quintile (that is, the most deprived children and young people, those in vigintiles 1 to 4); 5–16, indicating the three middle quintiles (that is, those in vigintiles 5 to 16); and 17–20, indicating the top quintile (that is, the least deprived children and young people, those in vigintiles 17 to 20).
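The vigintile-to-category grouping described above can be sketched as follows. The function name and return labels are illustrative, not taken from the SNSA analysis; only the band boundaries come from the text:

```python
def simd_band(vigintile: int) -> str:
    """Map a SIMD vigintile (1 = most deprived, 20 = least deprived)
    to one of the three reporting categories used in this chapter.
    Illustrative sketch only; names are hypothetical."""
    if not 1 <= vigintile <= 20:
        raise ValueError("SIMD vigintiles run from 1 to 20")
    if vigintile <= 4:
        return "1-4"    # bottom quintile (most deprived)
    if vigintile <= 16:
        return "5-16"   # three middle quintiles
    return "17-20"      # top quintile (least deprived)
```

Because a quintile spans four vigintiles, the three bands correspond to the bottom quintile, the middle three quintiles and the top quintile respectively.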
Chart 13: Reading capacity distribution by SIMD and SNSA year
At each year group, it can be seen that the proportions of learners with high capacity were much larger in the SIMD category reflecting higher socioeconomic status (that is, less deprivation), while higher proportions of learners with low capacity were found in the SIMD category reflecting lower socioeconomic status. This pattern was present in all four year groups, overall and when considering only assessment items corresponding to each of the two organisers at each year group.
Chart 13 illustrates that the difference in literacy and reading capacity between children from the bottom quintile and the top quintile of SIMD was relatively small at P1, while it was more substantial at P4 and P7, and greatest at S3. These observations apply both to overall capacity and to each of the reading organisers.
At P1, more children in the bottom quintile than in the top quintile demonstrated capacity in the low region, while larger numbers of children in the top quintile showed high capacity. These differences were somewhat larger when considering only assessment items pertaining to the organiser Tools for reading.
At P4 and P7, a similar picture emerged, but with a more marked difference than at P1 in the distribution of results between children in the bottom and top quintiles. More children from the bottom quintile showed capacity in the low region, whereas larger numbers of children in the top quintile demonstrated high capacity. Generally, fewer than half of learners from the bottom SIMD quintile showed high capacity at P4, compared with majorities among learners in the top SIMD quintile. Among both P4 and P7 learners, the outcomes were in broadly similar proportions when considering assessment items related to each of the two organisers, Finding and using information and Understanding, analysing and evaluating.
At S3, there was also a relatively large difference in the distribution of results when comparing learners within the bottom and top quintiles of SIMD. There were notably higher proportions of learners from the bottom quintile with low capacity, and larger proportions of learners in the top quintile attaining a high capacity. The differences in the proportions of learners with high capacity among learners in the bottom and top quintiles of SIMD were somewhat larger for the organiser Understanding, analysing and evaluating than for the organiser Finding and using information.
3.3.4 Ethnic background
This section looks at differences in reading capacity between learners with ‘White Scottish’ and other ethnic backgrounds. Chart 14 shows the reading/literacy results for these two groups.
Chart 14: Reading capacity distribution by Ethnic background and SNSA year
The results show that generally there were minor differences across the two comparison groups, both in terms of overall capacity and when considering results for each of the two organisers. The very small differences in proportions of children with high capacity across the two groups tended to be in favour of children from ‘White Scottish’ backgrounds at P1, and in favour of other ethnic backgrounds at P4, P7 and S3.
3.3.5 Free School Meal Registered
Chart 15 shows the reading/literacy capacity of learners according to groups defined by registration for free school meals. This chart distinguishes those with registered entitlement from all other learners.
Chart 15: Reading capacity distribution by Free School Meal Registered and SNSA year
At each of the four year groups, there were noticeably larger proportions in the high capacity group among learners without free school meal entitlement (FSE) than among those with FSE, and correspondingly higher proportions in the low capacity group among learners with FSE. This pattern was similar when considering results by organiser.
At P1, overall, about half of the children without FSE showed high capacity, while a minority of those with entitlement demonstrated capacity in this category. Differences were also similar when considering each of the two organisers Tools for reading and Understanding, analysing and evaluating.
At P4 and P7, even larger differences between the two groups were observed, with similar differences in the proportions at high and low capacity when reviewing results by organisers.
Among learners at S3, the differences between those at high capacity, both with and without FSE, were somewhat smaller (but still notable) when considering only the organiser Finding and using information.
3.3.6 Additional Support Needs
Chart 16 shows the proportions of learners with high, medium and low reading/literacy capacity across SNSA year groups, according to whether or not learners were identified as having Additional Support Needs (ASN).
Chart 16: Reading capacity distribution by ASN and SNSA year
Across all year groups, it can be seen that the proportions with high capacity were notably larger among learners with no ASN and, similarly, that there were larger proportions with low capacity among learners with ASN.
At P1, differences between the two groups were observable, but not as large as for other years. While, overall, about half of the learners without ASN showed high capacity, a minority of those with ASN fell into this category. The differences between the proportions with high capacity among the two groups were somewhat smaller with regard to the organiser Understanding, analysing and evaluating.
At P4 and P7, the majority of learners without ASN demonstrated high capacity, while only half or less than half of the learners with ASN had results corresponding to the high capacity range. This was the case with regard to their overall capacity and also when considering the two organisers separately. At S3, similar differences were found, although for the organiser Finding and using information the differences between groups were slightly smaller.
3.3.7 Looked After Children at Home and Looked After Children Away from Home
Chart 17 shows the proportions of learners with high, medium and low capacity in reading/literacy by categories of Looked After Children at Home (LAH) and Looked After Children Away from Home (LAA), in comparison with other learners, as classified within SEEMiS.
Chart 17: Reading capacity distribution by LAH/LAA and SNSA year
Across all four year groups, it can be seen that among those learners registered as LAH or LAA, there were notably lower proportions demonstrating high capacity than among other learners. Similarly, among LAH and LAA learners, there were higher proportions with low capacity when compared with other learners.
At P1, differences among the two comparison groups were less pronounced than at higher year groups (P4, P7 and S3).
3.3.8 English as an Additional Language
Chart 18 shows the reading/literacy capacity of learners according to groups defined by language background: English as an Additional Language (EAL) and all other children and young people. The category ‘Yes’ refers to those learners whose record in SEEMiS, the national database, showed that they had English as an additional language. The ‘Other’ category comprises both learners for whom there was a ‘No’ as the entry for EAL, and those for whom there was no entry in this field.
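The grouping rule for the EAL field described above can be sketched as follows. The field values shown are assumptions based on the description in this section, not the actual SEEMiS coding:

```python
def eal_group(record_value) -> str:
    """Group a learner by the EAL field as described above: a 'Yes'
    entry forms the EAL group, while 'No' entries and missing entries
    (None or empty) are grouped together as 'Other'.
    Illustrative sketch; values are assumed, not verified SEEMiS codes."""
    return "EAL" if record_value == "Yes" else "Other"
```

Treating missing entries the same as ‘No’ mirrors the definition of the ‘Other’ category given in the text.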
Chart 18: Reading capacity distribution by English as an Additional Language (EAL)
When comparing proportions of high, medium or low capacity between groups of learners for whom English is an additional language (EAL) and those with English as their first language, we observe in Chart 18 relatively small but notable differences in favour of learners in the latter group, both in terms of overall capacity and when considering individual organisers. Similar patterns were observed across all four year groups.
At P1, differences between the two groups were slightly larger than in higher years (P4, P7 and S3). When comparing proportions separately for the two organisers, similar differences between the two groups were observed, with learners with English as their first language showing somewhat larger proportions with high capacity.