In June 2021, the Organisation for Economic Co-operation and Development (OECD) published an independent review of Scotland’s school curriculum. This outlined 12 recommendations, which the Education Secretary announced would be accepted in full, and resulted in a programme of educational reform. Professor Ken Muir CBE was appointed as Independent Advisor on Education Reform to provide independent and impartial advice on some of the proposed changes, with his report (and the supporting consultation and survey analysis reports) published in March 2022.
One area highlighted for reform was the inspection requirements on the funded Early Learning and Childcare (ELC) sector. This sector is currently subject to inspections by both the Care Inspectorate and His Majesty’s Inspectorate of Education (HMIE). The Care Inspectorate has powers to inspect all ELC (both funded and unfunded) and regulated school age childcare (SAC) settings. Meanwhile, HMIE has a role in inspecting settings that provide funded ELC. Therefore, the funded ELC sector is currently inspected under two separate frameworks:
- HMIE inspects services against its quality framework: ‘How good is our early learning and childcare?’ (HGIOELC?)
- The Care Inspectorate inspects services against its ‘Quality Framework for Day Care of Children, Childminding and School-Aged Childcare Services’ implemented on 1 June 2022.
In addition, local authorities act as guarantors of quality and are responsible for assessing and monitoring compliance with the National Standard for all funded ELC providers. To fulfil this role, some undertake their own quality assurance visits.
Professor Muir’s consultation work identified extensive criticism of the current system of inspection, where the ELC sector was perceived as being disproportionately subject to external accountability, more so than other parts of the education system, and where there was possible duplication of the roles between the inspectorates involved. He also suggested that there was strong support for a shared framework being developed in the ELC sector as a means of reducing confusion, bureaucracy and workload. As a result, the Muir report recommended that a new education inspectorate body be established (as a result of removing the inspection function from Education Scotland) which should re-engage with the Care Inspectorate to agree a shared inspection framework for early years provision.
The Scottish Government indicated a preference for the focus to be on Professor Muir’s recommendation regarding the development of a shared framework, rather than pursuing the option to create a single inspection body for the ELC sector (as had been suggested by some previous consultation respondents). The Scottish Government argued that the development of a shared inspection framework would not require legislative or structural changes, could be implemented more quickly, and would retain the key expertise and vital functions that each body delivers.
In order to support the development of a shared framework for inspections for the ELC sector, the Scottish Government conducted a public consultation. This was developed in partnership with stakeholders, including the relevant inspectorates. The consultation asked a total of 32 questions, including 13 closed questions and 19 open questions. Feedback was sought on the vision and guiding principles for the framework, the current inspection landscape, and on proposals for a shared quality framework. The consultation ran for 16 weeks, from 11 July to 28 October 2022, and was open to anyone who wished to take part. A range of feedback methodologies was used, including inviting written submissions via Citizen Space (the Scottish Government’s online consultation portal), emails and postal submissions, as well as online events which allowed attendees to provide written feedback on proposals during the sessions.
In total, 255 responses were received to the main written consultation. However, this included one duplicate response, which was removed prior to analysis, leaving 254 substantive responses. Of these, most were submitted via the Citizen Space portal (n=245), while nine respondents submitted non-standard responses via email. In most cases the non-standard responses focused on the consultation questions, and so the data were merged with the main Citizen Space data for analysis purposes. Only two responses took a more general approach and so were analysed separately.
Of the 254 responses provided, 60% (n=152) were provided by individuals compared to 40% (n=102) from organisations. It should be noted, however, that a few organisational responses had been informed by surveys or other input from their members/staff. Indeed, one such response outlined survey findings from over 1,200 childminders. Therefore the total number of people who contributed to the consultation is significantly higher than outlined above.
Respondents were also asked to outline the sector they worked in or the nature of their interest in the topic. Categories were designed to be consistent between individuals and organisations, with comparative analysis carried out at the sector level, as views tended to reflect respondents’ experiences in this respect rather than their individual/organisation status. The numbers in each category are outlined in Appendix A. Most responses came from providers and individuals (each typology comprising 103 respondents, or 41% of the total number of responses). It should be noted, however, that the content of responses from individuals suggested this group also consisted largely of ELC providers.
Those responding on behalf of an ELC setting were asked if they provided funded ELC, with 36% (n=90) indicating that they did. They were also asked if they received inspections from both the Care Inspectorate and HMIE, with 32% (n=81) stating that they did.
Respondents were also asked if they were a parent of a child attending ELC or SAC. Nine percent (n=23) indicated that they had a child attending an ELC setting, while 15% (n=39) had a child attending SAC.
A series of online events (n=11) was also conducted by the Scottish Government, where feedback was sought largely via written comments provided via the chat facility and/or using specialist software (Mentimeter and Slido). Although 13 events were conducted in total, feedback was not gathered at two of these; in those cases, the host organisations used the event to inform their formal written response submitted via Citizen Space. As such, event data detailed in this report is restricted to the 11 events at which feedback/comments were provided during the sessions. The events tended to follow a consistent format, with up to eight questions asked which related directly to those contained within the main consultation document. Appendix B provides details of the individual events.
Across the 11 events where feedback was provided during the session, over 380 people attended. However, the number of attendees was not recorded at one of the events, and only a rough indication could be provided for the others, as some attendees joined late and others left the discussion early. As such, not all attendees were necessarily present at the same time or contributed to all questions. The events ranged in size from roughly nine to roughly 83 attendees.
The analysis of consultation responses and the reporting of the findings was carried out independently by Wellside Research Ltd, a research company contracted by the Scottish Government through a fair and transparent competitive tender process.
All responses were logged into a database and screened to identify any campaign, blank, duplicate or non-valid responses (i.e. where responses were not relevant to the current consultation). No campaign or non-valid responses were identified, and only one blank response was screened out. Feedback was then analysed and is presented under the appropriate sections below.
Closed question responses were quantified and the number of respondents who selected each response option is reported below. Both the raw percentage and the valid percentage are shown (i.e. the percentage of people who responded to each option once the non-respondents had been removed).
Qualitative comments given at each question were read in their entirety and manually examined to identify the range of themes and issues discussed. Analysis was also conducted to identify any differences in views between respondent groups (i.e. between individuals and organisations, organisational sectors, roles, and the different educational stages represented). Recurring themes that emerged throughout the consultation were recorded, and verbatim quotes were extracted in some cases to illustrate findings. Only extracts where the respondent consented for their response to be published were used.
Reporting Conventions and Research Caveats
Findings are generally presented as they relate to each question in the consultation, although the reporting of some questions has been combined where the intent of the questions was similar and elicited significant overlap in responses. As well as providing an overall summary of the common views at each question, the report also highlights where views differed by respondent typology/sector.
It should be noted that many respondents preferred not to identify their sector or reason for interest in the consultation (and indeed were not required to do so at the events). There was, therefore, a large number of ‘individuals’ and unknown affiliations among those at the public events, complicating the sector-based analysis. Based on the content of the responses, however, this group appears to have comprised practitioners, managers and owners from both the ELC and SAC sectors, parents, those working for the inspectorates, and others. There was nonetheless strong agreement between different sectors, and views tended to be replicated across a wide range of respondent types. As such, the lack of sectoral attribution for some respondents did not negatively impact the findings.
Some respondents opted not to answer closed questions but did offer open-ended responses to the same question, meaning that there was not always a direct correspondence between the number of people who supported/did not support a particular statement and the number who gave a qualifying comment. For completeness, all responses were included in the analysis, even where the closed component of the question had not been answered.
While respondents referred to HMIE and Education Scotland interchangeably (and occasionally to Scottish Government inspections), the term HMIE is used throughout this report for consistency and to avoid confusion. Any references to Education Scotland within this report refer to the wider functions of the organisation, while all points related to the inspection function are attributed to HMIE.
Further, although references are made throughout to HMIE, which currently sits within Education Scotland, it is acknowledged that this organisational structure may change as a result of the educational reforms. As the future structure and naming conventions relevant to this are not yet known, any references in this report to HMIE’s role in future inspections should be read as being equally applicable to any new organisation which replaces HMIE or any new organisational structure which will house HMIE going forward.
A thematic analysis approach was taken for all qualitative data submitted, rather than attempting to quantify and attribute open-ended data to codes. As such, no fixed number of responses is provided in relation to the themes and issues discussed; however, an indication of the strength of feeling expressed is generally provided.
As the questions posed at the events were largely consistent with those in the main Citizen Space consultation document, the analysis of both elements has been combined in the following chapters (where relevant). Where there were differences in views, or where issues were more prevalent in the event comments, this has been identified in the narrative. Again, however, there was substantial consistency in views and experiences expressed across all response methods.
It should be noted that respondents were able to participate in the consultation in multiple ways, i.e. submitting a written response and attending and commenting via an event, or attending more than one event, and indeed there was some limited evidence of the former. For example, a small number of the comments submitted at individual events were very similar in nature and wording to submissions via the Citizen Space portal. In addition, it is not possible to know whether other respondents may have changed their views between attending an event and submitting a Citizen Space response (or vice versa). To ensure completeness of the analysis, all input has been considered and included here, but this potential duplication or updating of views should be borne in mind when considering the results.
Finally, the findings here reflect only the views of those who chose to respond to this consultation. It should be noted that respondents to a consultation are a self-selecting group. The findings should not, therefore, be considered as representative of the views of the wider population.