Improving outcomes for children, young people and families: review of Children’s Services Plans and strategic engagement activity

Summary review of children’s services plans for 2020 to 2023, in line with the Children and Young People (Scotland) Act 2014, statutory guidance part 3. It highlights key strengths, areas for development and details from strategic engagement with local children’s services planning strategic leads.


10. Use of Data and Evidence (Criteria 3, 4 and 13)

This section of the report discusses the use of data and evidence[17] in the development and evaluation of Children's Services Plans.

Use of Data in Strategic Planning (Criteria 3 and 4)

Criterion 3 considers whether the CSPP has developed its Plan through a robust, evidence-based joint strategic needs assessment (JSNA) of the population of children, young people and families in the local area. Criterion 4 refers to whether each CSP has included analysis of quantitative and qualitative evidence and data relating to both service performance and child wellbeing.

An overview of key statistics on the population of children, young people and families was helpful in setting out the local context and needs, highlighting achievements from delivery of the 2017-2020 Children's Services Plan, and identifying any new or changing needs which had emerged. A key aim of using data and evidence from the JSNA and other sources is to identify emergent needs and areas for improvement, and so to inform the choice of specific strategic priorities and actions through a clearly evidenced rationale.

18 out of 30 CSPs included a JSNA. Eight CSPs could have been improved through provision of additional information on the local needs of children and families, and four CSPs did not make any reference to an assessment at all.

The following were identified as examples of best practice in relation to Criterion 3:

  • A section of the Plan which clearly presented a JSNA and described how this was conducted, together with data and evidence on the local area, population of children, young people and families, key demographics, and the current delivery context
  • Quantitative and qualitative data from several sources, such as surveys, consultations, focus groups, engagement events
  • A short summary of key findings from the JSNA
  • Data and evidence on groups of children and young people with specific needs (for example Gypsy/Travellers, care experience, complex health needs or disability, young carers)
  • Clear links between the data included, achievements/milestones of progress, and identified areas for improvement activity, informing the identification of the strategic priorities for the Plan
  • Presentation of data and key findings in a user-friendly way – graphs and tables where appropriate, with comparisons between local and national averages or changes in indicators of wellbeing over time, showing whether each area was improving outcomes for children and families or whether further improvement was required
  • Explanations of data trends, indicators used, and data sources.

Focusing on Criterion 4, Part 3 of the Guidance asks Children's Services Planning Partnerships to include a section with data relating to service performance and child wellbeing, including evidence of progress made against national and local objectives to improve outcomes for all children and young people. This also includes a summary review of the previous Children's Services Plan (2017-2020) and findings from self-evaluation, audit and inspection on service performance. The aim of collaborative use of data and evidence is to enable the CSPP to demonstrate improved outcomes and successes, alongside identifying areas where further action and/or development is needed.

More than half of the CSPs did not provide enough information to fully satisfy this Criterion, with only 13 CSPs meeting it in full. 12 CSPs needed to strengthen their use of data by providing additional detail, and five Plans did not provide robust data and evidence.

CSPs that met this Criterion well included a section with data on key areas of wellbeing, such as education, early years, health and additional support needs, and used this as a basis for understanding local needs and reviewing provision of services and supports to children and families over the period of the Plan.

An example of good practice was provision of clear information on what services were being offered over the period of the CSP to meet identified local needs, and on their impact on outcomes, using statistics and quotes from service providers and from children, young people and families about their lived experiences. Another example of best practice was discussion of each strategic priority alongside local data and evidence showing why it was important and what it aimed to achieve. Some CSPs included a summarised review of progress made in delivery of their 2017-2020 Plan, highlighting achievements and areas identified for further improvement.

Robust Plans presented their data in a clear and user-friendly way, with specific timeframes, data sources, graphs and tables where appropriate, explanations of data trends and indicators used, and benchmarking comparisons across national and local performance and for the CSPP area over time. Another example of best practice, noted in three CSPs, was to link data and evidence with aspects of wellbeing (across SHANARRI) and with relevant children's rights.

The Children's Services Plans that needed strengthening through inclusion of additional information (12 out of 30) would have been improved by providing detail of the following:

  • A summary review of the 2017-2020 Children's Services Plan (eight CSPs did not include this)
  • Evidence of service performance (eight CSPs missed this information)
  • Clearer links between local data and evidence, and the selection of strategic priorities (nine CSPs did not include this)
  • More quantitative and qualitative data to identify emergent needs of children and families, and identify areas for improvement.

Good Practice Examples

East Ayrshire: The Plan includes a section on the successes of the 2017-2020 CSP, presented using data and evidence, including statistics and testimonials from young people, as well as active links to case study examples of improvement work undertaken during 2017-2020. The section is clear and user-friendly. Another section presents findings from engagement events and data in order to identify the main challenges which led to the development of the CSP.

Orkney: Orkney’s CSP demonstrated strong use of data and evidence to underpin its strategic priorities. The Plan used local data to explain the rationale for each priority and the services to be delivered, highlighting the areas where Orkney has made good progress in improving outcomes, as well as the areas that needed further development. Data and evidence from engagement events and consultation with young people were included, linked to wellbeing across SHANARRI, and a set of measurable indicators is in place to monitor progress of priorities and actions. Data were presented in a clear and user-friendly way, with a direct link to the services and actions contained in the Plan.

Stirling: Stirling’s CSP showed excellent use of data and evidence, which were discussed in several sections of the Plan. A multi-agency working group is responsible for the joint strategic needs assessment, to provide data which helps the CSPP prioritise action. Stirling’s Plan included a section with key findings from the JSNA, a section evaluating progress made over the period of the 2017-2020 Plan (using outcome measures), and two sections which summarised findings from local engagement activity (focus groups, interviews, lived experiences) with children, young people, families, and professionals. It also presented data from local surveys, alongside data on children’s wellbeing, and service performance information.

The Plan included statistics on Stirling’s population and specific figures on outcomes for children in poverty, in need of protection, with care experience, affected by disabilities, and in need of support with mental health and mental wellbeing. Where no local data were available, Stirling drew on national data to identify groups of children and young people with poorer outcomes and considered this within the local context. An online appendix provides further detail on how the CSPP gathered data and evidence through the JSNA. This detailed evaluation of service impacts and engagement with professionals, children and families showed how the evidence had informed the development of Stirling’s Plan. The appendix included a very clear table which showed outcome indicators from the 2017-2020 Plan and highlighted where progress had been made, and where it had not. This concise and clear appendix made Stirling’s Plan very robust by being explicit about its evidence base.

Use of Data in Monitoring Progress of Plans (Criterion 13)

Part 3 of the Guidance indicates that clear indicators of progress should be detailed to support the CSPP to monitor and evaluate the effectiveness of any action and service delivery contained within the Children's Services Plan.

Just over half the Plans fully met this Criterion (17 out of 30). One CSP had some measures of progress, but needed to build on this by including further detail, and 12 CSPs did not contain clear progress indicators or performance measures.

A robust CSP should include a set of measurable indicators which enables the CSPP to monitor the progress of each strategic priority and any aligned actions. This should also provide information on how the CSPP plans to monitor and evaluate the Plan overall, including data on the performance and impact of children's and adult services.

The tools for monitoring and evaluating services that were mentioned in the CSPs reviewed included:

  • (Joint) Self-evaluation activity
  • Quality Assurance Framework and internal audit
  • Quality Improvement Methodology (QI) utilising support from the Children and Young People's Improvement Collaborative (CYPIC)
  • External audit, scrutiny, and service/thematic inspections
  • Annual reporting/periodic performance monitoring
  • Qualitative and quantitative data and evidence, including feedback from children, young people and families and service providers, analysis of administrative data, surveys and consultations
  • Logic models
  • Scottish Government's Three Step Improvement Framework for Public Services.

18 out of 30 CSPs included a set of measurable indicators to monitor the progress of their strategic priorities. Best practice included:

  • Tables breaking each strategic priority down into concrete aims and actions, with one or more measurable performance indicators for each
  • Measurable indicators aligned with aspects of wellbeing across SHANARRI
  • Baseline, current performance, and target percentage or intended change for each measurable indicator
  • Inclusion of data source(s) for each indicator.

Good Practice Examples

Edinburgh: Edinburgh’s Plan included a clear section on measuring success, describing ongoing collection of feedback from children, young people and their families, periodic reporting to the CSPP from delivery groups, and a set of population measures relating to the CSP’s aims. This presented measurable indicators of progress for each of the three aims of Edinburgh’s Plan, including the current baseline percentage. Nine indicators were used in total for the high-level aims, which is a manageable set.

East Renfrewshire: East Renfrewshire’s Plan included 31 measurable indicators to monitor different elements of the Plan’s success and impact. These measures were listed under each of the Plan’s priorities, with clear detail of how East Renfrewshire CSPP will also measure progress via analysis of local improvement and evaluation activity, as well as feedback from children, young people and their families. The Plan also included two dedicated sections providing information on local evaluation of implementation activity and on measuring success.

Fife: Fife’s Plan included a list of measurable indicators, presented in a clear and robust way. Each indicator was accompanied by current Fife performance, a benchmark and an improvement goal (all as percentages). This way of presenting the indicators reflects best practice. The list included eight indicators, which is a manageable number. The Plan described how other methods of evaluation will also be used to supplement this, such as a quality improvement approach, feedback from children, young people, families and staff, the Scottish Government’s Three Step Improvement Framework, and the 4DX Methodology.

Contact

Email: C&F.StrategicEngagement@gov.scot
