7. Monitoring, evaluation and reporting
This chapter presents the analysis of responses to questions fourteen and fifteen, which focus on monitoring, evaluation and reporting.
- Respondents welcomed the approach to evaluation described in the draft strategy. They supported the commitment to effective, holistic evaluation and reflected on the need for a range of qualitative and quantitative methods to be adopted.
- Many described the importance of involving stakeholders, including the public, in the monitoring and evaluation processes. This included providing input as evaluation participants, and in informing design, gathering data and as agents of change.
- A need for a variety of frequencies of reporting was mentioned by many, to enable comprehensive reporting of progress and achievement at fixed points, supplemented with more frequent and regular updates and communication, as required. Annual reporting was most commonly suggested, with the expectation of full or summary reports on progress and achievements against objectives, as well as targets for the year ahead.
- Several respondents felt a range of reporting formats were required, primarily to engage different audiences and stakeholders. It was felt that reporting should be inclusive, accessible and engaging for all stakeholders in terms of language, presentation and format.
- Respondents suggested a range of reporting mediums including the use of visuals, infographics, and images and using stories, case studies, and quotes. The approach used to share information and updates throughout the COVID-19 pandemic was frequently mentioned as an effective example.
Q14. What are your views on how our progress towards our objectives could be most effectively monitored and evaluated?
The majority of respondents (126/178) answered question fourteen.
Support for monitoring and evaluation activity
Several respondents welcomed the commitment to effective, holistic evaluation described in the draft strategy. They stressed the importance of evaluation in identifying effective approaches, informing learning and change, and evidencing achievements and outcomes.
Approach and methods
Discussion of methods and tools to support evaluation activity was the most common theme in responses to this question. Respondents reflected on the need for a range of qualitative and quantitative methods to be adopted.
Evaluation models, and approaches to quantitative and qualitative data collection, were described. Examples of evaluation models or frameworks included: process and outcome evaluation, appreciative enquiry, participatory appraisal, transtheoretical behaviour change model, Social Multi-Criteria Evaluation and outcome mapping.
Several respondents detailed a range of data collection techniques that could be adopted, including surveys, focus groups, polling, case study research, capturing stories and quotes, the use of interactive online platforms and social media.
Other considerations raised by respondents included the need for monitoring and evaluation activity to be conducted at a local level, with place-based methods or community-led work to complement overarching evaluation activity.
Use of existing monitoring and evaluation systems, approaches and tools
Several respondents suggested using existing methods, approaches and tools, with examples from different sectors highlighted. The Scottish Household Survey (SHS) was most commonly cited, with two references to the Scottish House Condition Survey and one mention of the Scottish Social Attitudes Survey.
Half of those advocating for using existing tools acknowledged that questions would need to be extended or amended to align with objectives and supplemented with other qualitative and quantitative data. One respondent described an absence of young people's views and attitudes in monitoring and evaluation approaches set out in the strategy, noting that SHS only reports on the adult population.
Some respondents shared examples of existing frameworks, tools and approaches that can be learned from or have been successfully utilised across different sectors or organisations. These included a framework for National Standards for Community Engagement, the Active Travel Framework, a net zero transition ladder, the Place Standard Tool, Ipsos Mori's Young People in Scotland survey and online dashboards, such as those used to present data about COVID-19.
A few respondents suggested aligning with the National Performance Framework to track progress and one suggested the monitoring approach should be in line with the second Scottish Climate Change Adaptation Programme, based on ClimateXChange research. Two respondents referred to the existing theory of change, commenting that it should form the basis of monitoring and evaluation activity. One respondent suggested the use of a segmentation model that recognises the distribution of climate change attitudes across different groups, such as the Climate Outreach and More In Common model.
Indicators and measures
Many respondents reflected on the measures and indicators that could be used to demonstrate progress and achievement, describing examples and proposing some new approaches. It is not possible to cover the full scale and variety of measures suggested; instead, they are described in broad categories with illustrative examples:
- Environmental (traffic reduction, atmospheric CO2 levels).
- Health and wellbeing (bike journeys taken, personal debt).
- Behavioural and attitudinal (attitudes toward climate policy, adoption of a no-car policy).
- Cost/cost saving (reduction in power costs, cost savings of home insulation).
Other measures suggested by several respondents included measuring reach, changes in involvement amongst the public, public awareness, support for action on climate change, the capacity for action and child rights indicators.
Several respondents indicated their support for the proposed national indicators, although one suggested that the indicators provided in Low Carbon Scotland: Behaviours Framework do not enable the measurement of progress by businesses, communities or organisations.
Goals and targets
Some respondents emphasised the need for clear and measurable goals/targets to enable effective monitoring and evaluation, with the SMART model of goal-setting mentioned explicitly by a small number. In discussing the SMART model, a small number of respondents described the need for each goal/objective to have its own suite of indicators to measure and understand progress towards achievement. One respondent reflected that the proposed objectives are not measurable, and another called for greater clarity on the actions being taken and what those actions achieve.
Other suggestions identified in single responses included: new, smaller goals and aims that demonstrate shorter-term progress; and for the objectives to focus less on public opinion and more on the efficacy of policy.
Credibility and robust approaches
Several respondents highlighted a need for monitoring and evaluation activity to be rigorous, credible and robust. Suggestions to achieve this included the use of independent organisations or groups such as the Intergovernmental Panel on Climate Change (IPCC), UN, NGOs and research universities to support or undertake monitoring and evaluation activity, for the work to be evidence and/or science-led, and a willingness to be open about successes and about what has not worked or been achieved. Calls for transparency about progress emerged more generally across responses to various consultation questions.
Several respondents called for some form of steering or implementation group to oversee the strategy and associated monitoring and evaluation activity, to provide expertise, advice and accountability. They suggested this group involve representatives from a range of civil society organisations who have a part to play in delivering the strategy's objectives, as well as those with lived experience and representatives of the public and private sector who can advise. A small number reflected on the need to establish a robust baseline position so that change can be effectively measured and communicated.
Involving the public and wider stakeholders
Many respondents described the importance of involving stakeholders, including the public, in monitoring and evaluation processes. Comments in this discussion included providing input as evaluation participants, informing design, gathering data and acting as agents of change.
Other reflections on the value of stakeholder involvement included providing expertise and informing required adaptations or changes needed to achieve the objectives or define new indicators of progress. These comments linked with discussion around the need for a stakeholder steering or implementation group with broad representation. One respondent advocated for the creation of a reference panel with diverse representation, including communities not currently engaged in conversations around climate action and people experiencing poverty and/or with protected characteristics.
Some respondents also called for clear and regular communication with stakeholders, particularly the public, to maintain engagement, increase awareness and prompt action and change. On this theme, a small number urged for ongoing and regular stakeholder consultation.
Less commonly reported themes
A small number of respondents called for regular periods of reflection and review to allow adjustment as required and respond to emerging evidence and learning. One respondent suggested that a patient approach was needed, highlighting that progress can be slow.
The value of comparator and benchmarking measures and data was highlighted by a small number of respondents. This included consistent approaches or a common standard to enable comparative analysis and benchmarking between countries and international partners. Two respondents detailed the value of integrating a system to exchange and share knowledge so that this can be applied more widely.
A small number highlighted the benefit and importance of links between or alignment with other strategies and agendas. Examples included the Environment Strategy, the Climate Change Plan and Circular Economy metrics. One respondent felt there was a role for the Digital Planning Strategy to support the collection and analysis of spatial data.
Finally, one respondent expressed a view that evaluation should not be a priority at the moment.
Q15. How regularly – and in what format – should we report on progress on the strategy?
The majority of respondents (119/178) answered question fifteen.
Many respondents cited a need for a variety of frequencies of reporting, to enable comprehensive reporting of progress and achievement at fixed points, supplemented with more frequent and regular updates and communication, as required. For example, this could include a full and detailed annual report covering progress and achievement across all aspects of the strategy, with shorter monthly or quarterly updates that focussed on specific activities, projects or successes.
Annual reporting was most commonly suggested, with the expectation of either full or summary reports on progress and achievements against objectives, as well as targets for the year ahead. In comments, respondents reflected that annual reports are a standard reporting frequency for most public bodies and allow for sufficient progress between reports. However, a few respondents called for less frequent reporting, ranging from every two to every five years (aligned to the review cycles of Scotland's Climate Change Plans and Scotland's Climate Change Adaptation Programme), though a small number also cited the need for shorter reporting intervals to supplement this.
The second most common theme was calls for quarterly reporting, with smaller numbers suggesting monthly or bi-monthly reporting frequency. Only a small number of respondents provided a rationale for this higher level of frequency. Those that did cited more frequent reporting as demonstrating a commitment, enabling success stories to be highlighted, and providing scope to vary the focus of reporting, share updates on notable projects and offer inspiration. One respondent suggested weekly reporting of air quality around schools on main traffic through-roads; another proposed a knowledge exchange that could be updated almost daily.
The third most common theme was calls for bi-annual reporting, although few provided a rationale for this. One suggested that a bi-annual report could show what work is currently being carried out and indicate whether progress is on track against anticipated full-year targets. Two respondents mentioned this approach would allow a swift response if expected progress was not being met. Another stated that reporting every six months would give a reasonable period of time for progress against objectives to be measured.
More broadly, several respondents asked for reporting to be carried out regularly, continuously, frequently or in a timely manner without stating a specific time period. Some of these made this general point in addition to expressing a defined frequency, most commonly annual or bi-annual; a few reflected that frequency should be influenced by the availability of data, progress made, milestones reached, or successes/outcomes achieved. Within these responses, arguments for more frequent reporting included providing openness and transparency, making the issue normalised and accepted, keeping stakeholders up to date and sharing knowledge and experience. Two respondents suggested that reporting cycles should be aligned with other related strategies or science-based targets.
Several respondents felt a range of reporting formats were required. This included detailed suggestions, as well as those that broadly alluded to a wide variety being needed. Their rationale centred on recognition that a range of formats would be needed to engage different audiences and stakeholders.
The most commonly cited format, mentioned by several respondents, was online/digital, with reference to updates and reporting through social media platforms and websites such as the Scottish Government's website, Adaptation Scotland, and Net Zero Nation. Other online/digital suggestions included blogs, podcasts, and a dynamic scoreboard, dashboard or interactive platform. Emails and digital signs were also mentioned as digital options.
Traditional media such as TV, radio and newspapers were suggested by some respondents, with a few citing that these should be used at both a national and local level. Specific proposals included advertising campaigns, video presentations by a media professional or widely admired celebrity, or live presentations with input from expert panels and audience Q&A. A few mentioned video as a possible reporting format.
The value of in-person formats was highlighted by a small number. This included briefings from the First Minister, meetings in town halls and local council sessions. Opportunities for face-to-face interaction with stakeholders and the wider public and discussion of annual reports, to provide feedback from practitioners and champions, were praised.
While not technically a reporting format, some respondents described the role of formal and informal networks to support reporting and messaging. Specific examples included: community activists; schools; principal communicators of climate change organisations; community networking channels and bodies; regional councils/COSLA; local climate groups; the Sustainable Scotland Network; and Community Councils.
Engaging all stakeholders
Several respondents highlighted that reporting should be inclusive, accessible and engaging for all stakeholders, in terms of language, presentation and format. This linked to the range of mediums mentioned by respondents and included other ideas such as the use of visuals, infographics, and images and using stories, case studies, and quotes. One respondent felt that local reporting should be aligned with a local issue to ensure relevance; another suggested using lived experience stories of local champions.
Suggested reporting themes were shared by some respondents, including progress against objectives/targets, transferable learning, views of the communities the Government is seeking to engage, explicit pledges, information on new initiatives and plans.
Drawing on good practice
Some shared examples they considered effective. Most common was the approach used in the COVID-19 pandemic, sharing easy to understand information regularly via online dashboards, social media and regular briefings. The Made at Uni campaign was given as a good example of sharing success, and the reporting model used for the National Performance Framework was also mentioned. One described a collaborative and responsive platform solution they have developed with partners.