Building standards - verification service: customer experience evaluation - future model

This research identified and proposed a preferred model which the Scottish Government (Building Standards Division) could use to deliver the national customer survey for building standards.

1. Introduction

Building standards in Scotland

1.1 The Building Standards system in Scotland was established as a result of the Building (Scotland) Act 2003. Its remit is to protect the public interest by setting out the standards to be met when building or conversion work takes place, to the extent necessary to meet the building regulations.

1.2 The principal objective of the verification system in Scotland is to protect the public interest by ensuring compliance with the Building (Scotland) Regulations 2004. Verifiers are appointed by Scottish Ministers and the Act provides for a variety of verifiers should this be required. At present the only appointed verifiers are the 32 Scottish local authorities, each covering their own geographical area.

1.3 Building standards verifiers work in accordance with the Terms of Appointment; Operating Framework; and Monitoring and Auditing Requirements.

1.4 The Operating Framework (2021) comprises three strands:

  • Integrity and Operational Resilience;
  • Administration of Building Warrant Applications and Completion Certificate Submissions; and
  • Maintenance of Records to facilitate effective business operation and periodic audit by the Scottish Government.

1.5 Under the auspices of the Operating Framework, verifiers are expected to meet the requirements of the Building Standards Verification Performance Framework. This was first introduced in May 2012, updated in 2017, and most recently revised in 2021 to ensure consistency and quality of outputs and overall service, with a greater focus on peer review, benchmarking and best practice sharing.

1.6 The Performance Framework (2021) covers three perspectives:

  • Professional Expertise and Technical Processes;
  • Quality Customer Experience; and
  • Operational and Financial Efficiency.

1.7 There are three cross-cutting themes of:

  • Public Interest;
  • Continuous Improvement; and
  • Partnership Working.

1.8 To enable performance to be measured, the framework comprises nine Key Performance Outcomes (KPOs). Verifiers are re-appointed on the condition they continue to meet these requirements.

Developments in measuring the performance of verifiers

1.9 In 2015, the Scottish Government commissioned Pye Tait Consulting to evaluate the performance of local authorities in their role as verifiers, with the aim of informing Scottish Ministers in the lead-up to the next appointment of verifiers from May 2017. The evaluation identified various considerations, including the scope for a review and refresh of the performance framework.

1.10 In 2016, the Scottish Government completed this review in consultation with Local Authority Building Standards Scotland (LABSS) and with independent input from Pye Tait Consulting. The 32 local authorities were re-appointed on 1st May 2017 for varying lengths of time based on their prior performance, and those re-appointed for a shorter period were subsequently re-appointed for a further period from 1st May 2020. The new ‘Building Standards Performance Framework for Verifiers’ was also implemented from May 2017 and updated in April 2021. A full review of appointment periods is planned before 1st May 2023.

1.11 In early 2019, the Building Standards Futures Board was established to provide guidance and direction on developing and implementing recommendations made by the Review Panels on Compliance and Enforcement and Fire Safety. This followed notable failings found in the construction of Edinburgh school buildings and the tragic fire at Grenfell Tower in London. The Board’s programme of work aims to improve the performance, expertise, resilience and sustainability of the Scottish building standards system.

1.12 One such workstream was the 2021 review of the Operating and Performance Frameworks to assist verifiers in assessing their service against requirements. Reviewing and improving how customer feedback is collected and reported can help to ensure that a modernised, reliable and flexible solution can be found.

Measuring customer satisfaction - a timeline

1.13 Prior to the development of Scotland’s first national customer satisfaction survey for building standards in 2013-14, there was no single standard approach. Local authorities used a wide range of methods, including continuous or ad hoc paper-based surveys, online surveys and customer focus groups. This made it virtually impossible to measure service quality consistently through like-for-like questions and precluded any longitudinal analysis.

1.14 In 2013, the Scottish Government commissioned Pye Tait Consulting to develop the first national customer satisfaction survey, based on the need to obtain nationally consistent data on customer perceptions of their local authority building standards service. The survey was developed in late 2013 and early 2014, went live in spring 2014, and successfully obtained baseline data to feed into the customer experience indicators of the verification performance framework and permit future trend analysis.

1.15 Every year since then, the survey has measured and reported on performance at national, consortium and local authority levels, incorporating trend analysis to visually show how performance has moved over the preceding three years. This has been undertaken independently by Pye Tait Consulting.

1.16 Under the model used to date, local authority verifiers have collated and shared with Pye Tait the contact details of customers from the previous twelve months. An email with a link to the survey, followed by reminder messages, is then sent by Pye Tait. Results are subsequently analysed independently and reports produced at national, local authority and consortium levels.

1.17 The national survey aligns with KPO 4 of the performance framework – titled ‘Understand and respond to the customer experience’. The purpose of this KPO is for verifiers to monitor customer satisfaction with the building standards service and ensure it meets or exceeds customer expectations.

1.18 In recent years there have been several minor changes to the data collection process, questionnaire content, and how performance is benchmarked. For example, in 2015 Pye Tait Consulting was commissioned to review the existing question set and ensure it was fit for purpose. More recently, following the introduction of GDPR in 2018, it was determined that customers could reasonably expect to be surveyed by a third party acting as an arm’s-length extension of the Scottish Government, meaning higher volumes of customer email addresses could be made available without prior opt-ins or opt-outs being required. This led to an uplift in response volumes, although the response rate has generally remained around 15%.

1.19 In 2020, Acorn Learning was commissioned by the Scottish Government to undertake a review of the national customer satisfaction survey, with the overall aims of establishing the most effective approach to measuring customer satisfaction across Scotland and ensuring the survey is fit for the future to support the Performance Framework for Verifiers. A summary of their findings is provided below.

Strengths and limitations of the current approach

1.20 Data from the annual survey have been invaluable for achieving national consistency; enabling performance monitoring and benchmarking; feeding into Ministerial decision-making regarding the length of future appointments of verifiers; and contributing towards continuous improvement and best practice sharing at a local level. Acorn Learning’s research found that the current satisfaction survey is highly valued, that it plays a vital role in measuring service quality across Scotland, and that the ability to compare performance year-on-year is important. The research did, however, note that there is scope for continuous improvement in the way satisfaction is measured.

1.21 One limitation of the current approach to measuring customer satisfaction can be seen in the 2020 survey findings, which indicate that some customers may base their feedback on their most recent experience rather than on the application submitted within the designated timeframe. There is also a risk that agents (working on behalf of an applicant or submitter) who interact with multiple local authorities do not always differentiate between them.

1.22 Furthermore, there is concern that the response rate (15% in the 2020 customer survey) is low and could be improved. Acorn Learning’s report notes that efforts should be made to improve the range, number and rate of responses from customers, and that the question set should be shortened where possible (while retaining the ability to monitor longitudinal trends).

1.23 Additionally, Acorn Learning’s research to date for the Building Standards Division (BSD) identified a lack of real-time data gathering and suggests making best use of modern technologies, including 4G and 5G connectivity and mobile devices, to overcome this. To illustrate the issue, the overall customer satisfaction rating from the December 2020 national survey report will feed into quarterly performance returns from January 2021 until December 2021. This highlights a significant time lag between data collection and reporting by local authorities, which does not allow the impact or effectiveness of improvements to customer engagement strategies to be assessed in a timely manner.

Research aim and objectives

1.24 The aim of this research is to identify and propose a preferred model which the Scottish Government (Building Standards Division) could use to deliver the national customer survey for building standards. This future model should retain a standardised approach to measuring the customer experience across all 32 local authority verifiers to meet the requirements of KPO 4 of the national performance framework – ‘Understand and respond to the customer experience’. Key outputs will include a revised question set and a plan for how the survey should be deployed and managed.

1.25 Specific objectives of the research are to:

1. optimise the survey questions so customers are more likely to engage;

2. review and agree on any necessary changes to the core measures and metrics to capture and benchmark the customer experience as part of KPO 4;

3. consider a mechanism for more regular data collection (either scheduled or using a more dynamic on-going data collection and real-time reporting tool) to minimise the current time lag;

4. minimise the survey burden for verifiers and customers given that some local authorities still use their own mechanisms to obtain customer feedback;

5. ensure the future survey is accessible and easy to use for customers wishing to provide feedback from anywhere and on different types of device;

6. confirm who will have ownership and control of the survey in future years;

7. continue to make use of secure and robust data collection methods that comply with relevant data protection legislation and market and social research standards;

8. through the above actions, set the conditions for increasing survey response rates so that verifiers can place greater reliance on the survey findings.

Methodology

1.26 This research was split into two distinct phases. The first phase focused on gathering evidence to determine what options are available and feasible for measuring customer satisfaction in future (via the current or alternative models).

1.27 Evidence to inform the first phase was gathered through a workshop with the BSD Review Group of local authority verifiers and through six telephone interviews with wider stakeholders. Desk research was also undertaken to supplement this primary research, inform the development of the three options for a future model presented later in this report, and identify best practice from elsewhere. This desk research sought to understand:

  • what local authority verifiers in Scotland are currently doing to measure customer satisfaction locally, including good practice and limitations;
  • other approaches to measuring and reporting on the customer experience in organisations across the public, private and voluntary sectors, looking for innovative examples that yield fast and effective customer data; and
  • the range of survey software and data presentation options (including strengths, limitations and costs), covering who might run the survey, how often, mechanisms for reaching customers, device accessibility, and analysis, reporting and presentation considerations.

1.28 Following the development of the three potential future models, the second phase of the research presented these options in turn to local authority verifiers at a second BSD Review Group workshop, to discuss the relative (non-financial) merits and drawbacks of each option before considering what a preferred future model might look like. High-level views from wider stakeholders were also sought on the three options, and opinions and potential solutions were gathered to address limitations of the preferred model identified at the Review Group’s workshop.

This report brings together all findings from both phases to set out a preferred future model to evaluate the customer experience.

Report structure

1.29 Chapter 2 outlines the issues and considerations which prompted this review and incorporates the findings from the initial phase of the research, i.e. the first workshop with the BSD Review Group and the first set of stakeholder interviews, interweaving the findings from the desk research (including Acorn Learning’s review, where relevant).

1.30 Chapter 3 provides an outline of how each option would work in practice, along with their respective merits and drawbacks as noted by workshop attendees and stakeholders, and a conclusion is drawn on the preferred option. Indicative costings for each option are also provided.

1.31 Chapter 4 outlines key considerations in pursuing the preferred option and potential solutions that may help to mitigate these issues. It then sets out the next steps required to implement the preferred option, outlining the key decisions and actions which need to be taken and the core considerations to bear in mind when developing it further.

Contact

Email: simon.moore@gov.scot
