1.1 Background to the survey
The building standards system in Scotland was established under the Building (Scotland) Act 2003. The Act gives powers to Scottish Ministers to make building regulations, procedure regulations, fees regulations and other supporting legislation as necessary to fulfil the purposes of the Act. The purposes include setting building standards and dealing with dangerous and defective buildings.
The remit of the building standards system is to protect the public interest by setting out the standards to be met when building or conversion work takes place, to the extent necessary to meet the building regulations.
The standards are intended to:
- Secure the health, safety, welfare and convenience of persons in or about buildings and of others who may be affected by buildings or matters connected with buildings;
- Further the conservation of fuel and power; and
- Further the achievement of sustainable development.
The role of the building standards verifier is to protect the public interest by:
- Providing an independent check of applications for building warrants to construct buildings, provide services, fittings or equipment in buildings, or to convert buildings;
- Granting or refusing building warrants;
- Carrying out an independent check of construction activities through the process of reasonable inquiry; and
- Accepting or rejecting completion certificates.
Verifiers are appointed by Scottish Ministers and the Act provides for a variety of verifiers should they be required. At present, the only appointed verifiers are the 32 Scottish local authorities, each covering their own geographical area.
In 2011 Pye Tait Consulting, on behalf of the Scottish Government, developed a set of nine national Key Performance Outcomes (KPOs), implemented as part of the ‘Building Standards Verification Performance Framework’ launched on 1st May 2012. The KPOs were intended to ensure consistency and quality of outputs and overall service through more accurate and effective comparisons, with a greater focus on peer review, benchmarking and sharing of best practice, and to underpin a strong culture of continuous improvement.
In 2013/14 the Scottish Government commissioned Pye Tait Consulting to develop and run the first national customer satisfaction survey for building standards. This was based on the need to obtain nationally consistent data on customer perceptions of their local authority verifier building standards service. The first survey provided baseline data for trend analysis in subsequent years and was repeated in 2015 and each year since then.
1.2 Changes from May 2017
In 2015, the Scottish Government commissioned Pye Tait Consulting to evaluate the performance of local authorities in their role as verifiers, with an aim to inform Scottish Ministers in the lead-up to the next appointment of verifiers from May 2017. The evaluation identified various considerations including the scope for a review and refresh of the performance framework.
In 2016, the Scottish Government completed this review in consultation with Local Authority Building Standards Scotland (LABSS) and with independent input from Pye Tait Consulting. The 32 local authorities were re-appointed on 1st May 2017 for varying lengths of time based on their prior performance, and some were subsequently re-appointed for a further period from 1st May 2020. A full review of appointment periods is planned before 1st May 2023. The new ‘Building Standards Performance Framework for Verifiers’ was also implemented from May 2017.
Two of the seven new KPOs, categorised under ‘Quality Customer Experience’, aim to ensure that verifiers adhere to the commitments in the building standards customer charter and meet or exceed customer expectations. The 2020 survey aligns with KPO4 – titled ‘Understand and respond to the customer experience’. The purpose of this KPO is for local authority verifiers to monitor customer satisfaction with the building standards service and ensure it meets or exceeds customer expectations.
This report presents the findings from the 2020 national customer satisfaction survey.
The 2020 survey questionnaire was identical to the 2019 version (a copy is presented in Appendix 2).
The scope of the survey was all building standards customers between 1st April 2019 and 31st March 2020, defined as:
- Applicants for building warrants (including any agents);
- Submitters of completion certificates (including any agents); and
- Others that have interacted with the building standards service.
Local authorities supplied their customers’ contact details (name and email address only) to Pye Tait Consulting for the express purpose of inviting them to participate in the survey. On advice from the Information Commissioner’s Office (ICO) following GDPR coming into force in May 2018, the national customer satisfaction survey is conducted in the legitimate interests of the building standards system and its customers. However, some customers had opted out of being contacted for the purpose of this survey before GDPR, and local authorities checked whether these customers still wished to opt out of their details being shared with Pye Tait Consulting.
The survey opened on 28th September 2020 and closed on 3rd November 2020. It was hosted online and customers with email addresses were directly invited by Pye Tait Consulting to participate. Local authorities were also at liberty to promote the survey to their own customers (i.e. those within scope) as appropriate, with some promoting the survey via social media channels.
When completing the survey, customers were presented with a link relating to the specific local authority verifier to which their response related. Customers of multiple local authorities were presented with links for each local authority verifier of which they had been a customer and thus could complete the survey multiple times, once for each local authority verifier.
A note about the analysis
For most survey questions, the findings contained within this report have been cross-tabulated by type of customer (see Figure 1). It should be noted that the findings have not been subject to statistical tests to determine the significance of any apparent patterns and should therefore be treated as indicative. Percentages shown in charts and tables may not add up to precisely 100% due to the impact of rounding.
Certain charts in this report refer to a base number of ‘respondents’ (meaning total customers answering that particular survey question) and others refer to a base number of ‘responses’ (total boxes ticked for survey questions where customers could choose more than one answer).
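The two points above can be illustrated with a minimal sketch using hypothetical figures (not taken from the survey data): rounding each percentage to a whole number means the total need not equal exactly 100%, and for multiple-choice questions the base is the number of responses (boxes ticked) rather than the number of respondents.

```python
# Hypothetical single-choice question answered by 3 respondents, one per option.
single_counts = {"Agent": 1, "Applicant": 1, "Other": 1}
base_respondents = sum(single_counts.values())
rounded = {k: round(100 * v / base_respondents) for k, v in single_counts.items()}
print(rounded)                 # each option shows 33%
print(sum(rounded.values()))   # 99 -- rounded percentages do not sum to 100%

# Hypothetical multiple-choice question: each respondent may tick several
# boxes, so the base is the total number of responses, not respondents.
ticks_per_respondent = [2, 1, 3, 1]   # 4 respondents
base_responses = sum(ticks_per_respondent)
print(base_responses)          # 7 responses from 4 respondents
```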