
Building standards - operating and performance frameworks for verifiers: research summary

Summary of iterative research and development work undertaken by Pye Tait Consulting.


5. Existing KPOs – specific considerations

The following observations and considerations have emerged from a review of documentary evidence, interviews with Review Group members, and online feedback from local authority verifiers across Scotland.

It should be noted that the forward considerations are not ‘accepted’ positions, but offer a menu of options to take to the workshops with verifiers in November 2024 ahead of working towards more concrete textual changes to the two framework documents.

5.1 KPO 1 considerations

Current title: ‘Minimise time taken to issue a first report or to issue a building warrant or amendment to building warrant’

KPO 1 starting position and issues

The original rationale for KPO 1’s focus on ‘time taken’ was due to this being especially important to customers. Review Group members also acknowledge that maintaining consistency and predictability across Scotland is important for projects of similar complexity.

However, the Grenfell Phase 2 Inquiry (discussed in section 3.2) presents a case for revisiting the focus on timings. Furthermore, there is some concern among Review Group members and wider verifiers that KPO 1 performance targets appear to assume that building projects are “off the shelf” and “predictable”.

While the 20-day target timescale to issue a building warrant is generally considered suitable for simpler projects, it is not viewed as a good fit for more complex projects, where the time spent reviewing building warrant applications depends on the level of complexity and risk. It is felt that this should be reflected in the performance targets.

Staff resourcing concerns, training and wider work activities can reportedly hinder verifiers’ ability to meet performance timescales, which they say can be masked in the reporting.

It is also noteworthy that a mismatch currently exists between the statutory timescales (the ‘backstop’) for issuing a building warrant and the PF target timescales. Whilst verifiers generally accept the PF, they could in theory revert to the backstop position as defined in legislation.

Taking into account the findings of the Grenfell Phase 2 Inquiry and with compliance at the forefront, it is considered important that verifiers are empowered to state to applicants or their agents how long is realistically needed to process a building warrant application.

“There’s pressure to deliver things within timeframes – I don’t think that makes for a good system… The government set up a system for online applications and made no provision for local authorities to help them adjust.”

Local authority verifier

KPO 1 forward considerations from Review Group member interviews and the online feedback tool (stage 1)

  • Adjusting performance targets to refer to the percentage of building warrants issued within their individual target date to ensure flexibility for more complex applications, including HRBs.
  • Defining simple and complex projects using a “sliding scale of target dates” based on particular criteria (one suggestion was that 35 days to issue a first response for an HRB might be appropriate).
  • Extending the target days to account for complex/higher-risk projects, e.g. the 20-day target could be changed to 25 or 30 days, and the 10-day target could be changed to 15 or 25 days.
  • Adding sub-categories for the ‘over £1 million’ value of work band.
  • Additional reporting from verifiers where projects take longer than originally expected, especially high-profile projects where there is strong public interest.
  • Given that not all verifiers are utilising customer agreements: extending guidance, potentially introducing a time limit (“up to six weeks” was suggested), and/or changing the language, since the customer may not be in ‘agreement’ even where the verifier deems a particular timescale appropriate due to the project’s complexity.
  • NB: May be necessary to consider any implications of removing timescales from KPO 1 for the compliance capacity calculator (referenced in relation to KPO 5, below).
  • Providing clarity around what counts as ‘Day 0’ and ‘Day 1’ as this can be interpreted differently (an issue identified in Pye Tait Consulting’s earlier Data Standards research for BSD).
  • Reporting the time taken for the applicant to respond to the first report, as this would provide a better understanding of verifier performance.
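The first bullet above implies a shift from a single blanket timescale to a per-application measure. Purely as an illustrative sketch – the record fields and dates below are hypothetical, not part of the framework – such a metric could be computed along these lines:

```python
from datetime import date

# Hypothetical warrant records: each application carries its own target
# date, which may differ for 'standard' vs 'complex' (e.g. HRB) projects.
applications = [
    {"ref": "BW-001", "target": date(2024, 3, 1), "issued": date(2024, 2, 27)},
    {"ref": "BW-002", "target": date(2024, 3, 5), "issued": date(2024, 3, 9)},
    {"ref": "BW-003", "target": date(2024, 4, 1), "issued": date(2024, 3, 30)},
]

# Percentage of building warrants issued within their individual target date.
within = sum(1 for a in applications if a["issued"] <= a["target"])
pct_within_target = 100 * within / len(applications)
print(f"{within}/{len(applications)} within target ({pct_within_target:.0f}%)")
```

Because each application is judged against its own target, a complex application with a longer agreed timescale no longer counts as a failure simply for exceeding the standard 20-day figure.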

KPO 1 key takeaways from the workshops (x2) with local authority verifiers (stage 2)

  • General agreement that more complex/high risk applications need additional time in the interests of compliance and should not be subject to blanket KPO timescales.
  • Question raised about how to define ‘complex’ applications, with general agreement that verifier judgment should determine this, particularly in light of the Grenfell Phase 2 Inquiry report’s finding that the verifier role needs to reflect its regulatory function more strongly relative to being a service provider.
  • Mixed practice at present between local authorities in the extent to which customer agreements (CAs) are used for more complex applications.
  • One verifier said their threshold for CAs is any application over £250,000, and that they will agree an extension to the 10 days with the customer.
  • Suggestion to have complex applications (akin to CAs) as a separate category to standard applications and to measure verifiers’ performance against what they said they would achieve for those applications.
  • Suggestion that it would be useful for those using CAs to share best practice.
  • Whilst the current targets for standard applications are considered acceptable for most applications, they depend on resources, the speed of applicant/agent responses to queries and the quality of the application, and can lead to safety aspects being sacrificed for more complex applications – not only high-risk buildings (HRBs) but others, e.g. some conversions – so participants would like to see some modest relaxations.
  • At the same time, there is recognition that timings should not be relaxed so far as to risk complacency, given the considerable work to date on performance to meet customer expectations.
  • General view that performance target 1.1 should be changed either from 95% to 90% or from 20 days to 25 days, while performance target 1.2 of 10 days should be relaxed.

KPO 1 proposed changes based on the workshops (x2) with local authority verifiers (stage 2, informing the Review Group workshop at stage 3)

1. Adjust performance targets to measure the number of first reports and building warrants issued within the target date – in two categories (wording TBC):

  • ‘Standard’ applications (see revised performance targets below)
  • ‘Complex’ applications (days set by verifier on a case-by-case basis)

2. For ‘standard’ applications:

  • Extend the 20-day performance target to 25 days
  • Extend the 10-day target to 15 days

3. Change the language ‘customer agreements’ to something more suitable as not all customers may be de facto in agreement (wording TBC).

4. Additionally, report the time taken for the applicant to respond to the first report, as this would provide a fuller picture of factors influencing verifier performance.

5. Additionally, report contextual information where total time taken by the verifier is longer than expected (criteria TBC).

6. Guidance needed on the rationale for the two categories set out above.

7. Guidance needed on what counts as ‘Day 0’ or ‘Day 1’.

5.2 KPO 2 considerations

Current title: ‘Increase quality of compliance assessment during the construction processes’

KPO 2 starting position and issues

Given the increased focus on compliance, it is important that KPO 2 is appropriately moulded going forward.

Although a small minority of verifiers feel that the current KPO 2 processes are working well, most would like to see this reviewed and improved.

A conceptual challenge for verifiers is that the KPO currently relies on actions outside verifiers’ control, notably the “perennial issue” of verifiers not being notified about construction activity in line with Construction Compliance and Notification Plans (CCNPs). This led some verifiers to express concerns about not being able to resource more on-site inspections and that remote verification inspection (RVI) is not necessarily as robust (one verifier suggested it could lead to “falsification”).

Linked to this, there are known to be inconsistencies between local authorities in the extent to which CCNPs are used and what is set out within them, as well as inconsistencies in KPO reporting (see below).

Extract from BSD’s ‘Data Standards in Building Standards’ report (Pye Tait Consulting, 2022)

Of widespread concern is KPO2 and how to report fulfilment of Construction Compliance and Notification Plan (CCNP) actions with dates[2]. The Uniform system records a list of CCNP actions for each application. There are also two columns with dates – one is date notified (i.e. by the applicant/agent) and the other is the verifier check date. There may be several CCNP actions so the verifier would expect the applicant to notify the verifier on these dates, e.g. commencement of work, and the verifier to enter this information along with the date and check date.

However, in some cases the applicant may fail to notify the verifier, especially on commencement. That means the local authority doesn’t inspect the site and doesn’t complete the notified date and check date. But on a subsequent visit, an inspector might see that an action has not been checked. If the officer is satisfied after inspection that there are no issues, there appear to be variations as to whether or not the verifier retrospectively fills in the check date for the missed action, resulting in BSD getting inconsistent and potentially inaccurate data.

One verifier raised the question as to what should constitute a valid notification and a valid response to a notification for recording purposes. Separately, several verifiers called for greater transparency over reporting protocols and the opportunity to record why a CCNP action/check has failed.

“KPO2 is a major bugbear. Some verifiers will complete the missing date field and some will leave it blank. Others might say they cannot issue a completion certificate acceptance if something wasn’t done.”

Local authority verifier

Review Group members broadly agree that this KPO should not include performance targets due to: i) the inappropriateness of setting a target that does not support full compliance; and ii) the lack of a clear quantitative baseline on instances of non-compliance. In the absence of performance targets, it is important to BSD that the focus on compliance is not lost.

KPO 2 forward considerations from Review Group member interviews and the online feedback tool (stage 1)

  • Keeping KPO 2 “simple”, i.e. the successful achievement of compliant building work rather than necessarily the timescales involved.
  • Focusing on what is within the verifier’s control, e.g. total compliance actions and how many were completed/not completed as planned, thus showing the extent to which verifiers fulfil their stated commitments.
  • Reporting the number of physical versus remote inspections, and time on site.
  • Ensuring KPO reporting is proportionate to risk, i.e. more simplified reporting for lower risk projects.
  • Taking into account the work of the Compliance Capacity Working Group, which is trialling time recording methods to report on increased compliance activity and ensuring fees are reinvested.
  • In the absence of CCNPs being made mandatory (which would have legislative implications), providing more guidance relating to CCNP reporting should this be retained, including what is meant by “fully achieved”.

KPO 2 key takeaways from the workshops (x2) with local authority verifiers (stage 2)

  • General buy-in to the idea of keeping KPO 2 “simple”, i.e. the successful achievement of compliant buildings.
  • Recognition that the key focus should be on measuring identified instances of non-compliance, reasons for CCNP failure and corrections needed.
  • General agreement that setting of performance targets for KPO 2 is not appropriate given achievement of compliance should be sought for all buildings.
  • General view that the KPO reporting should focus on what is within the verifier’s control (data reporting point 2.4) since if verifiers are not notified in line with the CCNP, it is not clear what they can do.
  • View that performance should only be measured for cases where verifiers are notified in line with the CCNP (similarly, a view that data reporting point 2.2 “goes out the window” when verifiers are not notified in line with the CCNP).
  • However, several verifiers believe reporting should highlight industry failures, i.e. CCNPs not responded to should be reported as a red flag.
  • Lack of current clarity on what is meant by a CCNP being “fully achieved”, leading to inconsistencies in reporting, e.g.
    • some inspectors might retrospectively complete a checked date after finding no issues, even if the verifier was not originally notified in line with the CCNP
    • some verifiers refuse to report non-notification at key stages and retrospectively amend the data to avoid this counting against them.
  • General view that it is difficult to propose changes to KPO 2 until the outcomes of the Futures Board compliance plan are fully known (e.g. the role of the CPM for HRBs and implications for CCNPs); as such, there is a view that KPO 2 may need to be revisited again.
  • Wider point: concern that compliance issues are an endemic industry problem and that customers need to be better educated on responding to the CCNP and why it is important, or that CCNPs should be made a mandatory part of the process.

KPO 2 proposed changes based on the workshops (x2) with local authority verifiers (stage 2, informing the Review Group workshop at stage 3)

1. Reword the purpose to make clear that the focus of the KPO is on achieving compliant buildings.

2. Focus reporting on what is within the verifier’s control; therefore drop 2.2 and 2.3 in the data reporting to ensure only the verifier’s performance is measured.

3. Additionally, report on the number of identified instances of non-compliance and resulting actions, e.g. via:

  • total inspections (possibly break down by in-person and RVI; suggestion to include time on site)
  • total compliance failures identified from Reasonable Inquiry (possibly by category – definitions TBC)
  • total corrective actions and nature of these (categories TBC).

4. To discuss with Review Group: should there be simplified reporting for lower risk projects? If so, how to define/categorise?

5. Clearer guidance needed on what is meant by CCNPs being “fully achieved”.

6. Possible guidance needed on time recording based on work undertaken by the Compliance Capacity Working Group.

5.3 KPO 3 considerations

Current title: Commit to the building standards customer charter

KPO 3 starting position and issues

Review Group members are generally of the view that this KPO works well, as are all verifiers responding to the online feedback tool. However, a minority of Review Group members feel it could be removed, since it raises the following questions and issues:

  • Should a customer charter be required for building standards if the local authority already has an overarching charter in place?
  • Would it be better to recognise standards that go above and beyond, such as Customer Service Excellence (CSE)?
  • Are customers actively interested in seeing a customer charter?
  • Is technology moving beyond the need for a customer charter, for example use of chat bots and other interfaces that signpost important information to customers?

KPO 3 forward considerations from Review Group member interviews and the online feedback tool (stage 1)

  • If maintaining KPO 3, reviewing the template to ensure it is still fit for purpose.
  • Potentially merging with KPO 4 (being mindful that customers should have access to consistent information and with signposting to the local authority’s wider charter) or KPO 7.

KPO 3 key takeaways from the workshops (x2) with local authority verifiers (stage 2)

  • Whilst there is general acceptance that the KPO works well, questions were raised about its continued appropriateness, namely:
    • whether a customer charter is needed if the local authority already has an overarching charter in place
    • whether customers are actively interested in seeing a customer charter
    • whether technology is moving beyond the need for a customer charter, e.g. use of chat bots and other interfaces that signpost important information to customers.
  • Workshop participants noted that the customer charter is something of an outdated concept, does not recognise verifiers going the extra mile (e.g. having CSE, which is in fact covered in KPO 4), and that local authorities may have their own charter.
  • One verifier mentioned that they have recorded no website hits on the charter in five years and another only two hits.
  • Whilst some verifiers are content with KPO 3, there is a general sense among workshop participants that it is not needed as a standalone KPO and could either be removed or merged into KPO 4 or KPO 7 (most deeming the latter more suitable) as a commitment to maintaining the charter.

KPO 3 proposed changes based on the workshops (x2) with local authority verifiers (stage 2, informing the Review Group workshop at stage 3)

1. Option 1: Remove KPO 3 and incorporate the core requirement into KPO 7 (wording for the revised core requirement under KPO 7 TBC). Then potentially move the detailed contents of KPO 3 into a PF annex, i.e. providing guidance on what is needed for the customer charter so this is not lost.

2. Option 2: Keep KPO 3 unchanged.

5.4 KPO 4 considerations

Current title: Understand and respond to the customer experience

KPO 4 starting position and issues

The introduction of a national customer satisfaction survey for building standards in 2013/14 sought to establish a nationally consistent and impartial approach to measuring customer satisfaction across Scotland.

Most Review Group members are content with the current iteration of this KPO in monitoring quality of the customer experience. There are no reported concerns about the current target of 7.5 out of 10 for overall satisfaction and one verifier responding to the online feedback tool notes that the ability to remove “erroneous” feedback is well received.

However, there is some concern that low response rates from the survey can affect the usefulness of feedback and that local authorities then have a burden of chasing feedback. It is also noted that feedback is reportedly less common away from the “extremes”, i.e. it is felt that most customers who are generally content do not provide feedback, while negative feedback can disproportionately affect the scores of smaller local authorities with fewer responses.

“The main problem is that it’s set up as a national survey. I think within the last three months we’ve had two responses. That’s not really true feedback. We tend to get our own feedback from agent or architect events. KPO4 needs to be looked at, it’s very limited at the moment.”

Local authority verifier

It should be noted that many of the above issues have been raised previously; however, the value of the national survey and the transparency of reporting have historically outweighed concerns about response rates.

Several Review Group members emphasised the issues discussed in section 3.2 about the importance of focusing more strongly on verifiers as regulators and bastions of compliance rather than providing a service.

A minority suggested that this KPO should no longer be included.

KPO 4 forward considerations from Review Group member interviews and the online feedback tool (stage 1)

  • Above points to be discussed further at the forthcoming workshops.
  • Reporting could be based on more recent survey response data in order to be as up to date as possible.

KPO 4 key takeaways from the workshops (x2) with local authority verifiers (stage 2)

  • General agreement that KPO 4 is valid and should remain – the main issues raised relate to underlying conceptual issues with the survey mechanism, outside the scope of the PF review.
  • No notable concerns raised with the performance target as currently set.

Key points raised (though these would not appear to directly affect the KPO)

  • Response rates are small across some (especially smaller) local authorities.
  • Customers are more likely to respond to the survey at the extreme ends of the scale.
  • Where response rates are low, a negative response can have a more adverse effect.
  • Some negative responses can relate to another department, e.g. planning, or be down to the decision made on their application rather than the customer experience itself (although this is intended to be mitigated through survey question wording).
  • Promoting the survey can be challenging/burdensome.
  • View that measuring complaints (e.g. compliance-related) may be a better approach as complaints can be upheld or rejected.
  • One verifier questioned whether a target was necessary.

KPO 4 proposed changes based on the workshops (x2) with local authority verifiers (stage 2, informing the Review Group workshop at stage 3)

1. No clear proposed changes (issues raised relate to wider questions about the delivery and robustness of the survey itself, which are outside the scope of this review).

2. Introduce a measure relating to number of complaints received and upheld relating to compliance.

5.5 KPO 5 considerations

Current title: Maintain financial governance

KPO 5 starting position and issues

Review Group members largely agree that maintaining financial governance is a critical part of a verifier’s overall performance and a positive aspect of the framework, specifically:

  • KPO 5 is a useful tool to show reinvestment within the system.
  • Reporting is generally manageable.
  • It is important to be sure that the costs of running each local service are covered.
  • Good financial governance is a necessary requirement of any public-funded body.

Building warrant fee increases from April 2024 are intended to support building standards services to prepare for a growing compliance role over the coming years, though Review Group members pointed out that this money is not always reinvested into the local building standards service as hoped, and some local authorities redistribute it elsewhere.

“We’re currently seeing a quarter of our fees sucked into a black hole. BSD wants more staff in and to deal better with compliance and control – but that isn’t always straightforward.”

Local authority verifier

Additional monitoring and reporting of building warrant fee reinvestment will be required to support Ministerial decisions on increasing fees in subsequent years.

BSD and Review Group members feel the main goal of KPO 5 going forward should be to establish whether there is sufficient money to deliver building standards services with a strong compliance focus. A shorter-term goal would be to establish the gap between what is currently happening and what needs to be done.

The idea is that additional income from building warrant fees could be invested in staff, training or technology so that teams can undertake more Reasonable Inquiry during the post-building warrant period, strengthen compliance and strengthen the building standards system as a whole.

There is some concern about the complexity of reporting, with too much granularity leading to a greater risk of subjectivity and irregularity.

BSD has developed a Compliance Capacity Calculator to assist verifiers in identifying any compliance capacity gaps within their teams, and it is felt that the SBSH can play a key role in supporting those working within the service.

Several Review Group members noted that the current process of time recording is leading to “best guesses” and potential inconsistencies and that a new system of time recording would lead to greater efficiencies.

It is understood that ongoing work is looking at a national time recording scheme which could be used to evidence outcomes that are supported by an increase in fees and combined with increased auditing of local authorities. It is also understood that SBSH is/will be surveying verifiers regarding time recording, looking at possible concerns and implications.

Two verifiers responding to the online survey consider it unfair to use RAG ratings to measure fee income against the costs of running the service since they feel income is outside of their control. One mentioned that cost reporting can be difficult where a full breakdown is not maintained locally.

Extract from BSD’s ‘Data Standards in Building Standards’ report (Pye Tait Consulting, 2022)

With respect to gathering and reporting financial information for KPO5, there would appear to be variations in how verifiers determine which overheads and costs should be associated with the building standards service. To ensure consistency in assessing the costs of works, there is a call for a national, up-to-date model. Verifiers could be using different or outdated price indices. Since the price of materials has increased considerably in recent years, this could have a big effect on the estimated cost of works for applications.

“There will be inconsistencies on how staffing costs are calculated. Should the head of service, executive director or CEO (with high salaries) be included if they cover other duties besides verification? We don’t include those, but some will. That could make it look like you have more staff than you really have and less of a surplus in your fees. If you want to keep the wolves from your door it’s better not to show a surplus in your fees hence why some local authorities will interpret the data to best suit their service.”

Local authority

KPO 5 forward considerations from Review Group member interviews and the online feedback tool (stage 1)

  • Through changes to the KPO, helping to ensure that necessary resources including fees, employees, IT and other infrastructure are being invested into the building standards service.
  • Ensuring that investment of fees back into the service is measured effectively and with sufficient detail to report to Ministers.
  • Defining a clear means of calculating fee income relative to reinvestment.
  • Recognising surplus income using RAG criteria (e.g. where verifiers are recording over 130%), so that if a verifier is recording a surplus but performance is lower than expected, this can be identified as a cause for concern.
  • Gathering information from verifiers (as part of their annual report) about how they plan to reinvest going forward if they have a surplus, to inform future planning.
  • Providing clearer guidance in areas such as: i) what is included as part of ‘staff costs’; ii) what constitutes an ‘inspection’ (physical, virtual, monitoring paperwork); iii) the cost of enforcement; iv) how to account for costs associated with training, mentoring, new technologies etc.
  • Not scoring the KPO, since fee income levels are deemed outside the control of the verifier. (*)

* It should be noted that KPO 5 is not currently scored and that fee income is outside the control of verifiers in terms of construction activity, but as it is the local authority that is appointed as a verifier, the use and allocation of available fees is wholly within its control.

KPO 5 key takeaways from the workshops (x2) with local authority verifiers (stage 2)

  • It is noted that additional monitoring and reporting of building warrant fee reinvestment will be required to support Ministerial decisions on increasing fees in subsequent years.
  • General concerns raised that verifiers are unable to fully retain building warrant fees within their service.
  • Costs could be more fully and clearly defined, spanning different cost categories such as training, CPD, mentoring.
  • Costs might potentially include duties outwith verification, including the cost of enforcement, licences etc., where this has to come from the same budget to cover the reduction in the grant aided expenditure (GAE) allocation to the local authority.
  • Clarity needed on genuine reinvestment versus inflationary consequences. For example, salaries may increase but income remains the same.
  • RAG criteria could be adjusted to capture performance and flag issues, e.g. it can raise eyebrows if fee income >130% of costs but a local authority is underperforming.
  • Concern that the BSD’s focus on reinvestment is at odds with local authorities suffering a “funding crisis” with performance taking a hit.
  • Concern that a focus on investing in training staff can have a paradoxical effect, i.e. pushing performance down.
  • Observation that the current ‘requirements of verifiers’ for this KPO place emphasis on efficiency savings but say nothing about reinvestment – that emphasis should be reversed.
  • View that time recording could be useful but concern at present about major variations in the granularity of reporting.
  • General agreement that analysis of data should take into account the wider picture, e.g. surplus income may be received in one period and reinvested in a future period, which could make the later period look like underperformance.

KPO 5 proposed changes based on the workshops (x2) with local authority verifiers (stage 2, informing the Review Group workshop at stage 3)

1. Reword the purpose to make clear that the main goal of KPO 5 is to ensure sufficient money to deliver building standards services with a strong compliance focus and establish the extent to which that is currently happening in practice.

2. Raise the profile of “fee reinvestment” within the wording of this KPO (including “requirements of verifiers” where currently not mentioned) and reduce emphasis on efficiency savings.

3. Consider how RAG criteria could be adjusted in relation to aspects of performance to highlight possible issues, e.g. fee income >130% costs but underperforming in other areas could raise questions (Review Group input needed here).

4. Include clear guidance (Review Group input needed) on:

  • what is included as part of ‘staff costs’
  • what constitutes an ‘inspection’ (physical, virtual, monitoring paperwork)
  • calculating enforcement costs
  • how to account for costs associated with training, CPD, mentoring, new technologies etc.

5. Additionally, report how verifiers plan to reinvest if they have a surplus (alternatively within KPO 7 – TBC).
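Proposed change 3 amounts to a simple decision rule combining the fee-income-to-cost ratio with other performance signals. As a hedged sketch only – the 130% figure comes from the feedback above, but the categories, the second threshold and the `performance_ok` flag are hypothetical simplifications, not a proposal from the framework – it might look like:

```python
def rag_rating(fee_income: float, costs: float, performance_ok: bool) -> str:
    """Illustrative RAG check for KPO 5 (thresholds simplified).

    A large surplus (fee income above 130% of costs) combined with
    underperformance elsewhere is flagged as a cause for concern.
    """
    ratio = fee_income / costs
    if ratio > 1.30 and not performance_ok:
        return "red"    # surplus recorded but performance lagging
    if ratio < 1.0:
        return "amber"  # fees not covering the cost of the service
    return "green"

# Fee income at 140% of costs while underperforming raises a flag.
print(rag_rating(fee_income=1_400_000, costs=1_000_000, performance_ok=False))
```

The point of such a rule is the combination: a surplus on its own is not a problem, but a surplus alongside weak performance suggests fees are not being reinvested in the service.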

5.6 KPO 6 considerations

Current wording: Commit to digital services

KPO 6 starting position and issues

Most Review Group members and verifiers responding to the online feedback tool feel that this KPO can be reasonably removed as eBuilding Standards has now helped to embed digital services as a business-as-usual approach.

Three verifiers mentioned that continuous development of digital systems and platforms is essential so that they remain up to date and fit for purpose.

One verifier mentioned that KPO6 should be retained as it represents a commitment to “keeping up with the times.”

Another feels there should be greater transparency around the number of electronic building warrant submissions.

KPO 6 forward considerations from Review Group member interviews and the online feedback tool (stage 1)

  • Above points to be discussed further at the forthcoming workshops.

KPO 6 key takeaways from the workshops (x2) with local authority verifiers (stage 2)

  • Consensus for this KPO to be removed.
  • Consideration should be given to reflecting commitment to digital services elsewhere.

KPO 6 proposed changes based on the workshops (x2) with local authority verifiers (stage 2, informing the Review Group workshop at stage 3)

1. Remove.

2. Consider wording tweaks to ‘digital services’ section of PF on page 7 and/or moving this into ‘continuous improvement’.

5.7 KPO 7 considerations

Current wording: Commit to objectives outlined in the annual verification performance report

KPO 7 starting position and issues

Most Review Group members, along with most verifiers responding to the online feedback tool, consider KPO 7 to be fit for purpose. They see the annual report as a demonstration of commitment to quality, helping them to reflect, plan ahead and shape their services.

A minority feel that KPO 7 is not necessary or is picked up through other reporting.

KPO 7 forward considerations from Review Group member interviews and the online feedback tool (stage 1)

  • Consider merging with KPO 3 due to perceived overlap.

KPO 7 key takeaways from the workshops (x2) with local authority verifiers (stage 2)

  • General agreement that KPO 3 should be merged within KPO 7.
  • Suggestion that the annual verification performance report could include the customer charter, i.e. showing “what we should be doing” (charter) and “what we are doing” (annual report).
  • Suggestion from one workshop group that annual performance reports should be signed off by a senior member of the local authority to demonstrate joined-up commitment and elevate the importance of the verifier role.

KPO 7 proposed changes based on the workshops (x2) with local authority verifiers (stage 2, informing the Review Group workshop at stage 3)

1. Incorporate the customer charter requirement into the annual performance report requirements.

2. Annual performance report to be signed off by a senior executive within the local authority so they are aware of commitments (to be discussed).

5.8 Wider observations/considerations

  • Place more focus on compliance, the regulatory function and competence – which cannot be compromised – and less focus on performance targets.
  • Consider the frequency of reporting in general to minimise the burden on verifiers, e.g. reducing from quarterly to six-monthly.
  • Ensure that data being asked for has a clear purpose and use – especially at a granular level, such as splitting into types of application and bands for KPO 1.
  • Ensure the PF, including guidance notes, is as clear as possible to minimise the risk of requirements being interpreted in different ways.

5.9 Summary of KPOs that could be removed or merged

Feedback suggests:

  • KPO 6 to be removed.
  • KPO 3 to be merged with KPO 7.

Contact

Email: buildingstandards@gov.scot
