Scottish household survey 2017: consultation responses analysis

An analysis of responses to the Scottish Household Survey 2017 and beyond consultation.

Executive Summary


As part of Scotland's Spending Plans and Budget for 2016/17, the Scottish Government is required to make savings on the Scottish Household Survey (SHS) 2017. Savings of the level sought cannot be achieved without significant changes to the design of the SHS.

The Scottish Government sought users and stakeholders' views on two alternative options over the period 15 March to 19 April 2016. The two options were:

  • Option A: Biennial topics, i.e. halving the number of topics covered by the survey each year and collecting data on each topic every second year, with a small reduction in sample size (from 10,700 to 10,100).
  • Option B: Reduction of the overall survey sample size by around a third, from 10,700 to 7,450, with a small reduction in the topics covered by the survey.

Ninety-nine respondents from a range of different sectors responded to the SHS consultation. Local government and the third sector formed the largest share of respondents (nearly 70 per cent combined). The share of responses across sectors was as follows:

  • Central government lead analysts - 15 per cent
  • Local government - 34 per cent
  • Other public sector including NHS and Parliament - 13 per cent
  • Third sector (including HE/FE) - 35 per cent
  • Other (including students) - 2 per cent

Use of the SHS

Respondents from all sectors reported using SHS data for a wide variety of purposes, including to:

  • Develop and inform policy and strategies (around one in three respondents)
  • Monitor and benchmark the performance of strategies, policies, programmes or service delivery (around one in three respondents)
  • Plan services and target spending, including identifying need (around one in five respondents)
  • Equalities analysis
  • Research

The SHS is used as a major source of data in five formal outcome and performance monitoring frameworks used by public and third sector respondents, as well as in individual local authority (LA) Single Outcome Agreements (SOAs) and Partnership Improvement Plans (PIPs).

The formal frameworks included the:

  • National Performance Framework - ten SHS indicators, nearly one in five of the total;
  • Local Government Benchmarking Framework (LGBF) - eight SHS indicators;
  • Child Poverty Measurement Framework - ten SHS indicators, over a quarter of the total;
  • Housing and Regeneration Outcome Indicators - 15 SHS indicators, half of the total;
  • Active Scotland Outcomes Framework - ten SHS indicators, over half of the total.

The SHS also forms a significant input into other major analytical outputs that are used across sectors, including the National Records of Scotland's household projections and NHS Health Scotland/ScotPHO public health community profiles. The former are used by local authorities to determine need and allocate housing budgets, whilst the ScotPHO profiles are used to understand and monitor public health, including inequalities.

In terms of topic use, every survey topic was used to some degree by all of the sectors. The most used topics were the key characteristics and the health and disability sections of the household and random adult surveys, each used by around half of respondents. The recycling and climate change questions were the least used, but were still used by over 20 per cent of respondents.

Over half of respondents noted that there were no alternative data sources to the SHS. The topics listed as having no alternative sources included the travel diary, fuel poverty and energy efficiency, cultural attendance and participation, and discrimination and harassment, amongst others.

The Convention of Scottish Local Authorities (COSLA), the Society of Local Authority Chief Executives (SOLACE), the Local Government Benchmarking Framework Board, the Improvement Service and the Accounts Commission, plus several local authorities themselves, felt strongly that there was no reliable alternative to the SHS for providing consistent comparative data across local authorities, particularly the satisfaction with services data for the Local Government Benchmarking Framework (LGBF). The statutory nature of the latter was noted.

Of those that noted alternative data sources, most stated that none of the alternatives fully met their needs in the same way as the SHS. For example, UK sources did not have a high enough sample size to meet the demand for sub-Scotland level analysis such as equalities. The Census was cited as an alternative, but its data quickly become out of date due to the ten-year cycle. Some respondents identified alternatives that were themselves based on SHS data, such as the National Records of Scotland's (NRS) household projections.

Locally collected data, such as citizen panels, user surveys or local house condition surveys, were mentioned by around four in ten local government responses.

Views on options for 2017

46 per cent of respondents preferred option A (biennial topics), whilst 39 per cent preferred option B (a cut in sample size). 15 per cent did not select a preference: 9 per cent of all respondents specifically stated they preferred neither option, whilst 6 per cent did not answer the question.

There were some differences between sectors in option preferences. Local government respondents who expressed a preference were split equally between option A and option B, at 35 per cent each (21 per cent stated neither option and 9 per cent did not answer). 53 per cent of central government respondents preferred option B, compared to 40 per cent for option A. The position was reversed for the third sector, with 57 per cent preferring option A and 34 per cent option B.

Option A

The main reason given by those who preferred option A was the maintenance of the high sample size and therefore the 'robustness' of the data, which was seen as more important than having data on an annual basis.

Respondents wanted to maintain a high sample size in order to preserve the current precision of national and local authority level estimates, the ability to measure rarely occurring characteristics (e.g. volunteering) in a robust way, and the ability to undertake specific sub-group analysis such as equalities analysis.

Other reasons for preferring option A included: no loss of topic or question coverage; simpler analysis of performance and of change over time, especially for local authority level data, compared to option B; and the slow rate of change in some figures over time in any case.

In terms of the impact of option A, respondents highlighted the effect on performance monitoring frameworks, particularly the NPF, the LGBF and SOAs. Several local government respondents felt very strongly about the loss of annual data for the LGBF in particular. However, a few local government respondents were more sanguine, noting that option A was a reversion to the pre-2012 situation when local authority data was only available on a biennial basis.

Other issues noted with option A included a negative impact on the ability to assess and evaluate the impact of particular policies. This was due to the lack of a corresponding baseline and first year for measuring any change.

Several respondents also raised issues with having to combine non-consecutive years' worth of data, and with the lower sample size achieved over a two-year period compared to option B (10,100 households compared to 15,000 under option B). This was a particular issue for national level equalities analysis; for the data being considered as a successor to the SG's housing SCORE data for social housing tenants; and for adaptations to support independent living and transport modelling and planning. The latter two would require pooling three years' worth of data over a six-year non-consecutive period.
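As a rough illustration of the precision trade-off behind these sample sizes, the worst-case 95 per cent margin of error for an estimated proportion can be sketched with the standard simple-random-sample formula. This is a simplified sketch only: actual SHS estimates use design weights and clustering adjustments, so real confidence intervals will differ.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion p estimated
    from a simple random sample of n households (worst case at p=0.5)."""
    return z * math.sqrt(p * (1 - p) / n)

# Two-year pooled household samples cited in the consultation:
# option A yields one year of data per topic (10,100 households),
# option B yields two years at 7,450, i.e. around 15,000 households.
for label, n in [("Option A", 10_100), ("Option B", 15_000)]:
    print(f"{label} (n={n:,}): ±{margin_of_error(n) * 100:.1f} percentage points")
```

Under these assumptions a national-level proportion is measured to within roughly ±1.0 percentage points from 10,100 households and ±0.8 from 15,000, which is why respondents pooling two years of data saw option B as the more precise choice.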

The splitting of topics, and the resulting loss of functionality to explore relationships and outcomes across all variables, were also noted. As a result, NHS Health Scotland noted a reduced capacity to examine inequalities in the social determinants of health.

Option B

The main reason respondents preferred option B was the retention of data collection, and the availability of data, on an annual basis for the majority of topics. This was felt to be of particular benefit to the NPF. Another reason was that option B would provide a larger sample size, and hence a higher level of precision, over a two-year period than option A.

Other reasons that option B was preferred included:

  • the maintenance of the full functionality of the survey (in terms of cross-tabulations and the ability to analyse relationships and outcomes);
  • an increase in the sample size for 'one third sample size' questions.

Some of those who preferred option B also noted that measuring and detecting change over time would be less complex than under option A, where there would be gap years. They acknowledged the loss in precision when comparing annual estimates, but argued that trends and real change are best assessed over a number of years without any gap years in the data.

In terms of the negative impacts of option B, around four in ten local government respondents noted either limitations with the current sample size or that it was already too small, and that option B would only exacerbate this. Reference was made to Improvement Service analysis of the LGBF satisfaction with services indicators, which showed that three-year rolling averages would be needed to deliver the 'required level of precision' at a local level and for the general population rather than service users.

Nevertheless, a few local government respondents noted that the larger two-year sample size (compared to option A) and the increase in precision offered by two-year rolling averages would be useful. These respondents recognised that such averages would make it easier to identify differences between local authorities, but more difficult to identify short-term change within a local authority.

In terms of the loss of interview time under option B, several third sector respondents worried about the loss of questions on their topics of interest, whilst several respondents across different sectors noted that they might need to commission alternative data sources if their questions of interest were dropped from the SHS.

Several respondents expressed a general concern about the possible impact on sub-group breakdowns on an annual basis (without considering pooling a two-year sample). There were mixed views on the impact of option B on equalities analysis, which in turn influenced option preferences.

Equalities analysis under options A and B

The Scottish Government equalities analysis team had a slight preference for option A as this would mean more precise annual counts of equality groups. However, the Equalities and Human Rights Commission (EHRC) and NHS Health Scotland preferred option B for equalities analysis, as did Sport Scotland for the equalities analysis of the Active Scotland Framework. This was due to the ability to pool two consecutive years' worth of data, which would allow finer equalities breakdowns and/or a greater level of precision compared to option A.

Implementation of option A

The majority of respondents did not state a preference for which topics should be covered in 2017 versus 2018 should option A be implemented, despite being asked. Some respondents took the opportunity to specify topics they would like to see asked in both years, whilst others noted that it would be important to maximise the opportunities for key cross-tabulations by carefully considering which topics should go together in odd and even years.

Implementation of option B

On option B and how to achieve the reduction in topic coverage, 30 per cent of respondents preferred introducing more biennial topics and questions, closely followed by reducing the breadth of larger topics (just over a quarter). Cutting topics and introducing more one third sample questions were the least popular options (6 per cent and 11 per cent respectively). Close to a third of all respondents did not answer this question.

Under option B, over half of all respondents preferred local authority estimates to be produced and published every year on a two-year rolling average basis. 29 per cent of respondents did not answer the question, whilst 17 per cent preferred estimates on a two-year basis every two years.

Looking ahead

In terms of any further reductions that might be required, a reduction in topic coverage was the preferred way to achieve savings (22 per cent), closely followed by a reduction in the frequency of data collection (18 per cent) and 'other' (19 per cent), most of whom preferred a combination of options. A reduction in sample size was the least popular option. However, it should be noted that 30 per cent of respondents did not provide a preferred way to achieve savings; a number of these noted that further reductions would erode user confidence in the survey.

Respondents were asked what the impact would be if the SHS sample size, frequency of results or topic coverage were reduced further. Many respondents answered in a generic manner, given the lack of detail on specific reductions. The biggest concern, raised by around one in five respondents, was a loss of precision if the SHS sample size was reduced further. This in turn would reduce the use of, value of and confidence in the SHS, which could (and in some cases would) force users to stop using the SHS altogether and seek out alternative sources.

Other comments on the consultation and/or the SHS

Letters from the LGBF Board, Accounts Commission, COSLA and SOLACE (preference for neither option), together with some responses from individual local authorities, expressed concern about the scope of the consultation in terms of its focus on options A and B. Many respondents called for a 'pause' in order to carry out a 'fundamental' review of the SHS in the context of the wider SG survey landscape.

In terms of other comments on the consultation and/or the SHS, the third sector called for a larger and more robust survey, whilst local government mainly wanted to prevent any further reductions to the sample size of the survey. Several of the latter respondents also noted limitations in the current local authority sample sizes.

