3. Methods for evaluating the HES Homecare pilot
To evaluate the pilot outcomes against the aims laid out above, this report draws on evaluation materials collected throughout the HES Homecare pilot. Evaluation activities included technical monitoring to assess changes in internal temperature before and after the service, and a social survey with service recipients to monitor shifts in comfort and behaviours in the home. Both the technical monitoring and the social survey were carried out in homes receiving the HES Homecare service and in homes receiving the standard CLO service, for comparative purposes. In-depth interviews with those involved in delivering the HES Homecare service, a series of case studies, and a live learning document compiled by the HES Homecare team have also contributed to the evaluation.
The initial ambitions for the evaluation of the pilot were to include a statistically representative sample of residents receiving the Homecare service, and to collect data from a matched control group of residents receiving the standard CLO service. Based on initially forecast numbers, it was anticipated that 69 households would be selected from an anticipated 220 households receiving the service. These households would take part in a social survey and have Tinytag monitors installed to measure internal temperature both before and after any interventions took place (interventions included advice and/or the installation of physical measures). However, the timing of the pilot limited the pool of householders from which evaluation participants could be drawn. Only those that had come into contact with the pilot service by October 2017 were to be included in the evaluation, to allow a sufficient monitoring period before the original end-date of the trial; 45 households had been engaged by October 2017. The control group would be identified from vulnerable rural households across Scotland (excluding the areas included in the trial) receiving the CLO service, whilst the intervention sample would be selected from within those areas receiving the HES Homecare service. The realities of delivering the pilot meant that these numbers were not reached for inclusion in the evaluation. Details of the samples achieved for each activity are provided in turn below, along with information about the way that these data have been used for the evaluation.
3.1. Technical monitoring
Tinytag monitors were installed by the Energycarers in 14 of the domestic properties; 11 of these received the HES Homecare service and 3 were CLO clients. The monitors recorded internal temperatures, with the aim of gathering information on whether the HES Homecare service had an impact on internal temperature. The number of properties monitored is much smaller than the 69 anticipated: through interviews (discussed in Section 3.3), the HES Homecare team highlighted that some participants did not feel comfortable having the tags installed, and the rural nature of the pilot meant that installing and collecting the Tinytags could be time consuming. The recruitment process and required monitoring period also limited the number of participants eligible to take part in this aspect of the evaluation. Case Study 6 highlights that changing circumstances over the course of a pilot of this type can also make it difficult to retrieve data-gathering equipment.
The properties monitored for the evaluation had a series of intervention points throughout the monitoring period: 30 instances of advice were given across the 14 properties, mostly before any measures were installed. The types of physical measures installed across the 14 properties are detailed in Table 3.
Table 3: Measures installed across the 11 Homecare and 3 CLO properties with Tinytag monitoring (* = non-heat related measures)
|Installed Measures|No. of Measures|
|---|---|
|Energy Efficient Glazing/Doors|4|
|Electric Heating Upgrade|2|
|Hot Water Cylinder|2|
The provision of advice in conjunction with physical measures makes it difficult to determine whether any impact seen in the analysis stems from the physical measure or from behaviour change following the advice given; however, the social surveys detailed below can contribute to understanding this.
Only three datasets were suitable for analysis; all three were from the Homecare trial. The remaining data were excluded because there was too little pre-installation data, installation dates were missing, or no heat measures (e.g. boilers, insulation or draught proofing) had been installed in the properties monitored (see Table 4). Only temperature sensors were installed, so analysis was limited to internal temperature. Without a full heating season of data both before and after a physical measure is installed, it is not possible to accurately assess the impact of an intervention.
Table 4: Analysis status for the 14 properties monitored using Tinytags.
|Analysis Status|No. of Properties|
|---|---|
|Analysed|3|
|Not Analysed (No Pre-installation Data)|7|
|Not Analysed (No Heat Measure Installed)|2|
|Not Analysed (Not Enough Data)|1|
|Not Analysed (No Installation Dates)|1|
3.2. Social surveys
The social survey was developed by the University of Edinburgh's evaluation team. Many of the questions in the survey instrument had been developed in other research settings, including the Wyndford estate in Glasgow and diverse local authorities across Scotland. The survey was adapted for this pilot evaluation, in collaboration with those delivering the HES Homecare service, to include questions on energy advice. There are two versions of the survey: the first is intended to be completed before householders receive the service or intervention (Time 1) and the second after (Time 2). The surveys were designed to be completed by householders with the support of an interviewer - in this case the Energycarers.
Social surveys were completed with 17 households ahead of receiving the HES Homecare service (Time 1); 13 of this group also completed surveys after any intervention (Time 2). Fewer surveys than anticipated were returned for the evaluation. During interviews with the HES Homecare team, it was clarified that only participants deemed to be most willing or able to complete the surveys were asked to take part. The HES Homecare team discussed how some of the vulnerable people in the HES Homecare trial struggled to complete the survey:
"one was, 'On a sliding scale of one to five, how do you feel about this?' and people would kind of sit there and think about it and be like, 'I'm not really sure.' And also, like, they would then go off on a tangent and start speaking about something. So the surveys that were only designed to take half an hour ended up taking at least an hour"
Case Study 3 further highlights the vulnerability of some of the HES Homecare clients and the distressing effect that something like a survey might have on them. This means that the survey sample is unlikely to be representative of all of those who received the HES Homecare service. Indeed, when analysing the surveys, it was noted that some of the responses were very positive compared to typical responses observed in surveys of this type. This may be explained by which individuals took part in the exercise, the risk of interviewers interpreting the meaning of uncertain responses to survey questions, and participants reflecting positive experiences of the pilot.
Further, the above quote highlights that the surveys could take longer than originally anticipated to complete. During interviews with the HES Homecare team, it was suggested that these took between 1 hour and 1.5 hours. A related challenge for the HES Homecare team was that some of the questions in the survey would be asked anyway through the HES interactions; this could be beneficial, but also problematic:
"…there's two ways of looking at that. One is that we were asking those questions anyway so it wasn't too much of an extra ask. And the other was, "Well we're already asking those questions and now you want us to ask them again in a slightly different context." So that's challenging. We tried to square the two"
The Homecare team commented on successive iterations of the draft survey and were able to restructure questions to suit the sample of people likely to be involved, although this was limited by the requirement for consistency with projects beyond this pilot in which the survey was also being used. The questions were mostly adapted from similar surveys and had therefore been pre-tested with low-income, elderly and vulnerable households.
Table 5: Time 1 and Time 2 social surveys according to intervention and control groups and region.
|Group|Survey|South West|North East|Highlands|Orkney|Total|
|---|---|---|---|---|---|---|
|Intervention: Homecare|Time 1|7|10|0|0|17|
|Control: Standard CLO|Time 1|1|0|5|1|7|
Seven households in receipt of the standard CLO service took part in Time 1 surveys, and four of these went on to complete Time 2 surveys as CLO clients (see Table 5). A fifth household completed the Time 2 survey as a HES Homecare client, because it was initially offered CLO support and subsequently supported through HES Homecare after the pilot area was extended. In both the Homecare and CLO groups, there is a reduction in the number of participants between Time 1 and Time 2 due to natural attrition in a service of this nature: for example, people dropping out, being unable to continue with the programme for health reasons, or becoming uncontactable. The intervention samples were drawn from the regions where the HES Homecare pilot was being trialled, and the control group included participants in the Highlands and Orkney (see Table 5). The surveyed groups have quite different characteristics in terms of age, tenure, household composition, and property type (see Appendix 2). It is recognised that these characteristics are determined by the individuals in receipt of the HES Homecare and standard CLO services, but the differences in the samples make it difficult to draw direct comparisons between the intervention and control groups. This means that the objective of a matched sample of treatment and control households was not achieved, so it is not possible to conduct a systematic evaluation of the survey data. Instead, the data presented in Section 4.3 are necessarily impressionistic and tentative.
3.3. In-depth interviews
Semi-structured interviews were conducted with those involved in the HES Homecare pilot. This included the HES Homecare coordinator and the two Energycarers delivering the service in South West and North East Scotland. The interviews were conducted in person or over the phone, and were between 45 minutes and 2 hours in length. The interviews discussed the processes of delivering the HES Homecare service and the successes and challenges of working on this pilot; a full interview schedule is included in Appendix 3. The interviews were audio recorded and transcribed verbatim.
3.4. Case studies & live learning document
The HES Homecare team compiled a series of case studies to capture specific details of the customer journeys that people went on through the service. These provide additional detail on the health and domestic circumstances of those targeted through the HES Homecare service, the types of recommendation that were made by the Energycarers, and the subsequent interventions that people received. A selection of case studies has been used in this evaluation to supplement the temperature monitoring, social survey, and interview data, and to build a fuller picture of the service. The individual case studies referred to in this report are included in Appendix 4. A live learning document was also maintained by the HES Homecare team; this sought to capture lessons from their experience of the pilot and was shared with the evaluation team for use in this report.
In the next section, data collected through each of these evaluation activities are brought together to explore the delivery and impacts of the HES Homecare pilot. The results are presented in relation to: delivering the service; changes to internal temperature; changes to comfort in the home; and wider impacts of the service.