Connecting Scotland: phase 1 evaluation

This report evaluates the impact of the Connecting Scotland programme for users in the first phase of delivery.


Evaluation approach

To understand how, and to what extent, the Connecting Scotland programme is having an impact on those receiving support, evaluative research is being carried out throughout the life cycle of the programme, led by Scottish Government user researchers. The evaluation chiefly focuses on collecting and analysing data on the experiences of those who have received equipment and support from Connecting Scotland. The main data collection methods are:

  • analysis of application data – i.e. the information included on application forms for each of the Connecting Scotland phases that explains the needs of the clients being referred to Connecting Scotland
  • a 'welcome / baseline survey' – to learn about the user and understand their situation and their needs when they are enrolled
  • follow-up interviews – to understand, in detail, users' experience of Connecting Scotland
  • impact survey – conducted around 9 months after users' involvement, asking what has changed for them by being digitally connected
  • telephone top-up survey – an added measure being piloted for phase 1 to validate the findings of the impact survey by taking an additional sample using a different channel

Where are we in the evaluation?

Table 1 Overview of research and evaluation activities in Connecting Scotland

Phase                   Activity            Status                           Response rate
Phase 1: 8,061 users    Welcome survey      Complete                         13.5%
+ 500 pilot[2]          Interviews          Complete                         n=37
                        Impact survey       Complete                         5.7%
                        Telephone top-up    Complete                         n=57
Phase 2: 23,481 users   Welcome survey      Complete                         5%
                        Interviews          Complete – undergoing analysis   n=40
                        Impact survey       Complete                         4.8%

The most comprehensive user data collected so far pertains to phase 1 users, for whom a welcome survey, impact survey and follow-up qualitative interviews have been conducted; this phase is therefore the focus of this report. Research activities for phase 2 are still ongoing and will be the subject of the next evaluative report.

Methods

Analysis of application data

Applications for devices from organisations, on behalf of their clients, contain information about people's digital support needs and the barriers they face in getting online.

Application data was analysed and coded with the aid of NVivo software to help identify the key themes relating to users' experience of digital exclusion and their support needs.

Analysis of application data helps to show that Connecting Scotland is addressing barriers to digital inclusion and targeting support to people who need it.

Surveys

People getting support through Connecting Scotland were invited to complete the welcome survey shortly after receiving their devices. Surveys are run on an online platform called 'Questback'. Because people being supported by Connecting Scotland are still developing their digital skills, digital champions were enlisted to help participants complete the online surveys. Participants who had registered an e-mail address with Connecting Scotland when they received their devices were sent an e-mail inviting them to complete the survey by following a link.

Users are invited to complete a second survey – the 'experience and impact survey' – after owning their devices for around 9 months, although the different pace of device distribution among applicant organisations meant that some users had had their devices for slightly more, or less, time than this when surveyed. For the impact survey, text messages with the survey link were sent to those who did not have an e-mail address but did have a mobile number. Those with only a landline number were called and asked if they wanted to complete a shorter survey over the telephone.
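As a rough illustration of the contact routing described above, the sketch below (in Python, with hypothetical field names) shows how a participant's invitation channel might be chosen; the actual distribution process used by the programme is not specified in this report.

```python
# Illustrative sketch only: choose a survey invitation channel per
# participant, mirroring the routing described above. Field names
# ("email", "mobile", "landline") are hypothetical.

def survey_channel(participant: dict) -> str:
    """Return the channel used to invite a participant to the impact survey."""
    if participant.get("email"):
        return "email_link"       # e-mail invitation with a survey link
    if participant.get("mobile"):
        return "sms_link"         # text message with the survey link
    if participant.get("landline"):
        return "telephone_short"  # call offering a shorter phone survey
    return "not_contactable"

# Example usage
print(survey_channel({"email": "user@example.com"}))  # email_link
print(survey_channel({"landline": "0131 000 0000"}))  # telephone_short
```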

The surveys allow us to quantitatively measure aspects of users' experience as a whole. This gives an overview of how phase 1 of Connecting Scotland has been perceived and the impact it is having on users.

Responses to survey questions were analysed and interpreted in relation to other responses, both within and between the two surveys. This allows us to gauge impact, and potentially provides some context for better understanding responses to certain questions. For example, comparing data between the welcome and impact surveys indicates that users' confidence has increased over time. We can also, arguably, better understand why, for instance, the use of some online services was not reported more widely when we account for people's concerns around online safety as indicated by responses to other questions; people might be reluctant to use services where entering personal information is required.
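As an illustration of this kind of between-survey comparison, the sketch below uses hypothetical data and column names; the underlying survey microdata is not published in this report.

```python
# Illustrative sketch only: comparing a confidence measure between the
# welcome and impact surveys. Column name and responses are hypothetical.
import pandas as pd

welcome = pd.DataFrame({"confident_online": ["yes", "no", "no", "yes"]})
impact = pd.DataFrame({"confident_online": ["yes", "yes", "no", "yes"]})

def share_confident(df: pd.DataFrame) -> float:
    """Proportion of respondents reporting confidence online."""
    return (df["confident_online"] == "yes").mean()

print(f"Welcome survey: {share_confident(welcome):.0%}")
print(f"Impact survey:  {share_confident(impact):.0%}")
# A higher share in the impact survey is consistent with confidence
# increasing over time, though the anonymous samples may differ.
```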

Comparability between survey data

There are features of how the research was set up that constrain the ability to make direct comparisons between the phase 1 welcome survey and the phase 1 impact survey. Chief amongst these is that the surveys are anonymous, so we cannot work out whether the samples for the welcome survey and impact survey overlap or are distinct[3]. However, we can take a measure of confidence from the similarities in the demographic profile of respondents between these surveys, which suggests the underlying samples may be similar. In each, around a third of respondents report being retired (phase 1 welcome survey 33%, phase 1 impact survey 30%), and around a quarter report being disabled (phase 1 welcome survey 26%, phase 1 impact survey 22%). There were more female respondents (62%) than male respondents to the impact survey. The phase 1 welcome survey employed different age ranges to the impact survey, so direct comparison of age is not possible, but the age profile too appears to have been broadly maintained. Additional interviews have been conducted over the telephone to help redress any biases in the data.
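One way to formalise this profile check, sketched below as an assumption rather than a description of the analysis actually performed, is a two-proportion z-test on the share of retired respondents; the sample sizes used are approximations derived from the published response rates (13.5% and 5.7% of roughly 8,561 phase 1 users).

```python
# Illustrative sketch only: a two-proportion z-test checking whether the
# share of retired respondents differs between the two anonymous samples.
from math import sqrt
from statistics import NormalDist

def two_proportion_z(p1: float, n1: int, p2: float, n2: int):
    """z statistic and two-sided p-value for H0: p1 == p2."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# 33% retired of ~1,156 welcome respondents vs 30% of ~488 impact respondents
z, p = two_proportion_z(0.33, 1156, 0.30, 488)
print(f"z = {z:.2f}, p = {p:.3f}")
# A non-significant difference would support (though not prove) the
# two samples having a similar demographic profile.
```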

Qualitative interviews

Around 40 semi-structured interviews were conducted over the telephone with phase 1 end users. Respondents were recruited through organisations that had received support from Connecting Scotland.

Interviewees were sent a form explaining the reason for the interview and informing them of their rights as participants, including assurances of anonymity in any reporting of the interviews. Participation was completely voluntary and interviewees could withdraw at any time.

As well as the interviewer and interviewee, a note-taker was on each call to make a detailed record of the conversation, including capturing direct quotations where possible. These notes were anonymised and assigned individual codes to enable analysis.

In these interviews, respondents were asked questions about their thoughts, perceptions and experiences of the Connecting Scotland programme. The semi-structured interview format meant that respondents could be asked more in-depth questions about aspects of the service that were important to them, and could explain why, or how, they were using their devices and the impact that this had.

Interview notes were read, manually coded and cross-referenced with other responses to identify common themes throughout the sample.
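As a simple illustration of this tallying step, the sketch below counts hypothetical theme codes across anonymised interview notes; the actual coding frame used in the evaluation is not reproduced here.

```python
# Illustrative sketch only: surfacing common themes by counting manually
# assigned codes across anonymised interview notes. Codes are hypothetical.
from collections import Counter

coded_notes = {
    "INT-01": {"mental_health", "staying_connected", "online_safety"},
    "INT-02": {"staying_connected", "job_seeking"},
    "INT-03": {"mental_health", "staying_connected"},
}

theme_counts = Counter(theme for themes in coded_notes.values() for theme in themes)
for theme, count in theme_counts.most_common():
    print(f"{theme}: mentioned in {count} of {len(coded_notes)} interviews")
```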

Qualitative interviewing enables us to find out about aspects of people's experience that might not have been considered in the design of the programme, or when planning research activities. The data from interviews can also, in some cases, add to our understanding of the survey data; sometimes interviewees explain in greater depth the reasons for how they are using their device or accessing support. The data from qualitative interviews can also help us to strengthen certain hypotheses about the impact of the programme, for example, when people describe the effect that their ability to connect with others has had on their mental health.

Top-up survey

To help validate the findings of the impact survey that was delivered online, we implemented a telephone top-up survey to provide an alternative channel for participants to engage with the research. The main aim was to mitigate some of the potential sources of bias, outlined in the previous section, that may be associated with a digital survey.

Delivering a survey over the telephone is more time-consuming than completing one online, so we shortened the survey to ensure that it remained comfortable to answer over the telephone. In the top-up survey, we prioritised questions that related to the key metrics for phase 1 of Connecting Scotland (being able to cope with the Covid response), and demographic questions so that we could understand any differences between samples. We piloted the survey and found that the 5-point Likert questions proved cumbersome to answer on the telephone, so these were shortened to a 3-point scale. The survey took, on average, 16 minutes to complete.
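The report does not specify how 3-point telephone responses were reconciled with 5-point online responses; one plausible approach, sketched below purely as an assumption, is to collapse the 5-point online responses onto the 3-point scale before comparison.

```python
# Illustrative sketch only: one plausible way to collapse 5-point Likert
# responses from the online survey onto the 3-point scale used by the
# telephone top-up. The mapping actually used (if any) is not stated.

FIVE_TO_THREE = {
    1: 1, 2: 1,  # strongly disagree / disagree -> disagree
    3: 2,        # neither agree nor disagree   -> neutral
    4: 3, 5: 3,  # agree / strongly agree       -> agree
}

online_responses = [5, 4, 3, 2, 5, 1]
collapsed = [FIVE_TO_THREE[r] for r in online_responses]
print(collapsed)  # [3, 3, 2, 1, 3, 1]
```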

Sample selection

Our telephone sample was selected according to the following criteria:

1. participants living in Local Authority areas that had a poor response rate to the online survey compared to the number of devices issued to that region

2. participants who lacked an e-mail address and so would not have received a personal invitation to take part in the research

3. participants who had a landline number, which we took as a proxy for their being less digitally literate and less likely to have completed a survey online (after calling all the landline numbers in the sample, we called mobile numbers that met criteria 1 and 2 above)

Of the 421 people contacted, 57 completed the telephone interview, a response rate of 14%.
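As an illustration of how these criteria might be applied in practice, the sketch below filters a hypothetical participant list; the field names, the response-rate threshold and the example data are all assumptions made for illustration.

```python
# Illustrative sketch only: applying the three selection criteria above.
# Field names, the 4% threshold and the response-rate data are assumptions.

def low_response_area(participant, area_response_rates, threshold=0.04):
    """Criterion 1: lives in a Local Authority area with a poor online
    response rate relative to devices issued."""
    return area_response_rates.get(participant["local_authority"], 0) < threshold

def select_telephone_sample(participants, area_response_rates):
    # Criterion 3: landline numbers are called first...
    landline_first = [p for p in participants
                      if p.get("landline")
                      and low_response_area(p, area_response_rates)
                      and not p.get("email")]  # criterion 2: no e-mail address
    # ...then mobile numbers meeting criteria 1 and 2.
    mobile_next = [p for p in participants
                   if not p.get("landline") and p.get("mobile")
                   and low_response_area(p, area_response_rates)
                   and not p.get("email")]
    return landline_first + mobile_next

# Example usage with made-up areas and participants
areas = {"Glasgow City": 0.03, "City of Edinburgh": 0.06}
people = [
    {"local_authority": "Glasgow City", "landline": "0141 000 0000"},
    {"local_authority": "Glasgow City", "mobile": "07000 000000"},
    {"local_authority": "City of Edinburgh", "mobile": "07000 000001"},
]
print(len(select_telephone_sample(people, areas)))  # 2
```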

Contact

Email: CSresearch@gov.scot
