Connecting Scotland: phase 2 evaluation

Report based on research with people receiving digital devices and support in phase 2 of the Connecting Scotland programme. It discusses people's experiences of the programme and the impact that it has had on them.


Evaluation Approach

To understand the impact that Connecting Scotland is having on users, a programme of evaluative research has been undertaken by Scottish Government researchers. We wanted to learn about users' needs, the barriers people face in using digital devices and the internet, the purposes for which people are using their new devices and connection, and any changes or improvements that have occurred for people since receiving support from Connecting Scotland.

A range of data have been drawn upon as part of this evaluative research. The table below gives an overview of the methods and resources that have provided evidence about Connecting Scotland users in phase 2.

Fig 2. Data used in evaluation of Connecting Scotland Phase 2
Data source: Applications to Phase 2 (including Winter Support)
Description: 1,373 applications from organisations on behalf of people with whom they work. A sample of around 50% was coded and analysed for themes.
Purpose: Reviewing information from applications helps identify the needs of particular groups of users, as well as providing insight into the barriers people face to digital inclusion.

Data source: Welcome survey
Description: Online survey administered shortly after users have received their devices. Response rate = 5%.
Purpose: Provides a broad understanding of people's online engagement and digital skill level, as well as indicating how people might hope to benefit from Connecting Scotland support.

Data source: Impact survey
Description: Online survey administered 6-9 months from receipt of device. Response rate = 4.8%.
Purpose: Indicates the type and range of activities for which people have been using devices, and the extent of engagement with support (i.e. digital champions). Provides information about the kinds of change brought about by engagement with the programme.

Data source: Telephone 'top-up' survey
Description: A shortened version of the impact survey conducted over the phone with users for whom email addresses were not available. N=39.
Purpose: A measure to balance the sample of respondents by including people who may not have had the ability, or inclination, to complete the survey online. Provides a check for possible bias in the online sample.

Data source: Qualitative interviews
Description: Semi-structured interviews conducted over the phone with a sub-sample of users. Took place after the welcome survey, but before the impact survey. N=39*.
Purpose: Allows for more in-depth responses about people's experience of the programme and can provide insights that may not have been previously considered.

*This figure includes 3 written responses to interview questions. One interview was conducted with 3 participants.

Analysis

Each data source has been analysed by the research team to identify the central themes and issues for phase 2 users. Using different methods of data collection allows a nuanced analysis in which evidence from each source can be compared and synthesised, so that themes can be interrogated and developed. For example, the information given in qualitative interviews can help to understand some survey responses in greater depth. If, for instance, surveys indicate an improvement in people's mental health, interview data might explain which particular aspects of involvement in the programme have led to this improvement. The methods for analysing each data source are briefly outlined below.

Surveys

Both the online welcome and impact surveys are hosted on Questback. This platform includes reporting functionality that shows the pattern of responses for each question. Results can also be filtered to show a sub-section of responses.

The surveys provide an aggregate, quantitative measure of people's experience of the programme as well as detailing the characteristics of users, such as their household composition. From the survey results, we can gauge the extent to which the overarching aims of the programme have been realised. Comparing the welcome survey with the impact survey enables analysis of the extent to which things have changed for users since receiving support. Comparing responses across questions in the same survey can allow for broader inferences to be made about users' experience. For example, in the phase 2 impact survey, a majority of respondents described themselves as confident internet users, which may help explain why the majority also said that developing their digital skills was 'self-taught'.

The telephone top-up survey was designed to check that the main impact survey was representative of users' experience. Significant differences in response patterns and/or in the demographic features of participants would indicate that the online survey might be subject to some degree of bias.
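As a rough sketch of how such a check might be run, the snippet below compares response distributions from the online and telephone samples using a Pearson chi-square statistic, written out in plain Python. All counts and categories are invented for illustration; they are not taken from the surveys.

```python
# Illustrative bias check: do the online and telephone samples show a
# similar pattern of responses to the same question? All figures below
# are hypothetical, not from the Connecting Scotland surveys.
online = {"daily": 150, "weekly": 60, "rarely": 30}   # hypothetical online responses
phone = {"daily": 18, "weekly": 12, "rarely": 9}      # hypothetical top-up responses

def chi_square(a, b):
    """Pearson chi-square statistic for two observed frequency tables
    that share the same response categories."""
    n_a, n_b = sum(a.values()), sum(b.values())
    total = n_a + n_b
    stat = 0.0
    for category in a:
        col = a[category] + b[category]
        for observed, n in ((a[category], n_a), (b[category], n_b)):
            expected = n * col / total
            stat += (observed - expected) ** 2 / expected
    return stat

stat = chi_square(online, phone)
# With 3 categories there are 2 degrees of freedom; the 5% critical
# value is about 5.99, so a statistic below that gives no evidence
# that the telephone sample answered differently.
print(f"chi-square = {stat:.2f}")
```

With these invented counts the statistic comes out below the critical value, which is the pattern one would hope to see if the online sample is not biased.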

Application data

Approximately half of all applications to phase 2 were reviewed, and their content coded for themes with the assistance of NVivo software. The themes show commonalities between applications made on behalf of certain groups. From analysing the application data in this way, the needs of target groups, and barriers to their digital participation, can be consistently identified. This helps in understanding the ways in which support from Connecting Scotland might benefit particular groups of people.

Qualitative Interviews

Participants were recruited via organisations that had applied for devices. The sample of organisations contacted was designed to capture a diverse group of users, varying by age and geography, and representing the 3 primary target cohorts: families with young children; older and/or disabled people; and young care leavers.

Semi-structured interviews were conducted over the telephone between August and November 2021. Semi-structured interviewing means that a consistent topic guide is used, but that interviewers have scope to ask additional, or follow-up, questions which provide more detail about users' experiences. As well as the interviewer and respondent, a note-taker joined each call to provide a detailed record of each conversation.

Interview notes were anonymised and distributed among the research team and to staff members from SCVO (the delivery partner). Each person reviewed a small sample of interviews and was asked to summarise the stand-out topics and issues from each. A deliberate overlap was designed into the distribution of notes so that each interview was analysed by more than one person. This provided interpretive balance, and aimed to maximise the breadth of possible insights.

A group coding exercise was then undertaken in which notes were shared and discussed so that topics common to the whole sample could be grouped together into overarching themes. A coding framework was then drawn up, based on the results of this exercise, linking individual insights to wider themes and showing which interview sources exemplified these insights.
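The kind of coding framework described above can be pictured as a simple nested structure: overarching themes containing individual insights, each linked to the interviews that exemplify them. The sketch below uses invented theme, insight, and interview names purely to illustrate the shape.

```python
# A minimal sketch of a coding framework: themes -> insights -> interview
# sources. All names here are invented for illustration, not actual
# themes or interviews from the evaluation.
framework = {
    "Confidence and skills": {
        "self-taught learning": ["interview_04", "interview_17"],
        "support from digital champions": ["interview_09"],
    },
    "Wellbeing": {
        "reduced isolation": ["interview_02", "interview_11", "interview_23"],
    },
}

def sources_for_theme(framework, theme):
    """List every interview cited under any insight within a theme."""
    cited = set()
    for interviews in framework.get(theme, {}).values():
        cited.update(interviews)
    return sorted(cited)

print(sources_for_theme(framework, "Confidence and skills"))
# ['interview_04', 'interview_09', 'interview_17']
```

A structure like this makes it straightforward to trace any overarching theme back to the specific conversations that support it.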

Comparability between welcome and impact surveys

Survey responses are anonymous, which means that we cannot know whether the people responding to the welcome survey have also responded to the later impact survey.

Therefore, we cannot say with certainty that changes and effects evidenced in the impact survey are applicable to the phase 2 cohort as a whole. We can, however, compare the demographic profile of respondents to see if there is congruence between the type of people responding to each survey. Similarities in the characteristics of respondents provide a reasonable basis for comparison where the aim is to identify broad patterns of change.

In terms of age and household composition (number of people, and number of children in household) the survey samples are broadly similar, although marginally more people in the impact survey reported having no children living in the household than in the welcome survey (26% compared to 19%).
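As a rough illustration of how a gap like the 26% versus 19% difference might be assessed, the sketch below applies a standard two-proportion z-test in plain Python. The respondent counts are assumed for illustration only; the report does not state the sample sizes used here.

```python
# Could a 26% vs 19% gap between two survey samples be sampling noise?
# A two-proportion z-test gives a quick indication. Sample sizes below
# are assumed for illustration, not taken from the surveys.
from math import sqrt

def two_prop_z(p1, n1, p2, n2):
    """z statistic for the difference between two sample proportions,
    using the pooled proportion to estimate the standard error."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical: 200 respondents in each survey.
z = two_prop_z(0.26, 200, 0.19, 200)
print(f"z = {z:.2f}")
# |z| below 1.96 means the gap is within what chance alone could
# produce at the conventional 5% level.
```

With these assumed sample sizes the statistic falls below 1.96, consistent with the samples being broadly similar; with much larger samples the same percentage gap would cross the threshold.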

In the impact survey, we asked directly whether respondents were young care leavers, which means we can look at the pattern of response for this group in isolation. However, the same question was not asked in the welcome survey, so we cannot analyse care leavers' responses there.

Both surveys show a wide geographical spread, although the proportions from each local authority vary between the surveys. In both surveys, Glasgow represents the single largest location (which is expected), though there is notable variation in the responses for some local authorities. For example, Edinburgh accounts for just over 5% of responses in the welcome survey, rising to 8% in the impact survey. Conversely, the response from West Lothian almost halves between the welcome and impact surveys.

There are broad similarities in people's self-reported working situation, too, with comparable numbers of retirees, people not working due to disability, and people in education. The most notable difference is in rates of employment: more people in the impact survey reported being employed or self-employed (+7%), with an almost exactly corresponding decrease in the number unemployed. An optimistic interpretation would be that many of the same respondents participated in both surveys, a significant proportion of whom found employment after becoming involved with Connecting Scotland. While this may not be the case, it highlights that variance in responses between the surveys can reflect changes in people's personal circumstances, as well as changes in the sample of respondents.

While there is evidence of some variation between the samples, they broadly represent a similar profile of respondents. The overall response rate to both surveys was also similar. Even without grounds for direct comparability, both surveys are valid data sources in their own right.

Contact

Email: csresearch@gov.scot
