A process evaluation of the implementation of ASSIST in Scotland


Chapter 2: Design and Methods

This chapter outlines the research aim, research questions, design, sample and analytical approach.

2.1 Aim and Scope of the Evaluation

In light of the existing evidence base (discussed in Section 1.2), this is a process (not an outcome) evaluation examining the acceptability and implementation of ASSIST in order to inform potential roll out to other areas of Scotland. The 30-month evaluation was funded by the Chief Scientist Office and the Scottish Government and commenced in August 2014. The overall aim of the evaluation was to assess implementation across the three pilot areas, looking specifically at fidelity and acceptability and drawing out learning that could assist future implementation in other areas. Data were also collected from students regarding their smoking status, but these were not primary outcome measures for this process evaluation.

The study had the following research questions:

1. What are the barriers and facilitators to implementation in Scotland?

2. What refinements are required to implement the ASSIST programme in Scotland?

3. Are essential elements of the ASSIST model maintained during pilot implementation in Scotland?

4. How acceptable is the programme from a stakeholder perspective (for strategic leads, trainers, students and school staff)?

5. What changes in smoking-related knowledge, attitudes and behaviour are observed amongst students in the ASSIST Scotland pilot schools?

6. What are the delivery costs of the programme?

7. What lessons can be learned to inform any future roll out of ASSIST across Scotland?

8. Is there scope to expand the model and look at other risk taking behaviours in Scottish schools in the future, e.g. drugs, alcohol?

2.2 Research Design

This was a mixed methods study consisting of three elements: 1) evaluation of the implementation planning process; 2) evaluation of delivery in schools; and 3) an assessment of costs. A range of participants (school staff, students, peer supporters, site leads/co-ordinators, trainers, and stakeholders from policy and commissioning) were consulted via in-depth interviews, paired interviews, mini focus groups, observation and a self-complete survey. To maximise available resources a two-tier design was used. Tier one included consultation with school leads and a pre and post student survey in 20 schools from the three NHS Boards that took part in the pilot (Greater Glasgow and Clyde; Lothian; and Tayside). Tier two comprised six case study schools (two from each pilot site, selected from the 20 tier one schools) in which a researcher observed the entire cycle of ASSIST delivery, examining intervention fidelity and consulting with peer supporters and other students via mini group discussions. Research methods used to assess each element are summarised in Table 2.

Where possible the student survey was administered by the research team via a special assembly for the whole year group. In two schools this was not possible and teaching staff administered the survey during class time (these staff were invited to a face-to-face survey briefing by a member of the research team prior to delivery). The baseline survey was piloted before roll out and additional questions for the follow-up survey were tested with a group of S2 students.

Table 2: Summary of research design

Research method | Strategic* | School leads | Trainers | Peer supporters | Students
Semi-structured/in-depth interviews | X | X (pre & post) | X (pre & post) | |
Pre & post survey | | | | X | X
Observation of entire cycle of ASSIST | | | X | X |
Paired interviews/mini groups | | | | X | X
Desk-based review of cost data | | | | |

*Scottish Government, DECIPHer-ASSIST, NHS Boards, Education, Site Co-ordinators

2.3 Sample

The school sample was selected using non-probability sampling techniques. Our aim was to examine the acceptability and fidelity of intervention delivery, not effectiveness, so a random (probability) sample was not required. In addition, the delivery model for ASSIST was phased over three school years, which meant there was uncertainty around the exact number of schools taking part and when they would receive ASSIST; this could have resulted in an incomplete sampling frame. In light of this, and in consultation with the Research Advisory Group, the following quota sampling criteria were agreed:

1. A minimum of five schools from each NHS Board

2. A maximum of five schools will be in a rural area

3. A maximum of five schools will be in less deprived areas

4. The six case study schools will be spread out evenly across the sites i.e. two in each NHS Board

5. A minimum of eight schools will have ASSIST delivered to S1 students

6. A minimum of eight schools will have ASSIST delivered to S2 students

This means that findings from the student survey are not directly generalisable to the wider school population. Table 3 presents an overview of the 20 schools that participated and how they compared with the sample criteria described above. All sample criteria were met except one: seven schools, rather than the minimum of eight, had ASSIST delivered to S1 students. There are two reasons this criterion was not met. First, all Glasgow schools were working with S2 students only; second, two Lothian schools previously identified as working with S1 students were changed to S2 (in one school the Head Teacher requested this change and in the other the S1 school roll was too small to be included). The Scottish Government provided Scottish Index of Multiple Deprivation (SIMD)[6] data showing the distribution of S1 and S2 pupils by home postcode. This was used to identify schools with a larger proportion of pupils from more deprived areas. Categorisation of urban/semi-rural/rural areas was based on fieldwork observations of the school and surrounding area.

Table 3: School survey sample information

School ID | NHS Board | Rural/semi-rural or urban | Deprived area | Case study | School year | School year roll | Baseline sample N | Baseline sample %* | Follow-up sample N | Follow-up sample %** | Mode of delivery
2 | Site 3 | Urban | Yes | No | S2 | 79 | 60 | 75.9 | 37 | 46.8 | In class by teacher
3 | Site 3 | Urban | Yes | No | S2 | 159 | 143 | 89.9 | 118 | 74.2 | Special Assembly
4 | Site 3 | Urban | Yes | Yes | S2 | 225*** | 151 | 67.1 | 137 | 60.8 | In class by teacher
9 | Site 3 | Urban | Yes | No | S2 | 134 | 114 | 85 | 106 | 79.1 | Special Assembly
10 | Site 3 | Urban | Yes | No | S2 | 191 | 173 | 90.5 | 168 | 87.9 | Special Assembly
17 | Site 3 | Urban | Yes | Yes | S2 | 167 | 140 | 83.8 | 127 | 76 | Special Assembly
1 | Site 1 | Semi-rural | No | Yes | S1 | 107 | 90 | 84.1 | 84 | 78.5 | Special Assembly
12 | Site 1 | Urban | Yes | Yes | S2 | 138 | 125 | 90.5 | 97 | 70.2 | Special Assembly
15 | Site 1 | Urban | No | No | S2 | 138 | 132 | 95.6 | 121 | 87.6 | Special Assembly
18 | Site 1 | Urban | Yes | No | S2 | 57 | 44 | 77.1 | 40 | 70.1 | Special Assembly
19 | Site 1 | Urban | Yes | No | S2 | 58 | 41 | 70.6 | 37 | 63.7 | Special Assembly
20 | Site 1 | Urban | Yes | No | S1 | 121 | 107 | 88.4 | 63 | 52 | Special Assembly
5 | Site 2 | Urban | Yes | Yes | S1 | 176 | 160 | 90.9 | 149 | 84.6 | Special Assembly (exam conditions)
6 | Site 2 | Urban | Yes | No | S1 | 200 | 185 | 92.5 | 161 | 80.5 | Special Assembly
8 | Site 2 | Semi-rural | Yes | No | S1 | 188 | 169 | 89.8 | 52**** | 27.6 | Special Assembly
13 | Site 2 | Rural | No | No | S1 | 110 | 90 | 81.8 | 89 | 80.9 | Special Assembly
14 | Site 2 | Rural | Yes | No | S2 | 147 | 123 | 83.6 | 109 | 74.1 | Special Assembly
16 | Site 2 | Semi-rural | No | No | S2 | 179 | 168 | 93.8 | 148 | 82.6 | Special Assembly
21 | Site 2 | Semi-rural | Yes | Yes | S1 | 204 | 180 | 88.2 | 167 | 81.8 | Special Assembly
22 | Site 2 | Semi-rural | Yes | No | S2 | 147 | 130 | 88.4 | 123 | 83.6 | Special Assembly

* Percentage of eligible students (school year roll column) who completed a baseline survey.

** Percentage of eligible students (school year roll column) who completed a follow-up survey.

***The number of surveys completed at baseline in school 4 (n=151) was low in comparison with the number of eligible students (225, school year roll column). This is because the school was not able to accommodate a special assembly and we were reliant on class teachers administering the survey, some of whom did not.

**** Several students were on a school trip on the day the follow-up survey was conducted, which the school lead had not been aware of. Unfortunately, it was not possible to facilitate self-completion for these students.
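As the notes above indicate, the baseline and follow-up response rates in Table 3 are simply the number of completed surveys divided by the eligible school year roll, expressed as a percentage. For readers wishing to reproduce the calculation, a minimal sketch in Stata (the package used for the quantitative analysis) is shown below; the variable names (school_id, year_roll, baseline_n, followup_n) are hypothetical and are not the names used in the study dataset.

* Illustrative sketch only: hypothetical variable names for a school-level dataset
gen pct_baseline = 100 * baseline_n / year_roll
gen pct_followup = 100 * followup_n / year_roll
format pct_baseline pct_followup %4.1f
list school_id year_roll baseline_n pct_baseline followup_n pct_followup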

2.4 Participant Information

Across the 20 schools, 2925 students were eligible to participate in the baseline survey, of whom 2491 completed a questionnaire. At follow-up the number of eligible students was 2491, of whom 2130 took part, representing 15.6% lost to follow-up. As illustrated in Table 4, there was fairly even representation by gender and school year at baseline, but fewer S1 students at follow-up (as noted in Table 3, school 8 had a very low response rate at follow-up).

Table 4: Survey respondents

 | Baseline N | Baseline % | Follow-up N | Follow-up %
Gender: Boys | 1250 | 50 | 1064 | 51
Gender: Girls | 1247 | 50 | 1041 | 49
Gender: Total | 2497* | 100 | 2105 | 100
School year: S1 | 1011 | 41 | 789 | 38
School year: S2 | 1480 | 59 | 1311 | 62
School year: Total | 2491* | 100 | 2100 | 100

*28 and 34 children did not provide information on their gender or school year, respectively

In addition to the quantitative sample, the following qualitative data were collected:

  • 41 interviews with 24 members of school staff (all bar one face-to-face; participants included deputy head teachers, principal teachers, subject teachers, and pupil care and support teachers);
  • baseline interviews or focus groups with 31 trainers;
  • six mini group discussions with 29 students;
  • 15 stakeholder interviews with 17 participants (face-to-face and telephone) who held a strategic, planning, commissioning, delivery or policy role;
  • structured observation of the delivery of an entire cycle of ASSIST in six schools.

2.5 Ethics and Informed Consent

The study was approved by the University of Stirling Ethics Committee on the 4th September 2014. To conduct research in schools we also had to apply for permission from each Local Authority (eight in total). Once this was granted, we were able to approach the schools directly to invite them to participate in the study. Informed consent was obtained from school leads and trainers. Parents were given written information about the study and an opportunity for their child to opt out of the research. Student consent was also obtained.

2.6 Analysis and Reporting Style

In light of the mixed methods study design, data collected from each stakeholder group were analysed separately but, where possible, key findings are presented thematically across groups. As such, the report is written in a mainly qualitative rather than quantitative style, with tables and figures kept to a minimum in the results section (summary tables, which the reader can refer to, are presented in the appendix).

2.6.1 Qualitative analysis

Analysis of qualitative interviews and mini group discussions was conducted using a structured thematic approach, based on systematic coding of verbatim transcripts organised and managed in QSR NVivo 11. Coding frames for each stakeholder group were jointly developed, piloted and amended by members of the research team prior to full coding of the relevant transcripts. Key themes arising from the coded data for each stakeholder group were identified and reported alongside those of the other groups, highlighting both similarities and differences.

Case study observations were recorded using a structured observation form covering all four stages of ASSIST delivery (peer nomination, peer recruitment meeting, training days and follow-up sessions). Data from the completed observation forms were entered into an MS Word template populated with details from each stage of delivery. This enabled assessment of key measures of fidelity and also contributed to the thematic analysis (e.g. observation data regarding behaviour management). The template was piloted and refined by two researchers before being populated.

2.6.2 Quantitative analysis

Student survey data were entered into MS Excel. The Excel data were checked for quality and consistency before being 'locked' for analysis. This locked dataset was archived as the designated version used for the analysis and then imported into Stata V14. The data structure contained three possible units of analysis: geographical area, school and individual pupil. The descriptive analysis for this study focused primarily on the individual pupil level, and almost all of the variables were categorical. For pupils who completed the surveys at both baseline and follow-up, missing data were minimal, with almost all items having completion rates greater than 95%. At follow-up some questions were only relevant to students who had spoken to a peer supporter. Nuisance responses (e.g. fabricated responses) and unlikely outliers were sparse and were re-categorised to missing, or to the next nearest likely category, following discussion with the research team. Tabulated data were produced for both the baseline and the follow-up data. Baseline and follow-up data were then match-merged on the unique anonymised pupil ID number and a series of cross-tabulations produced.
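To make the final steps concrete, the sketch below shows one way the match-merge and cross-tabulation described above could be carried out in Stata. It is illustrative only: the file names, the pupil identifier (pupil_id) and the smoking status variables (smoke_base, smoke_fup) are hypothetical and are not the names used in the study dataset.

* Illustrative sketch only: hypothetical file and variable names
* Import the checked ('locked') baseline and follow-up extracts from Excel
import excel using "baseline_locked.xlsx", firstrow clear
save baseline.dta, replace
import excel using "followup_locked.xlsx", firstrow clear
save followup.dta, replace

* Match-merge on the unique anonymised pupil ID
use baseline.dta, clear
merge 1:1 pupil_id using followup.dta
keep if _merge == 3      // keep pupils with both a baseline and a follow-up survey
drop _merge

* Cross-tabulate baseline smoking status against follow-up smoking status
tabulate smoke_base smoke_fup, row missing

Merging 1:1 on the anonymised ID restricts the cross-tabulations to pupils who completed both surveys, mirroring the matched analysis described above.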
