Scotland's Climate Assembly - process, impact and assembly member experience: research report

Mixed methods research into Scotland’s Climate Assembly, including process, impact and assembly member experience.


Appendix 1 - Methodology

The research team

The main team comprised Scottish Government social researchers: Dr Nadine Andrews (team leader), Gemma Sandie and Scott McVean. This team worked in collaboration with Dr Stephen Elstub from Newcastle University and with Alexa Green from Edinburgh University/Scotland's Rural College, who provided their services free of charge. For the early part of the research programme, strategic support was provided by Scottish Government official Nick Bland, and research support by Dr Evelyn Bower. Further research support was provided by PhD interns Hannah Gracher and Julia Chan.

Deltapoll was commissioned to conduct a population survey in Scotland and analyse the data.

Blake Stevenson was commissioned to provide analytical support. Their analysis of the non-participant observation notes and of the transcripts of small group discussions has been drawn upon in this report.

A Research Advisory Group was formed in January 2021. Members external to the Scottish Government were: Pamela Candea (Scottish Communities Climate Action Network), Stuart Capstick (CAST), Chris Shaw (Climate Outreach), Graham Smith (University of Westminster) and Rebecca Willis (Lancaster University).

Research questions

1. How do the following factors affect the quality of the Assembly?

  • organisation
  • quality of facilitation
  • evidence (what & how presented)
  • member participation
  • quality of deliberation
  • members’ views of Assembly

2. To what extent do members feel ownership of the goals & recommendations in the Assembly report?

3. How do members’ views on climate change and how to tackle it, before and after the Assembly, compare with the wider population of Scotland?

4. What do people in Scotland think of the Assembly and its recommendations?

  • awareness & understanding
  • views

5. What impact has the Assembly had on climate change debate and policy in Scotland?

6. a) How has climate change been explained to members?

b) To what extent, and how, has evidence been used in members' deliberations?

c) What are the implications of how climate change has been explained and how evidence has been used for the members' recommendations?

7. What are the outcomes for members of taking part in the Assembly?

8. What do members think of the Scottish Government response to the Assembly report?

Member survey

An online survey was completed by Assembly members before the Assembly started and after each weekend. It comprised mainly closed questions with a few open questions. Table A shows the response rate for each survey. Quantitative data was cleaned, anonymised and analysed in Excel and SPSS. Qualitative data was analysed thematically in NVivo. The results were combined with non-participant observation notes to produce data briefings on each weekend for the Secretariat and Design Team[140].

The demographic and climate attitude profile of respondents by weekend, alongside the profile of all members, is included in the data table published with this report.

Limitations: as not all members completed the survey, the results presented in this report should be regarded as indicative of, rather than representative of, the views of all Assembly members.

Table A. Member survey response rate
Survey | Number of survey respondents | Number of members attending all or part of weekend | Response rate as % of total attendees
Pre-Assembly survey | 68 | 102 (recruited at this time) | 67%
Weekend 1 | 64 | 99 | 65%
Weekend 2 | 65 | 102 | 64%
Weekend 3 | 61 | 100 | 61%
Weekend 4 | 63 | 101 | 62%
Weekend 5 | 70 | 101 | 69%
Weekend 6 | 67 | 102 | 66%
Weekend 7 | 69 | 100 | 69%
Weekend 8 | 51 | 73 | 70%

Table B shows the response rate by topic streams for the relevant weekends.

Table B. Member survey response rate by topic stream
Weekend | Diet, Lifestyle & Land Use | Homes & Communities | Work & Travel | Total respondents
Weekend 3 | 18 (30%) | 22 (36%) | 21 (34%) | 61
Weekend 4 | 21 (33%) | 23 (37%) | 19 (30%) | 63
Weekend 5 | 22 (31%) | 22 (31%) | 26 (37%) | 70
Weekend 6 | 20 (30%) | 20 (30%) | 27 (40%) | 67
Weekend 7 | 21 (30%) | 25 (36%) | 23 (33%) | 69

Non-participant observation: at each Assembly weekend (apart from Weekend 1), three members of the research team observed the whole weekend as non-participants, including plenary sessions, evidence presentations and small group sessions. For Weekends 3, 4 and 5, when the members were split into three topic streams, the researchers also divided across the three streams. Researchers took notes on 56 sessions, which were analysed thematically by Blake Stevenson in NVivo. In total, 978 pages of notes were analysed, with a word count of 430,242. The research team meeting minutes and data briefings were also analysed for observational content.

Limitations: there were 15 small groups, 2 or 3 of which could not be observed due to lack of consent. Each researcher observed 3-4 different groups each weekend. Not all groups in every session were observed.

Qualitative semi-structured interviews and a qualitative survey: these were conducted in May-June 2021 with 18 people involved in organising and delivering the Assembly (members of the Secretariat, the Design Team, the Stewarding Group and the Evidence Group). The interviews, which were mostly around an hour long, were audio recorded, transcribed in intelligent verbatim style and anonymised. The transcripts were analysed thematically in NVivo.

Limitations: for research capacity reasons, six Organising group members completed a qualitative survey rather than taking part in an interview. As the interviews were semi-structured, follow-up questions could be asked, but this was not possible with the survey. Due to time and resource limitations, it was not possible to interview more Stewarding Group members.

Small group discussions: 48 small group discussion sessions across all the main Assembly weekends (WE1 to WE7) were audio recorded and transcribed. Transcripts from a sample of 9 sessions were analysed in Excel for quality of deliberation. Sessions were selected across weekends and with a range of different small group facilitators. The sample covered both mitigation and adaptation topics, and included the three topic streams as well as mixed stream groups.

A total of 1490 contributions (referred to as speech acts) by Assembly members were analysed. The content analysis framework is based on the Discourse Analysis Index[141]:

  • relevance of speech act to the topic under discussion.
  • presence or absence of a 'demand' in speech act (a 'demand' is an idea, suggestion or proposal about what should be done).
  • whether a demand was accompanied by a justification, and if so whether the justification was 'inferior' or 'qualified' ('inferior' is where there is no explicit link between demand and justification; 'qualified' is where there is such a link).
  • whether the justification was oriented towards personal interest, group interest, marginalised group interest, or general interests (also referred to as 'common good').
  • whether the facilitator requested justification of the demand from the Assembly member.

The gender of the Assembly members was recorded in the transcript (based on voice), which enabled analysis by gender.
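To illustrate the structure of this framework, the sketch below shows how each speech act could be recorded and how demands might be tallied by justification quality and speaker gender. It is a minimal illustration with hypothetical field names, not the research team's actual analysis workbook (the coding itself was carried out in Excel).

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical record mirroring the content analysis framework above.
# Field names are illustrative only.
@dataclass
class SpeechAct:
    session_id: str
    speaker_gender: str        # recorded from the transcript, based on voice
    relevant: bool             # relevance to the topic under discussion
    has_demand: bool           # contains an idea, suggestion or proposal
    justification: str | None  # None, "inferior" or "qualified"
    interest: str | None       # "personal", "group", "marginalised" or "general"
    facilitator_probe: bool    # facilitator requested a justification

def tally_demands(acts: list[SpeechAct]) -> dict[str, Counter]:
    """Count demands by justification quality, split by speaker gender."""
    tallies: dict[str, Counter] = {}
    for act in acts:
        if act.has_demand:
            counter = tallies.setdefault(act.speaker_gender, Counter())
            counter[act.justification or "none"] += 1
    return tallies

# Example with dummy data
acts = [
    SpeechAct("WE3-G2", "female", True, True, "qualified", "general", False),
    SpeechAct("WE3-G2", "male", True, True, None, None, True),
]
print(tally_demands(acts))  # {'female': Counter({'qualified': 1}), 'male': Counter({'none': 1})}
```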

Evidence presentations: Scotland's Climate Assembly website provides transcripts of almost all the evidence presentations. A sample of 63 presentations was selected for analysis out of a total of 102 presentations that directly related to climate change. A stratified random sampling method was used to ensure a mix across weekend, topic stream, presenter gender and presenter type (Evidence Group, informant or advocate). The transcripts of these presentations were analysed thematically in NVivo.
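A proportionate stratified random sample of this kind can be drawn as in the sketch below. This is a generic illustration under assumed field names ("weekend", "stream", "presenter_gender", "presenter_type"), not the research team's actual selection procedure.

```python
import random
from collections import defaultdict

def stratified_sample(presentations, n_total, strata_keys, seed=0):
    """Draw a stratified random sample: group presentations into strata defined
    by `strata_keys`, allocate draws to each stratum in proportion to its size
    (rounded), then sample at random within each stratum.

    `presentations` is a list of dicts; the strata field names are illustrative.
    """
    rng = random.Random(seed)
    strata = defaultdict(list)
    for p in presentations:
        strata[tuple(p[k] for k in strata_keys)].append(p)

    sample = []
    for members in strata.values():
        quota = round(n_total * len(members) / len(presentations))
        sample.extend(rng.sample(members, min(quota, len(members))))
    return sample

# e.g. select roughly 63 of the 102 climate-related presentations
# sample = stratified_sample(all_presentations, 63,
#                            ("weekend", "stream", "presenter_gender", "presenter_type"))
```

Because the per-stratum quotas are rounded, a sample drawn this way is close to, but not exactly, proportionate, which is consistent with the limitation noted below.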

Limitations: the sample of presentations analysed is close to, but not exactly, representative of all presentations by weekend, topic stream and presenter type.

Assembly report: the Statement of Ambition, goals, and recommendations with their supporting statements were analysed thematically in NVivo.

Media analysis: 151 media articles published over a 16-month period, from November 2020 to 11 February 2022, were analysed. Data was collected from the following sources: NewsBank, NewsLookUp, NewsNow, Google UK News and the Factiva News database. The Assembly website, which hosts a range of media relating to the Assembly, was also used. The following search terms were used:

  • Scots OR Scottish OR Scotland OR Scotlands OR Scotland's AND "Citizens Assembly" OR "Citizens' Assembly" OR "Citizen’s Assembly"
  • Scots OR Scottish OR Scotland OR Scotlands OR Scotland's AND "Climate Assembly"

When collecting data, the following criteria were used (see the illustrative filter sketch after this list):

  • the article must have been published within the date range above.
  • only online news media were included (excluding blogs, social media posts and print - see Table C below for a list of the media included).
  • all mentions of Scotland's Climate Assembly were included, spanning any geographic area (local, national (Scotland and UK) or international).
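As a purely illustrative sketch, the search terms and criteria above could be applied programmatically along the following lines. The article field names, and the assumption that the collection window opens on 1 November 2020, are hypothetical; in practice the searches were run within the databases listed above.

```python
import re
from datetime import date

# Regular-expression versions of the search terms listed above (illustrative only)
SCOTLAND_TERMS = r"\bScots\b|\bScottish\b|\bScotland'?s?\b"
ASSEMBLY_TERMS = r"Citizens'? Assembly|Citizen's Assembly|Climate Assembly"

def in_scope(article: dict) -> bool:
    """Apply the collection criteria: date range, online news media only,
    and a match for both the Scotland and assembly search terms.
    The article fields used here ("title", "body", "published", "medium")
    are hypothetical."""
    text = f"{article['title']} {article['body']}"
    return (
        date(2020, 11, 1) <= article["published"] <= date(2022, 2, 11)
        and article["medium"] == "online_news"
        and re.search(SCOTLAND_TERMS, text, re.IGNORECASE) is not None
        and re.search(ASSEMBLY_TERMS, text, re.IGNORECASE) is not None
    )
```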

Table C. List of media outlets analysed

Newspaper (31 outlets, accounting for 121 articles)

  • Traditional (tabloid or broadsheet) - newspapers with a print equivalent (22 outlets, accounting for 106 articles): The Herald; The Scotsman; The National; The Telegraph; Financial Times; The Guardian; The Scottish Farmer; The Express; Edinburgh Evening News; The Press and Journal; The Scottish Sun; The Daily Record; The Sunday Post; Edinburgh Reporter; Independent; The Times; Plainsmen Post; Morning Star; The Courier; Express & Star; The Northern Echo; The Daily Mail
  • Online only - news sources which do not have a print equivalent (9 outlets, accounting for 15 articles): BBC; Scottish Construction Now; Scottish Housing News; Third Force News (TFN); STV; FE News; Environment Journal; Resilience; Open Democracy

Magazine (5 outlets, accounting for 10 articles)

  • Online magazines which may or may not have a print equivalent, relating to a particular topic, readership or location: Holyrood; Bella Caledonia; Greater Govanhill; The Good Men Project; Farmers Weekly

Other (16 outlets, accounting for 20 articles)

  • Sources which provide news but do not fit into one of the above categories: Scottish Government; University of Edinburgh; University of Aberdeen; Scottish Land and Estates; Scottish Parliament Information Centre (SPICe); Buergerrat; Circular Communities Scotland; The City of Edinburgh Council; Common Weal; The Scottish Greens; Knowledge Network on Climate Assemblies (KNOCA); Soil Association; Scottish Communities Climate Action Network (SCCAN); Airport Watch; Royal College of Physicians and Surgeons; Scottish Wildlife Trust

The articles were analysed thematically in NVivo. A quality assurance process was conducted, following Braun and Clarke's (2006) six-step approach to coding and theme development[142]. The data was then analysed quantitatively in Excel.

Deltapoll population survey: Deltapoll used both online and telephone (CATI) surveys. The online surveys were completed by a representative sample of 1,667 adults (aged 16+) in Scotland with internet access. This was based on a representative core sample of 1,250 Scottish adults aged 16+, boosted with 200 online surveys with Scottish adults living in rural locations and a further 200 with those aged 16-24, to ensure that the views of these harder-to-reach groups were fully represented. The online survey was supplemented by a telephone survey with 250 non-internet users, giving a total fused sample of 1,917. The survey was completed between 29 July and 14 August 2021.

Recruitment

For both the online and telephone surveys, an active sampling technique was used to draw a targeted sample from a panel of registered target respondents. For the online surveys, Deltapoll worked with Dynata, which has a panel of over 750,000 adults in the UK, including 75,000 in Scotland. Panellists were placed into specific groupings based on a combination of factors including age, gender and region. Potential participants were then selected using a random start, fixed interval technique to generate enough invitations (allowing for expected response rates) to meet the desired sample size. Respondents were invited by email. Typically, around 50% of the panel members invited to a given survey take part. Online participants received points for taking the survey which could be converted into a financial incentive.
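The random start, fixed interval selection described above is a standard systematic sampling technique; a minimal sketch is shown below. The panel identifiers and the invitation multiplier are illustrative assumptions, not Dynata's or Deltapoll's actual procedure.

```python
import random

def systematic_invites(panel_ids, n_invites, seed=None):
    """Random start, fixed interval (systematic) selection from an ordered
    panel list: pick a random starting point within the first interval, then
    take every k-th panellist until enough invitations have been generated.
    """
    rng = random.Random(seed)
    k = max(1, len(panel_ids) // n_invites)  # fixed sampling interval
    start = rng.randrange(k)                 # random start within the first interval
    return [panel_ids[i] for i in range(start, len(panel_ids), k)][:n_invites]

# e.g. generate roughly twice as many invitations as the target of 1,667 online
# interviews, allowing for a typical response rate of around 50%
# invites = systematic_invites(scottish_panel_ids, n_invites=2 * 1667)
```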

The telephone survey used random digit dialling to reach Scots who do not use the internet. No incentives were used for the telephone interviews. Profiling data on these eligible targets was limited, but indicated that just 7% of Scots met this criterion, and that nearly all of them were over the age of 55. In the event, all telephone interviews were indeed with Scots aged over 55.

Quotas and weighting

Under the quasi-random quota sampling method, Deltapoll used a two-stage process to ensure a representative sample. The first stage involved setting quotas, in this case by age, gender, ethnicity and region. The second was data weighting, which corrects for any quotas being under- or over-achieved during fieldwork. Table D shows the target and achieved percentages by selection criteria for the online and telephone surveys.
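As an illustration of the second stage, the sketch below shows a simple cell-weighting approach, in which each respondent's weight is the ratio of a cell's target share to its achieved share, so under-achieved quota cells are weighted up and over-achieved cells down. The column names and single weighting variable are assumptions; Deltapoll's actual scheme (for example, interlocking cells or rim weighting across age, gender, ethnicity and region) is not specified in this report.

```python
import pandas as pd

def cell_weights(responses: pd.DataFrame, targets: dict[str, float],
                 cell_col: str) -> pd.Series:
    """Simple post-stratification (cell) weighting: weight = target share of
    a cell divided by the achieved share of that cell.

    `targets` maps cell labels to target proportions summing to 1; `cell_col`
    is the column holding each respondent's cell label (hypothetical names).
    """
    achieved = responses[cell_col].value_counts(normalize=True)
    return responses[cell_col].map(lambda cell: targets[cell] / achieved[cell])

# Example: correct an achieved age profile back to the target shares
# df["weight"] = cell_weights(df, targets={"16-24": 0.15, "25-34": 0.15,
#                                          "35-44": 0.17, "45-54": 0.18,
#                                          "55+": 0.35}, cell_col="age_band")
```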

Analysis

The data was analysed in SPSS and weighted to Census 2011 data. The results are accurate to within +/-2.2% at the 95% confidence interval.
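Assuming the quoted figure refers to the full fused sample of 1,917, a simple random sample at maximum variance (p = 0.5), and no design effect from quotas and weighting, it is consistent with the standard margin-of-error formula:

$$\text{MOE} = z_{0.975}\sqrt{\frac{p(1-p)}{n}} = 1.96\sqrt{\frac{0.5 \times 0.5}{1917}} \approx 0.022 \quad (\pm 2.2\%)$$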

Table D. Profile of Deltapoll population survey sample (online N = 1,667; telephone N = 250)
Category | Variable | Target % (online) | N achieved (online) | Achieved % (online) | Target % (phone) | N achieved (phone) | Achieved % (phone)
Gender | Male | 48% | 785 | 47% | 48% | 117 | 47%
Gender | Female | 52% | 859 | 52% | 52% | 132 | 53%
Age | 16-24 | 15% | 394 | 24% | 4% | 0 | 0%
Age | 25-34 | 15% | 206 | 12% | 4% | 0 | 0%
Age | 35-44 | 17% | 236 | 14% | 4% | 0 | 0%
Age | 45-54 | 18% | 260 | 16% | 4% | 0 | 0%
Age | 55+ | 35% | 571 | 34% | 96% | 245 | 100%
Region | Central Scotland | 12% | 262 | 16% | 12% | 44 | 18%
Region | West Scotland | 13% | 235 | 14% | 13% | 32 | 13%
Region | South Scotland | 13% | 160 | 10% | 13% | 33 | 13%
Region | Glasgow | 13% | 240 | 14% | 13% | 31 | 12%
Region | North East Scotland | 14% | 260 | 16% | 14% | 29 | 12%
Region | Mid Scotland and Fife | 12% | 173 | 10% | 12% | 29 | 12%
Region | Highlands and Islands | 8% | 102 | 6% | 8% | 21 | 8%
Region | Lothian | 14% | 235 | 14% | 14% | 31 | 12%
Location setting | Urban setting in main Scottish city | 35% | 380 | 23% | - | 30 | 12%
Location setting | Suburban setting | 36% | 284 | 17% | - | 31 | 13%
Location setting | Smaller city or sizeable town | 9% | 245 | 15% | - | 24 | 10%
Location setting | Town, fringe & rural | 29% | 735 | 45% | - | 159 | 65%
Ethnic group | White Scottish/British | 84% | 1454 | 89% | - | 246 | 100%
Ethnic group | White, other | 12% | 62 | 4% | - | - | *%
Ethnic group | Asian, Asian Scottish/British | 3% | 52 | 3% | - | - | *%
Ethnic group | Black/Black Scottish/Black other | 1% | 30 | 2% | - | - | *%
Ethnic group | Mixed or multiple groups | *% | 30 | 2% | - | - | *%
Ethnic group | Other | *% | 14 | 1% | - | 1 | *%
Income | Less than £1800 | - | 443 | 31% | - | 105 | 77%
Income | Between £1800 and £3000 | - | 544 | 38% | - | 24 | 18%
Income | Between £3000 and £5200 | - | 327 | 23% | - | 7 | 5%
Income | More than £5200 | - | 105 | 7% | - | 0 | 0%
Health condition | Long term condition | 18% | 406 | 25% | - | 68 | 29%
Health condition | No long term condition | 82% | 1202 | 75% | - | 170 | 71%

Implicit Response Testing and Emotional Resonance Score

Reaction testing has been used in psychology research for over 40 years. Implicit Response Testing (IRT) is an online method that measures the speed with which individuals respond to a stimulus - in this study, research participants were asked to agree or disagree with particular statements.

Drawing on neuroscience and cognitive psychology[143], specifically research on System 1 (implicit/subconscious, fast) and System 2 (explicit/conscious, slow) decision-making routes[144], IRT is a neuromarketing tool increasingly used in market research to gain insight into people's gut instincts or subconscious responses. Faster responses imply greater emotional certainty, while delays imply a lack of understanding of, or disbelief in, the answers given.

IRT is used to minimise potential confounds common in quantitative survey research, such as people giving an answer when they do not know what they think, or saying what they think is the 'right' answer, which can lead to inaccuracies in predicting behaviour.

Several steps were taken to limit other factors affecting response time, notably statement length and ease of understanding. Deltapoll equalised the length of statements offered to respondents as far as possible, to ensure that reading times did not themselves introduce unintended skews in response. Statements were also written to be as concise as possible while delivering unambiguous meaning, and all statements were required to fit within the i-code 80-character limit. Deltapoll also took respondents to a different platform to complete the IRT questions, which may help to focus respondents' minds. As a general rule, outliers were removed from the data.

Deltapoll measured delays in response in microseconds compared to benchmarks - the demographics questions asked at the start of the survey. The Implicit Response Testing was conducted on the i-code software platform created by Neuhome. Deltapoll then used an algorithm that combines speed of response with incidence of viewpoint (the proportion of respondents who agree or disagree with a statement) to produce an Emotional Resonance Score (ERS) benchmarked out of 100.
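The exact algorithm belongs to Deltapoll and the Neuhome i-code platform and is not described in this report. The sketch below is purely illustrative of the general idea of combining incidence of viewpoint with a benchmarked speed of response score to give a score out of 100.

```python
def emotional_resonance_score(pct_agree: float, speed_score: float) -> float:
    """Illustrative only: scale the share of respondents agreeing with a
    statement (0-1) by a benchmarked speed of response score (0-9, see Table E)
    to give a score out of 100. This is NOT Deltapoll's actual ERS algorithm.
    """
    return round(100 * pct_agree * (speed_score / 9), 1)

# e.g. 80% agreement given quickly (speed score 7) yields a fairly strong score
print(emotional_resonance_score(0.80, 7))  # 62.2
```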

Table E. Speed of Response scores (scale: speed score 0-9)
Speed score | Level
1-3 | Weak / slow
4-6 | Medium speed of response
7-9 | Strong / fast

Table F. Emotional Resonance Scores (scale: ERS out of 100)
ERS | Potential for persuasion
0-19 | Likely to completely fail to connect
20-29, 30-39, 40-49, 50-59 | Medium emotional connection
60-69, 70-79, 80+ | Strong emotional connection

A maximum ERS implies that 'everyone' believes or agrees with the statement and does so with complete emotional certainty; a score of zero implies that nobody believes the statement and that no emotional belief in it exists. Medium scores indicate doubt or uncertainty about the truth of the statement, and that the belief is not fully internalised. According to Self-Determination Theory, the more internalised the motivation, the more autonomous a person will feel enacting the behaviour.

The higher the ERS, the greater the chance that the public will be receptive to being persuaded by that idea or message.

Limitations

The online and telephone questionnaires followed similar scripts, with technical adaptations and some scripting changes for the telephone version. In addition, some questions were unsuitable for a telephone script and were not included. For example, the online questionnaire listed all 81 recommendations; this question was omitted from the telephone survey.

The telephone sample of non-internet users reflected an older demographic, so direct comparisons between the online and telephone samples should not be made. However, the two surveys fuse well into a fully representative Scottish population-level sample.

Online methods depend on panellists signing up to receive surveys, and such people might differ from those who have not signed up in the way they approach subjects, the way they behave or the answers they give.

While all efforts were made to design and draw a representative sample, geo-demographic profiles have been based on available Census 2011 and other official statistics. Profiling data might not exactly match that of people living in Scotland today.

Thematic analysis of qualitative data

The following process was followed:

  • several close readings of the text.
  • coding framework developed from initial analysis, with focus on themes relevant to research questions and operational definitions included.
  • data imported into NVivo and coded following the coding framework, with codes revised or new codes created as necessary.
  • coding framework updated to match the NVivo coding; this framework was used with all data sources to ensure consistency.
  • coding checked in a minimum of two quality assurance review processes by two separate researchers.
  • write-up of findings by theme.
  • check of write-up against NVivo codes to ensure no important omissions.

Secondary data

  • BEIS/Defra Climate Change and Net Zero: Public Awareness and Perceptions. Online population survey conducted September - October 2020. Base: 6,947.
  • British Public Perceptions of Climate Risk, Adaptation Options and Resilience (RESiL RISK). Online survey conducted October 2019. Base: 1,401 adults.
  • Climate Assembly UK population survey (Wave 3). Conducted online 14 September 2020. Base: 1,671 adults in the UK; Scotland sample: 149.
  • Climate Assembly UK population survey (Wave 4). Conducted online 14 September 2021. Base: Scotland sample: 165.
  • Citizens Assembly of Scotland population survey. Conducted online 11 - 22 March 2021. Base: 1,539 adults in Scotland.
  • Ipsos MORI Scotland 2020. Research into public attitudes to climate change policy and a green recovery. Telephone survey conducted October - December 2020. Base: 1,045 adults in Scotland.
  • One Pulse poll, commissioned by The Scotsman. Online survey via the One Pulse app, conducted across three waves in October 2020. Base: c.300 respondents per wave.
  • Scottish Household Survey 2019. Face-to-face survey. Base: 10,577 householders in Scotland.
  • Scottish Household Survey 2020. Telephone survey; pilot fieldwork in October 2020, main stage fieldwork January - April 2021. Base: 3,000 households in Scotland.
  • YouGov 2020. Mental health impact of climate change. Online survey conducted 19-28 February 2020, commissioned by the British Association for Counselling and Psychotherapy (BACP). Base: 5,527 adults in the UK (aged 16+).
  • YouGov 2021. Environment Tracker. Survey conducted 27-29 August 2021. Base: 1,667 adults in GB.

Contact

Email: socialresearch@gov.scot
