Climate change - public engagement: survey results 2022

Results of a representative survey of the Scottish public, focused on attitudes and engagement with climate change. The results will act as a baseline for the Public Engagement Strategy for Climate Change.


Appendix 2: Extended methodology

Mode of data collection

Data was collected using an online panel survey. An online panel is defined as an online group of recruited people willing to conduct social and market research surveys in return for a small financial incentive for each survey completed. The survey took approximately 10-15 minutes to complete.

A 'panel blend' approach was taken to fieldwork. This involves blending the sample across a number of research panels to increase the potential overall sample size, help ensure good coverage across key demographic groups, and to reduce the risk of panel biases that can sometimes occur by relying solely on a single panel provider. The panels used for this project were:

  • Panelbase
  • Made in Surveys
  • Lucid
  • YouthSight – youth specialist, used to reach those aged 16 and 17

The survey was created using Confirmit, a device-responsive survey platform used widely throughout the industry. It automatically detects the device being used and optimises the look and feel for that device using predetermined layouts, including optimisation for phone and tablet completions. The software also allowed participants to pause completion at any time and finish the survey at a time more convenient for them.

Cognitive testing

Cognitive interviewing is a widely used pre-testing tool, in which respondents are asked to report directly on the internal cognitive processes employed to answer survey questions. Interviewers probe the meaning of specific terms or the intent of specific questions throughout the interview. A small number of purposively chosen respondents are interviewed and the results are not generalisable to a larger population.

Key questions were tested in 8 verbal cognitive interviews. Testing focussed on areas of the questionnaire identified in the initial desk appraisal phase as warranting further exploration, to help ensure framing, wording and structure were correct.

Interviews were conducted via telephone or video call in mid-March, and participants were emailed a copy of the questionnaire prior to the interview. Interviews covered a broad demographic and regional mix of participants and followed a verbal probe approach using a semi-concurrent probing technique. Many probes were tailored to be question-specific, but typical probes included:

  • How did you find answering this question?
  • Can you tell me in your own words what the question was asking?
  • How easy or difficult did you find this question to answer?
  • What did [insert question or response term] mean to you?

The changes recommended were mostly nuances to question wording to enable greater audience comprehension.

Sampling and fieldwork

The final questionnaire was scripted and then tested before being signed off. Fieldwork took place between 21st March and 3rd April 2022. If waves are conducted in future years, it is advisable that the survey is fielded during the same period of the year, to ensure that seasonal effects are not behind shifts in attitudes and behaviours.

Representative quotas were set on age by gender, region and ethnicity. An additional 20% of the original representative value was added to each quota cell to ensure a degree of flexibility. A total of 1,782 respondents completed the survey. The final sample included:

  • 1,502 respondents aged 18+
  • 280 respondents aged 14-17
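The quota approach described above can be illustrated with a short calculation. This is a minimal sketch: only the 1,782 total and the 20% buffer come from the text, and the population share used in the example is hypothetical.

```python
# Illustrative quota calculation: each representative quota cell is
# inflated by 20% to give fieldwork flexibility.
TARGET_SAMPLE = 1782  # total completes reported above

def quota_with_buffer(population_share, total=TARGET_SAMPLE, buffer=0.20):
    """Return a cell's quota ceiling: representative value plus the buffer."""
    representative = population_share * total
    return round(representative * (1 + buffer))

# A hypothetical quota cell covering 4.8% of the population:
ceiling = quota_with_buffer(0.048)
```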

The early survey completes were extracted and reviewed to 'sense-check' the data. These checks included ensuring that the number of valid responses was being correctly recorded, and checking that the survey logic and routing were working as intended.

There were two main versions of the survey, one for adults aged 16 and above and one for young people aged 14 and 15. Young people below the age of 16 were recruited via their parents/guardians, with consent taken before the survey was completed. Once the initial adult component was completed, and the adult answered some basic demographic questions about their child, the survey was handed over to the young person to complete the main body of the questionnaire.

Given the youth component was targeting a relatively small part of the Scottish population, we employed two variants of the young person survey to maximise the available sample as far as possible. One version saw the parent/guardian handover to their 14- or 15-year-old near the beginning of the survey after completing some initial demographic information (without completing the substantive questions themselves).

In the other version, the handover came after the parent/guardian had completed the survey themselves (where their earlier responses identified that they were responsible for a child aged 14 to 15 in the household). As was the case for the entire sample, IP-based quality checks ensured that no respondents could take part in both variants and that all respondents were unique.
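A minimal sketch of an IP-based duplicate check is shown below. The record structure and field names are illustrative assumptions; the report does not describe the actual quality-control pipeline in detail.

```python
# Keep only the first response seen from each IP address, so no
# respondent can complete both survey variants.
def deduplicate(responses):
    """Return responses filtered to one record per IP address."""
    seen, unique = set(), []
    for record in responses:
        if record["ip"] not in seen:
            seen.add(record["ip"])
            unique.append(record)
    return unique

responses = [
    {"id": 1, "ip": "203.0.113.5", "variant": "A"},
    {"id": 2, "ip": "203.0.113.9", "variant": "B"},
    {"id": 3, "ip": "203.0.113.5", "variant": "B"},  # same IP, second variant
]
unique_responses = deduplicate(responses)
```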

Data processing and analysis

The young person questionnaire was tailored so that questions that were not relevant to young people were not asked of this part of the sample. However, changes between the two versions were minimal, allowing the data to be merged and treated as one wider dataset of respondents aged 14+ for the vast majority of survey questions.

With the exception of the coding of responses to open-ended questions, no data entry phase was required for this survey. The programmed script ensured that all question routing was performed automatically, and no post-editing of the data was required in the way that might be necessary for surveys administered using a 'Pencil and Paper' method.

Responses from questions with an 'other – specify' option were analysed and, if appropriate, back-coded into one of the pre-coded categories. If the response could not be assigned to an existing code but gained a sufficient number of mentions, a new code was created to which all relevant responses were assigned. Coding was carried out by a specialist team.
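As a rough illustration of the back-coding step: the keyword map and the threshold for creating a new code below are hypothetical, not the coding team's actual scheme.

```python
# Sketch of back-coding 'other - specify' responses: match free text to
# pre-codes where possible, and flag frequently mentioned leftovers as
# candidate new codes.
PRE_CODES = {
    "bus": "Public transport",
    "train": "Public transport",
    "cycle": "Walking or cycling",
}
MIN_MENTIONS_FOR_NEW_CODE = 2  # hypothetical threshold

def back_code(other_responses):
    """Assign free-text answers to pre-codes, or flag candidate new codes."""
    coded, uncoded_counts = {}, {}
    for text in other_responses:
        match = next((code for keyword, code in PRE_CODES.items()
                      if keyword in text.lower()), None)
        if match:
            coded[text] = match
        else:
            uncoded_counts[text] = uncoded_counts.get(text, 0) + 1
    # Unmatched responses with enough mentions become a new code.
    new_codes = {t for t, n in uncoded_counts.items()
                 if n >= MIN_MENTIONS_FOR_NEW_CODE}
    return coded, new_codes

coded, new_codes = back_code(["I take the bus", "Car share", "Car share"])
```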

Data tables were produced using a comprehensive specification that included down-break and cross-break definitions, as well as details of weights. The resulting tables were then used to analyse the data and report the findings. Each table included notation for significance testing – throughout the report the term "significant" is only used to describe differences between particular groups that are statistically significant at the 95% confidence level. This means that there is only a 5% probability that the difference has occurred by chance (a commonly accepted level of probability), rather than being a 'real' difference.
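A two-proportion z-test is one standard way to test a subgroup difference at the 95% confidence level. This sketch is illustrative, with hypothetical subgroup figures, and is not necessarily the procedure built into the tabulation software.

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """z statistic for the difference between two proportions (pooled SE)."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical subgroup results: 62% of 900 respondents vs 55% of 600.
z = two_proportion_z(0.62, 900, 0.55, 600)
significant = abs(z) > 1.96  # |z| > 1.96 corresponds to 95% confidence
```

With these illustrative figures the difference would be reported as significant; at smaller base sizes the same percentage gap can fall below the threshold, which is why some apparent differences are not highlighted.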

The report focuses on where statistically significant differences have been identified. Where differences are discussed during the commentary of the report, these differences can all be presumed to be statistically significant unless otherwise noted. Where results appear as though they should be statistically significant but have not been highlighted, this is due to a lower base size.

It is important to note that the online panel interviews rely on quota sampling. Despite being standard practice in social and market research, there are some theoretical caveats to bear in mind with using formal statistical significance tests on quota sample data including bias and lack of known sampling probability. Therefore, it is advised that any results of statistical significance tests are used as a guide and should always be interpreted with a degree of caution.

Weighting

The survey data used for this report is weighted to ensure the data is representative of the Scottish population aged 14+.

Results for respondents aged 14-17 were weighted by age, gender and region.[8] Results for those aged 18+ were also weighted by these variables, with the addition of targets for ethnicity and educational attainment.[9] Targets are provided in the tables below.

To ensure an adequate sample size for sub-group analysis, respondents aged 14-17 years old were purposefully oversampled. However, age weighting ensures that the total sample is not skewed as the proportion of those aged 14-17 is adjusted to be representative.[10]
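The report does not name the weighting algorithm; raking (iterative proportional fitting) is a common choice for matching multiple marginal targets, and is sketched below. The sample and targets are illustrative, with the 14-17 group deliberately oversampled at 40% of the sample against a 10% target.

```python
# Sketch of raking: repeatedly scale weights so the weighted sample
# matches each marginal target in turn until the factors converge.
def rake(rows, targets, iterations=20):
    """rows: dicts with a 'weight' key; targets: {variable: {value: share}}."""
    total = len(rows)
    for _ in range(iterations):
        for variable, shares in targets.items():
            for value, share in shares.items():
                current = sum(r["weight"] for r in rows if r[variable] == value)
                if current > 0:
                    factor = share * total / current
                    for r in rows:
                        if r[variable] == value:
                            r["weight"] *= factor
    return rows

# Illustrative sample: four 14-17s and six 18+, equal starting weights.
rows = ([{"age": "14-17", "gender": g, "weight": 1.0} for g in "FMFM"]
        + [{"age": "18+", "gender": g, "weight": 1.0} for g in "FMFMFM"])
targets = {"age": {"14-17": 0.1, "18+": 0.9},
           "gender": {"F": 0.5, "M": 0.5}}
rake(rows, targets)
share_14_17 = (sum(r["weight"] for r in rows if r["age"] == "14-17")
               / sum(r["weight"] for r in rows))
```

After raking, the oversampled 14-17 group is down-weighted so its weighted share matches the 10% target, mirroring the adjustment described above.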

Unweighted and weighted response counts by region, age and gender and some other metrics are presented in the sample profile in the Appendix.

Age by gender

Age group    Female    Male     Total
14-17         2.3%     2.5%     4.8%
18-24         4.8%     5.0%     9.8%
25-34         8.1%     8.0%    16.1%
35-44         7.4%     7.2%    14.6%
45-54         8.2%     7.7%    15.9%
55-64         8.3%     7.8%    16.1%
65+          12.4%    10.2%    22.6%
Total        51.7%    48.3%     100%

Region

Region                       %
Central Scotland         12.1%
Glasgow                  12.7%
Highlands and Islands     8.8%
Mid Scotland and Fife    12.1%
North East Scotland      14.2%
Lothian                  14.2%
South Scotland           12.6%
West Scotland            13.3%

Ethnicity

Ethnicity                           %
White British/Scottish/Irish    95.2%
Ethnic minority                  4.8%

Highest qualification

Highest qualification                   %
No / other qualifications           20.0%
Non-degree level qualifications     47.0%
Degree or above qualifications      33.0%

Contact

Email: ClimateChangeEngagement@gov.scot
