Clyde Seasonal Closure 2026: consultation analysis and SG response
Analysis of the consultation on the Clyde Seasonal Closure 2026 to 2028 and the Scottish Government response to the consultation.
Consultation
3. Methodology
This analysis followed Scottish Government Consultation Good Practice Guidance[3] and qualitative research best practice informed by colleagues in the Marine Analytical Unit (MAU).
Responses were received through the consultation portal (Citizen Space) and via email. Both quantitative and qualitative methods were used to ensure a robust and balanced understanding of views.
- Quantitative analysis – All multiple-choice questions (Q1–Q6) were analysed to identify overall response distributions and levels of agreement or disagreement.
- Qualitative analysis – Free-text responses to all eight questions were coded systematically using a thematic framework (1A–6C) developed iteratively through reading responses and aligning with consultation objectives. Qualitative analysis therefore enriches and contextualises the quantitative findings.
- Coding structure – Each response could be assigned to multiple themes but was counted only once per theme to avoid duplication.
- Key responses – Key or “highlighted” responses were identified for their policy relevance, stakeholder input or evidentiary strength and were reviewed separately.
- Mixed-method approach – Descriptive statistics (how respondents answered each question) were combined with qualitative insight into respondents’ reasoning, values and priorities via thematic coding (i.e. what respondents thought and why).
3.1 Internal Consistency Framework
The internal consistency framework – the method for describing response patterns – was applied throughout the analysis to describe relative magnitudes of opinion and to ensure descriptions were accurate, transparent and proportionate. This structure aligns qualitative descriptors (e.g. majority, plurality, several) with their corresponding quantitative ranges, ensuring that narrative phrasing remains empirically grounded and consistent across questions (see Table 1) and strengthening analytical rigour.
By combining absolute response counts (n = 44) with equivalent percentage ranges, the framework allows for consistency when describing patterns in small sample consultations. Terms such as majority or minority are only used when empirically justified, reducing the risk of overstating consensus and enhancing comparability across findings.
For instance, majority denotes ≥ 55% of total responses, plurality identifies the largest single category below 50%, and several refers to 30 - 44%. This proportional method reflects good practice, allowing nuanced interpretation of divergent or conditional views while maintaining transparency in quantitative reasoning.
Percentages were rounded to the nearest whole number; therefore, cumulative totals may not sum to exactly 100%. Such rounding conventions are standard practice in social research and government consultation analysis, as minor discrepancies are not analytically significant.
| Descriptor | Range of Responses | Equivalent % Range | Interpretive Guidance |
|---|---|---|---|
| Majority/Most | 24 - 44 | ≥ 55% of total responses | More than half share this view; dominant but not overwhelming agreement. |
| Plurality | Typically 17 - 23 | Largest single category below 50% | The most common view where no single opinion holds a majority; indicates the leading view among divided opinions. |
| Around half/evenly split | 20 - 24 | 45 - 55% combined in two similar categories (e.g., Agree + Strongly Agree) | Indicates broad balance or mid-range support; may suggest conditional or mixed sentiment rather than strong consensus. |
| Several | 13 - 19 | 30 - 44% | Substantial minority; noticeable proportion sharing this view but not dominant. |
| Some | 9 - 12 | 20 - 29% | Moderate minority; reflects a clear but limited level of support. |
| A few/small minority | 4 - 8 | 10 - 19% | Small group expressing this view; marginal but notable presence. |
| One or two/isolated | < 4 | < 10% | Very few respondents hold this view; outlier or exceptional opinion. |
Note: This framework maintains alignment between narrative terms such as “majority opposed” or “plurality neutral” and the quantitative evidence, preventing overstatement of consensus in small, diverse samples.
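The mapping in Table 1 can be sketched as a simple lookup over rounded percentages. This is an illustrative sketch only, not the analysis code used for the consultation; the function name `descriptor` is hypothetical, and plurality is omitted because it is defined relative to the other categories (the largest single category below 50%) rather than by a fixed percentage band:

```python
def descriptor(pct: float) -> str:
    """Map a rounded percentage of responses to the narrative
    descriptor defined in Table 1 (illustrative sketch only).

    Plurality is excluded: it depends on comparing categories,
    not on a single percentage threshold.
    """
    if pct >= 55:
        return "majority/most"
    if pct >= 45:
        return "around half/evenly split"
    if pct >= 30:
        return "several"
    if pct >= 20:
        return "some"
    if pct >= 10:
        return "a few/small minority"
    return "one or two/isolated"
```

For example, a view held by 37% of respondents would be described as held by "several" respondents, while one held by 8% would be attributed to "a few".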
3.2 Quantitative Analysis
All multiple-choice questions (Q1–Q6) were analysed to identify distributions of agreement, disagreement, and neutrality. Percentages were calculated based on the number of respondents who answered each specific question, rather than the full consultation total (n = 44). For example, if 16 respondents selected “Agree” and 43 respondents answered that question, the percentage recorded was 37% (16 ÷ 43 × 100). Percentages were rounded to the nearest whole number, meaning cumulative totals may not always sum to exactly 100%.
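The calculation above can be expressed as a one-line helper. This is a sketch for illustration only (the function name `pct_of_answered` is hypothetical); note that the denominator is the number of respondents who answered the question, not the full sample of 44:

```python
def pct_of_answered(count: int, answered: int) -> int:
    """Percentage of those who answered the question,
    rounded to the nearest whole number."""
    return round(count / answered * 100)

# Worked example from the text: 16 "Agree" out of 43 answers.
pct_of_answered(16, 43)  # -> 37
```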
The internal consistency framework (Section 3.1) was applied throughout to align narrative descriptors with quantitative ranges. Terms such as “majority,” “plurality,” “several,” or “a few” were only used when empirically justified, ensuring that narrative phrasing remained transparent and proportionate. This approach prevents overstatement of consensus in small, diverse samples and strengthens comparability across findings.
This quantitative methodology provided the baseline proportions for each question, which were then interpreted alongside qualitative insights to ensure a balanced understanding of stakeholder perspectives.
3.3 Qualitative Analysis
Free-text responses to all eight consultation questions (Q1–Q8) were analysed using a systematic coding framework developed iteratively during the review process. The framework was aligned with consultation objectives and refined through repeated reading of responses to ensure consistency and depth.
Each response could be assigned to multiple themes where relevant but was counted only once per theme to avoid duplication. Coding was conducted at both the descriptive level (capturing the explicit content of responses) and the interpretive level (identifying underlying reasoning, conditions, or values). Key or “highlighted” responses were flagged separately for their policy relevance, evidentiary strength, or stakeholder significance.
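The once-per-theme counting rule described above can be sketched as follows. This is an illustrative sketch, not the actual coding tool; the function name `theme_counts` and the input format (a list of subtheme codes per response, e.g. "1A", "6C") are assumptions, with the theme taken from the leading digit of each code:

```python
from collections import Counter

def theme_counts(coded_responses: list[list[str]]) -> Counter:
    """Count each response at most once per overarching theme,
    even if it was assigned several subthemes within that theme.

    coded_responses: one list of subtheme codes (e.g. "1A", "2B")
    per response; the theme is the code's leading digit.
    """
    counts = Counter()
    for codes in coded_responses:
        themes = {code[0] for code in codes}  # deduplicate within a response
        counts.update(themes)
    return counts

# A response coded 1A and 1F counts once under theme 1, not twice.
theme_counts([["1A", "1F", "2A"], ["1B"]])  # -> Counter({'1': 2, '2': 1})
```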
Thematic analysis followed qualitative research best practice, ensuring transparency and replicability:
- Framework development: Themes were generated inductively from the data and aligned with consultation aims.
- Coding process: Responses were coded consistently, with checks for accuracy and proportionality.
- Theme assignment: Responses were grouped into overarching categories, allowing identification of recurring issues and cross‑cutting concerns.
- Integration with quantitative analysis: Qualitative findings were used to contextualise and enrich the numerical distributions, explaining why respondents selected particular options and highlighting conditions attached to their choices.
To structure interpretation, responses were grouped into six overarching themes, each comprising multiple subthemes (Table 2). The full coding framework — including subtheme descriptions and cue examples — is provided in Appendix 1. This framework supported consistent coding across responses and enabled nuanced synthesis of stakeholder views.
| Theme | Code Range | Description |
|---|---|---|
| 1. Management measures | 1A–1F | Closure timing and duration, enforcement, gear restrictions, area boundaries, exemptions, and bycatch management. |
| 2. Socioeconomic impacts | 2A–2B | Effects on livelihoods, broader economic impacts, and calls for mitigation. |
| 3. Evidence and science | 3A–3C | The role of scientific evidence, monitoring needs, and questions around the current evidence base. |
| 4. Process and engagement | 4A–4C | Consultation quality, stakeholder involvement, and communication. |
| 5. Environmental outcomes | 5A–5B | Conservation benefits and ecosystem health. |
| 6. Balancing environment & socioeconomics | 6A–6C | Emphasis on adaptive, proportionate, and collaborative approaches. |
This qualitative methodology provided the interpretive depth necessary to understand stakeholder perspectives beyond numerical counts. The detailed findings under each theme, with illustrative examples, are presented in the Results section, where they are integrated with quantitative analysis to provide a balanced account of consultation outcomes.
Contact
Email: inshore@gov.scot