Fish farm consenting pre-application pilots: independent evaluation report
2. Approach and methods
2.1 Purpose and scope
The approach prioritised a systematic, multi-method process, combining qualitative and quantitative data collection techniques to give all relevant stakeholders a comprehensive opportunity to feed into the evaluation of the pilot sites.
To ensure that the engagement covered all individuals and groups who were part of the pilots or had experience of the pilot pre-application process, members of the CTG were included in this data collection. The key stakeholder groups targeted from the pilot process included:
- Participants within the pilots (including developers and regulators);
- Other sector regulators and advisory bodies; and
- Wider groups with community interests.
These groups were selected for their relevant perspectives and expertise, to help inform the evaluation of the pilot process and to contribute to the successful rollout of the piloted process to the wider sector. A list of all stakeholders, including which groups provided input at the questionnaire and interview stages, can be found in Appendix E Stakeholder List.
By combining insights from engagement with the target groups through stakeholder interviews and surveys, the project team aimed to ensure the reliability, validity, and richness of the evaluation findings. The methodology comprised a number of activities, laid out in the following sections, together with an approach designed to ensure that dependencies and feasibility issues were tackled head on. Examples of these considerations included:
- Successful stakeholder engagement depended on cooperation and participation from key stakeholders, necessitating proactive communication and relationship-building efforts;
- Access to relevant data and documents was contingent upon cooperation from regulatory bodies and industry partners, necessitating clear communication;
- The proposed timeline and resource allocation were designed to accommodate potential dependencies and feasibility issues, with built-in flexibility to adapt to evolving circumstances and stakeholder needs. Regular progress monitoring and communication helped to mitigate risks and ensure timely delivery of evaluation outputs; and
- The data collection tasks were planned to take place partly prior to the conclusion of the pilots. Individual timelines were detailed upon early engagement with stakeholders, and contingency planning took place to ensure that sufficient time was allowed for data collection.
2.2 Stakeholders
The list below shows the stakeholders who were contacted as part of this evaluation. Details are shown in Appendix E.
Participants within the trial
1. Bakkafrost
2. Mowi Scotland Ltd
3. Scottish Sea Farms
Other regulators or advisory bodies involved in the pre-application process
4. Crown Estate Scotland
5. Highland Council
6. Marine Directorate Licensing Team
7. Maritime and Coastguard Agency
8. NatureScot
9. Northern Lighthouse Board
10. Scottish Aquaculture Council
11. SEPA
12. Shetland Islands Council
Other stakeholders
13. British Trout Association
14. Cooke Aquaculture
15. Delting (Community)
16. Fisheries Management Scotland
17. Nesting and Lunnasting (Community)
18. Professor Russel Griggs
19. Yell (Community)
2.2.1 Engagement with stakeholders
The approach and method, which aimed to meet the objectives listed in Section 1.3, are detailed here. As far as possible, the stakeholder engagement process was undertaken in a manner aligned with the Stakeholder Engagement Standard (AccountAbility, 2015).
2.2.2 Stakeholder engagement plan
A Stakeholder Engagement Plan was drafted to set out in detail the steps that were to be taken to engage with stakeholders in order to achieve the aims and objectives of the project as outlined in the section above. It included the detailed engagement strategy outlining objectives, methods, and activities for engagement. It also included the specific communication channels, describing the protocols for information dissemination and feedback collection (including policies on data protection, GDPR and ethical elements), and determining the timing and frequency of engagement activities. This document served as a roadmap for fostering meaningful dialogue, building trust, and soliciting input from stakeholders throughout the evaluation process.
2.3 Quantitative data gathering
The aim of this task was to gather quantitative data from relevant stakeholders on the pilot pre-application process. The project team designed and distributed structured surveys to stakeholders, including industry, regulators, statutory advisory bodies, and other statutory and non-statutory consultees. The surveys were designed to collect quantitative data on stakeholders' satisfaction levels, their perceptions of the effectiveness of the pilot schemes, and specific areas of concern or improvement. Ultimately, this was designed to gather perspectives in line with the aimed outcomes of the CTG (listed in Section 1.1.1).
2.3.1 Survey design
Survey questions were carefully crafted to capture nuanced insights while allowing for analysis and comparison across different stakeholder groups. The questions were discussed and agreed with the Marine Directorate Aquaculture Consenting Policy team. The questionnaire was structured so that results could be divided into the pilot pre-application process stages (Stage 1, Stage 2, Stage 3, Stage 4, and general feedback across all stages) for further evaluation. When structuring the questions, several important considerations were taken into account.
- Clear and Specific Questions: Each survey question was designed to be clear, specific, and directly related to the pre-application process being piloted. Ambiguous or overly broad questions were avoided to prevent confusion or misinterpretation. Specific questions for each of the four stages of the pre-application process were included; the four stages of the pilot consenting process were:
- Stage 1. Request for pre-application advice;
- Stage 2. Provision of joint pre-application advice;
- Stage 3. Community and third-party engagement; and
- Stage 4. Screening/Scoping opinion request and issue of a joint scoping opinion report and advice.
- Multiple Choice and Ranking Questions: Multiple-choice questions and ranking tasks were incorporated to gather specific feedback on different elements of the pre-application process. For example, respondents were asked to rank the importance of various pre-application guidance materials or to select the most effective stakeholder engagement methods from a list of options.
- Skip Logic and Branching: Skip logic allows the questionnaire to skip over certain questions that are not applicable to a respondent, based on their previous response (reducing survey fatigue and avoiding irrelevant questions). Branching routes respondents down different paths in the survey depending on their answers (allowing more targeted, detailed responses). These tools were implemented within the survey.
- Demographic and Background Information: Demographic and background information were collected from respondents, such as their role in the aquaculture industry and their level of experience with the aquaculture consenting process. This demographic data helped identify potential patterns or differences in respondents' perceptions and experiences.
- Pilot Testing: Prior to full-scale deployment, pilot testing of the survey instrument was conducted with a small group of stakeholders. This allowed potential issues with question wording, response options, or survey flow to be identified. Feedback from pilot testing was incorporated to refine and improve the survey design for optimal data collection.
- Likert Scale Responses: A Likert scale is a common rating scale used in questionnaires to measure respondents’ attitudes, opinions, or perceptions. A statement was presented, and respondents indicated their level of agreement or disagreement on a symmetric scale. The Likert scale was structured with clearly defined response options, such as "Strongly Disagree," "Disagree," "Neutral," "Agree," and "Strongly Agree," enabling quantifiable data analysis.
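The skip logic and branching described above can be sketched as a simple routing function. The question identifiers and routing rules below are purely hypothetical illustrations, not the actual survey content; in the evaluation itself, this behaviour was configured within Microsoft Forms rather than coded by hand.

```python
# Hypothetical sketch of survey skip logic / branching.
# Question IDs and routing rules are illustrative only; the real survey
# implemented equivalent behaviour natively within Microsoft Forms.

def next_question(current_id, answer):
    """Return the next question ID given the current answer, or None to end."""
    routes = {
        # Skip logic: respondents not involved in Stage 1 skip its detail questions.
        ("q_stage1_participated", "No"): "q_stage2_participated",
        ("q_stage1_participated", "Yes"): "q_stage1_satisfaction",
        # Branching: developers and regulators follow different paths.
        ("q_respondent_category", "Developer"): "q_developer_costs",
        ("q_respondent_category", "Regulator"): "q_regulator_workload",
    }
    return routes.get((current_id, answer))

# A respondent who answered "No" to Stage 1 participation is routed
# straight to the Stage 2 section, avoiding irrelevant questions.
print(next_question("q_stage1_participated", "No"))
```

The dictionary keyed on (question, answer) pairs keeps the routing rules declarative, mirroring how form tools let designers attach a "go to" rule to each response option.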
2.3.2 Survey distribution
Surveys were distributed electronically to all targeted stakeholders (Appendix E). The surveys were hosted on a secure online platform (Microsoft Forms) accessible via web browsers. An information package was provided to participants, including a cover letter from the Marine Directorate. Respondents had the option to retain anonymity when providing their feedback (Name and contact information questions were optional), although it was made clear that their category would be collected (regulator, developer, etc.). However, due to the small number of stakeholders, full anonymity could not be guaranteed. In line with the UK General Data Protection Regulation (UK GDPR), data collection and storage procedures were designed to uphold the confidentiality and integrity of participant data. All data were stored on secure, access-restricted systems, and no personal data were shared or published. All responses were handled with care and used solely for the purpose of this evaluation. The advantages of this approach included:
- Appealed to those who are comfortable working with digital tools and who preferred electronic communication and online surveys;
- Streamlined data collection and management processes, reducing administrative overheads; and
- Facilitated automated reminders and follow-ups, optimising response rates and completion rates.
2.4 Qualitative data gathering
The aim of this task was to gather qualitative data from relevant stakeholders on the pilot pre-application process: to gather, measure, and evaluate the views of relevant stakeholders on the implementation of the pilot pre-application process compared with the current consenting process, and to draw out key themes and thoughts on the proposed new process (through thematic analysis). Semi-structured interviews were designed and conducted with relevant stakeholders. These interviews were designed to gather in-depth qualitative insights into stakeholders' perceptions, experiences, and suggestions regarding the pilots and the pre-application consultation process.
The interviews also provided an opportunity for those to respond who were less likely to engage in the digital process as well as an opportunity for clarification and validation of quantitative findings obtained from those who did respond to the survey. Stakeholders were given opportunity to confirm or challenge survey results, providing alternative viewpoints, additional context, or counterexamples that enriched the interpretation of quantitative data and enhanced its credibility and reliability. By engaging stakeholders in dialogue and active listening, researchers uncovered underlying beliefs, values, and concerns that shaped stakeholders' perceptions and behaviours, contributing to a more holistic understanding of the feedback at hand.
2.4.1 Semi-structured interview engagement
A semi-structured interview is a qualitative research method that combines a pre-determined set of open questions (questions that prompt discussion) with the opportunity for the interviewer to explore responses further. Stakeholders were contacted by email after the survey had been delivered to schedule the interview. The delivery strategy for semi-structured interviews prioritised convenience and accessibility for participants. Interviews were conducted through online video conferencing platforms (Microsoft Teams) to accommodate participants' availability and maximise efficiency.
2.4.2 Semi-structured interview design
Interviews were conducted using a predetermined set of open-ended questions to ensure consistency while allowing participants to express their views freely. Stakeholders were prompted to focus on key areas such as resource requirements, financial cost, timescales, and any perceived or identified challenges within the pre-application process. The interview structure is in Appendix C.
In structuring qualitative data collection via semi-structured interviews tailored to the pilot pre-application process, several important strategies were adopted to ensure the data gathered was useful and analysable. The principles for designing this structure are summarised below:
- Establish clear objectives and framework: Before conducting interviews, clear objectives for the data collection process were defined, aligning them with the goals of the pre-application process pilots. A structured framework was developed that outlined key topics and themes to be explored during the interviews, ensuring that the discussions remained focused and relevant.
- Open-ended questions for depth and insight: The interviews were structured around open-ended questions to encourage participants to provide detailed insights and perspectives on various aspects of the pre-application process. These questions allowed for rich, nuanced responses and facilitated a deeper understanding of stakeholders' experiences.
- Probing and follow-up questions: To further explore topics of interest and elicit comprehensive responses, probing and follow-up questions were incorporated into the interview process. These questions allowed interviewers to delve deeper into specific issues, clarify responses, and uncover underlying motivations or concerns.
- Flexibility in interview structure: While adhering to the interview framework, flexibility was maintained in the interview structure to allow for spontaneous discussion and exploration of topics that arose during the conversation. This flexibility enabled adaptation of the discussion based on participants' insights and priorities, ensuring that all relevant areas were covered comprehensively.
- Sensitivity and ethical considerations: Sensitive topics were approached with empathy and sensitivity, creating a supportive environment in which participants felt comfortable sharing their perspectives and experiences. Participants' privacy and confidentiality were respected, with informed consent obtained before proceeding with the interview and anonymity preserved in any subsequent analysis.
By structuring semi-structured interviews in this manner, rich qualitative data was gathered, providing valuable insights into stakeholders' experiences, perceptions, and attitudes regarding the pre-application process. These data were instrumental in informing the evaluation of the pilot process and identifying areas for improvement.
2.4.3 Semi-structured interview delivery
Microsoft Teams' AI transcription tool was used for these interviews to generate accurate, real-time transcriptions of the interviews. This tool automatically converts speech into text during meetings or calls, using advanced speech recognition and natural language processing to capture dialogue with timestamps and speaker attribution. The transcriptions were saved within Teams, allowing researchers to review, search, and analyse the interview data efficiently while ensuring an accurate record of stakeholder responses. This was done with participants' consent. A small team of interviewers was deployed, ensuring consistency in style. Transcripts were organised and supplemented with team notes to ensure comprehensive documentation of interview data.
2.5 Analysis of data
The results from the qualitative and quantitative data analysis are presented in Section 3.2 and discussed in Section 4.
2.5.1 Survey data analysis
Given the targeted nature of the survey, the relatively small number of participants allowed for a more detailed and nuanced analysis of individual responses. Rather than relying solely on broad statistical trends, the evaluation was able to focus on capturing insights from each respondent, ensuring that all perspectives were thoroughly considered.
Consequently, the evaluation focused on summarising the information gathered to provide a comprehensive overview of the findings. This approach ensured that all valuable insights were captured and presented effectively.
Where possible, the survey data was analysed in several ways to maximise learning. The project team examined Likert scale responses by calculating frequency distributions for each item, providing a clear picture of the distribution of opinions among respondents. This approach helped highlight patterns and trends in stakeholders' perceptions, even within a smaller sample size.
Additionally, central tendency measures such as the mean, median, and mode were calculated to assess the overall sentiment towards key aspects of the pre-application process.
- Mean is a measure of central tendency that represents the sum of all values in a dataset divided by the total number of values. It provides a general idea of where the centre of the data lies.
- Median is the middle value in a dataset when the numbers are arranged in ascending order. If there is an odd number of values, the median is the middle number. If there is an even number of values, the median is the average of the two middle numbers.
- Weighted median was calculated in this case because different values in the dataset occur with different frequencies. Unlike the standard median, which simply finds the middle value, the weighted median takes into account how often each value occurs. This is particularly useful for survey data, such as Likert scale responses, where some response categories carry more weight due to higher frequency.
- Mode is the most frequently selected response (since Likert scale responses are categorical, the mode identifies the most common answer).
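As a minimal sketch of these calculations, the measures above can be computed from Likert frequency counts by expanding the coded 1–5 responses according to their frequencies; the counts used here are illustrative only, not actual survey results:

```python
import statistics

# Hypothetical Likert frequency counts (illustrative only, not survey data):
# 1 = Strongly Disagree ... 5 = Strongly Agree
counts = {1: 2, 2: 3, 3: 5, 4: 8, 5: 4}

# Expand coded responses by their frequencies; the "weighted median"
# described above is then simply the median of this expanded dataset.
responses = [code for code, n in counts.items() for _ in range(n)]

mean = statistics.mean(responses)              # overall sentiment
weighted_median = statistics.median(responses) # frequency-aware middle value
mode = statistics.mode(responses)              # most frequently selected option

print(mean, weighted_median, mode)
```

Expanding the frequency table into individual responses keeps each measure consistent with its textbook definition while still honouring how often each response category was chosen.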
2.5.2 Semi-structured interview data analysis
To analyse the qualitative data, Braun and Clarke’s (2006) thematic analysis was used. Thematic analysis is a qualitative data analysis method that involves identifying patterns in qualitative data to derive themes and sub-themes. This analysis determined common themes, topics, ideas, and patterns of meaning that appeared repeatedly within the data. To establish these common themes, an inductive approach was used. This required no predetermination of themes and instead allowed patterns in the data to determine the themes through coding.
The analysis followed six key steps:
- Familiarisation with the data: Reading the transcripts of interviews and actively observing meanings and patterns that appeared.
- Creation of initial codes: A set of initial codes was created to represent the meanings and patterns seen in the data. Codes were applied as appropriate.
- Collating codes with supporting data: All excerpts of data with the same code were grouped.
- Grouping codes into themes: Coded data was sorted into potential themes. These themes were used to identify trends and patterns in the data, such as the frequency with which certain themes were raised or discussed by stakeholders.
- Reviewing and revising themes: At this stage, themes were reviewed by other members of the team to ensure quality assurance and verification of consistency. Revisions were made where appropriate by revisiting the initial data to ensure that coding, sorting, and theme application were accurate and representative. Sub-themes were applied as needed.
- Writing up and summarising themes: Writing the narrative was the final step in analysis. The themes were presented and explained in the context of their frequency, urgency, and prioritisation from stakeholders. Representative quotes from the data set were used to demonstrate the themes. This narrative included an interpretive analysis from the project team.
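The collating and grouping steps above can be sketched as a simple data-structure exercise. The codes, themes, and excerpts below are hypothetical illustrations, not material from the actual transcripts, and the code-to-theme mapping in practice reflects the analyst's judgement rather than any automated rule:

```python
from collections import defaultdict

# Hypothetical coded excerpts as (code, transcript excerpt) pairs --
# illustrative only, not actual interview data.
coded_excerpts = [
    ("timescales", "The scoping opinion took longer than expected."),
    ("coordination", "Having all regulators in one meeting saved time."),
    ("timescales", "Deadlines between stages were unclear."),
    ("engagement", "The community session was well advertised."),
]

# Collating codes with supporting data: group all excerpts sharing a code.
by_code = defaultdict(list)
for code, excerpt in coded_excerpts:
    by_code[code].append(excerpt)

# Grouping codes into themes (this mapping is the analyst's call).
themes = {
    "Process efficiency": ["timescales", "coordination"],
    "Community engagement": ["engagement"],
}

# Theme frequency = total coded references assigned to each theme.
theme_frequency = {
    theme: sum(len(by_code[c]) for c in codes)
    for theme, codes in themes.items()
}
print(theme_frequency)
```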
2.5.3 Presentation of results
In presenting the results of the thematic analysis, we have included the frequency of occurrence for each theme and sub-theme as a means of highlighting their relative prevalence across the dataset. These frequencies reflect the total number of times each theme or sub-theme was referenced within the interview transcripts, rather than the number of unique respondents who mentioned them. As such, a single participant may have referred to the same theme multiple times in different contexts or stages of the discussion, and each of these instances has been included in the frequency count. This approach enables us to capture the emphasis and nuance of participants’ views, offering a richer understanding of the perspectives shared.
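The counting convention described above (total references, not unique respondents) can be made concrete with a short sketch; the respondent identifiers and theme labels below are hypothetical, not drawn from the actual dataset:

```python
from collections import Counter

# Hypothetical theme references as (respondent_id, theme) pairs --
# illustrative only. A respondent may reference the same theme repeatedly.
references = [
    ("R1", "Timescales"), ("R1", "Timescales"),
    ("R2", "Timescales"), ("R2", "Coordination"),
    ("R3", "Coordination"),
]

# Convention used in this report: total references per theme,
# so R1's two mentions of "Timescales" both count.
total_references = Counter(theme for _, theme in references)

# The alternative convention (not used here): unique respondents per theme.
unique_respondents = {
    theme: len({r for r, t in references if t == theme})
    for theme in total_references
}

print(total_references["Timescales"], unique_respondents["Timescales"])
```

Counting every reference rather than every respondent is what lets the reported frequencies reflect emphasis: a theme raised repeatedly by one participant registers more strongly than one mentioned in passing.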
2.5.4 Collation of feedback received
This presentation of the results focuses on the four stages of the pre-application process and how effective the process was, including:
- Any perceived or identified barriers to delivery (including issues that arose and how they were mitigated);
- The overarching assessment of the pre-application process;
- Areas of improvement and recommendations on how to improve and who can help with these improvements;
- The impact of the pilots on downstream application activities; and
- Assessment of the approach to allow for adaptive learning that can feed into the next trial in the pilot process.
These aspects of the results were analysed in line with the intended outcomes of the CTG as listed below.
1. Delays in the consenting process are minimised by removing unnecessary downtime, duplication, and non-value-added steps, with improved co-ordination between regulators to facilitate communication and streamline the consenting process.
2. The consenting process provides developers with an early understanding of potential constraints, leading to a reduced time to achieve all consents and ensuring developers know and understand the information required to support a regulatory decision.
3. The consenting process includes simple, clear mechanisms for informing and facilitating third-party engagement, with improved transparency and community engagement ensured through an effective and meaningful opportunity for communities, consultees, and other interest groups to participate.
4. Identification of any remaining issues or areas for further exploration within a continuous improvement project.
Following the presentation of the results in the section below, Sections 4 and 5 present a discussion of those results and suggested next steps and recommendations for improvement of the pre-application process, including details on how these next steps and recommendations should be taken forward and by whom.
Contact
Email: AquacultureReview@gov.scot