Scottish Marine and Freshwater Science Vol 6 No 14: Developing an avian collision risk model to incorporate variability and uncertainty

The report describes the data required, and the methods used, to estimate collision risk. It is accompanied by a worked example and R code (available at http://dx.doi.org/10.7489/1657-1), which enables the collision risk calculations to be performed in R.


Appendix 1: Stakeholder Interviews

Purpose of interviews

To obtain views and opinions of a wide range of stakeholders involved in offshore wind, on collision risk models and modelling, particularly in relation to uncertainty and variability.

Interview questions

Telephone interviews were conducted, based around the following questions:

  1. How much experience do you have, relating to collision risk models/modelling?
  2. What collision risk models do you most regularly use or have experience of?
  3. What uncertainties exist in the collision risk models that you have used?
  4. What are the key uncertainties in input parameters?
  5. What parameters do you think have the greatest influence on the outputs of collision risk modelling?
  6. If you could, how would you improve collision risk models/modelling?
  7. Would the explicit reporting of variability and uncertainty in outputs from collision risk models benefit the consenting process and discussions with regulators?

Interviews were approximately 20-30 minutes each.

Interviewees

I contacted 30 people from a range of stakeholder groups, and of those I conducted 20 interviews with people from the following organisations:

BTO

CEH

DONG Energy

ECON

EDPR

Joint Nature Conservation Committee

MacArthur Green

Marine Scotland Science

Natural England

Natural Power

NIRAS

Pelagica

PMSS

Royal Society for the Protection of Birds

Scottish Natural Heritage

Sue King Consulting

Statkraft/Forewind

The Crown Estate

…and Bill Band

Results

Experience of interviewees

Question 1: How much experience do you have, relating to collision risk models/modelling?

The experience of interviewees ranged from 'intelligent client' to model creator. All interviewees had a good understanding of the general modelling process and the use of model output, though not all had conducted the modelling and run the models themselves. One person declined to be interviewed because they felt they did not have enough experience to contribute constructively.

Question 2: What collision risk models do you most regularly use or have experience of?

All 20 people interviewed used the Band model and its associated updates. Of these, most mentioned options 1 and 3 rather than 2 and 4. Additionally, five people used the Folkerts model, though less regularly, and one had an understanding of the Tucker model. These were the only models mentioned.

Uncertainties in collision risk modelling

Question 3: What uncertainties exist in the collision risk models that you have used?

This question was targeted at the broader uncertainties surrounding collision risk modelling. The following opinions were given more than once:

  • Data collection methods, including the number and timing of surveys and the fact that surveys only occur in good weather, leading to a density estimate which may not capture the variability in the environment.
  • The use of the Rochdale Envelope, and therefore wide ranges for turbine parameters.
  • How much precaution should be included?
  • Bird behaviour and avoidance.
  • Which option of the model (or in most cases, which option of Band) is acceptable?
  • Little empirical data, and no validation or comparison with post-construction data.
  • The appropriate use of the model and output: the collision estimate is treated as definitive and black and white when it is intended as a collision risk tool.
  • In the case of the Band model, which is the latest version of the model and which flight height data sets should be used?

Question 4: What are the key uncertainties in input parameters?

All of the input parameters were discussed by the interviewees as a whole, but those raised more than once were, in descending order (most frequently highlighted first):

  • Flight height data
  • Avoidance
  • Density
  • Nocturnal activity
  • Flight speed
  • Rotor speed

Question 5: What parameters do you think have the greatest influence on the outputs of collision risk modelling?

Most of the input parameters were discussed by the interviewees as a whole, but those raised more than once were, in descending order (most frequently highlighted first):

  • Avoidance rate
  • Flight height data
  • Rotor Speed
  • Density
  • Number of turbines
  • Which Band option used
  • Operation time

Changes or updates to model

Question 6: If you could, how would you improve collision risk models/modelling?

There were many different opinions on how to improve collision risk modelling, but generally these did not involve large changes to the mechanics of the model itself; rather, they concerned the input data or the presentation of data and outputs. Comments raised more than once were, in descending order (most frequently highlighted first):

  • Present a covering/summary sheet with input data values to ensure parameters are clearly set out and defined.
  • Stop presenting single numbers as black and white and also provide context.
  • Take data from existing sites to validate the model and also use post-construction monitoring.
  • Have a standard approach to derive turbine parameters and bird parameters including consistently defining breeding season periods.
  • More studies/data on bird behaviour around turbines and avoidance behaviour.
  • More and clearer guidance on the model, its use, and its intended use, especially on the tidal offset.
  • Collect flight height data objectively, not just human observation/estimation but using rangefinders.
  • Factor uncertainty into estimates.
  • Use R code rather than Excel to make the modelling process more reproducible.
  • Better interpretation of model outputs.
  • Provide a single location for the most up-to-date version of the model, with email updates.

These can then be split into comments that were more input data-related:

  • Present a covering/summary sheet with input data values to ensure parameters are clearly set out and defined.
  • Have a standard approach to derive turbine parameters and bird parameters including consistently defining breeding season periods.
  • More studies/data on bird behaviour around turbines and avoidance behaviour.
  • Collect flight height data objectively, not just human observation/estimation but using rangefinders.

Or those which were model or output data-related:

  • Stop presenting single numbers as black and white and also provide context.
  • Take data from existing sites to validate the model and also use post-construction monitoring.
  • More and clearer guidance on the model, its use, and its intended use, especially on the tidal offset.
  • Factor uncertainty into estimates.
  • Use R code rather than Excel to make the modelling process more reproducible.
  • Better interpretation of model outputs.
  • Provide a single location for the most up-to-date version of the model, with email updates.

Question 7: Would the explicit reporting of variability and uncertainty in outputs from collision risk models benefit the consenting process and discussions with regulators?

When asked more specifically about including variability and uncertainty in CRMs, interviewees gave a wide range of responses, and these were not consistent within different stakeholder groups. Of the 20 people interviewed, 13 agreed that including variability and uncertainty in outputs from collision risk models would benefit the consenting process and discussions with regulators; 7 disagreed. All 7 said that they disagreed because of the consenting and assessment process: in principle it would be better to include variability and uncertainty, but they thought the system did not allow for it. A recurrent comment was that interviewees were unsure how variability and uncertainty could be included in outputs and still fit with the Habitats Regulations.

Some comments and themes that were raised in the interviews are listed below:

  • Scientifically there is a benefit to making clear what the uncertainties are.
  • Accounting for uncertainty in data collection methods and survey data would be useful.
  • I am uncomfortable with presenting a value that is apparently so precise.
  • There is an absolute fixation on single numbers which is dangerous.
  • We need greater acceptance that we live and work in an uncertain world and things are grey, not black and white.
  • We need a way of showing that some scenarios are more likely than others.
  • Decision makers have to be confident that they are making the right decisions, so they need an understanding of the uncertainty around the single numbers.
  • We need to weigh up risk (or use a risk assessment process) and we can't do that currently with CRM, though it happens more regularly with PVA.
  • The current approach is too precautionary and always uses the most precautionary values.
  • If the system were to change, including variability and uncertainty is a more useful approach.
  • Any outputs need to be suitable to be taken forward through the assessment process.
  • The risk is that it complicates the process even further: the more explicit the risks are, the more difficult they are to explain to the planning inspectorate.
  • There is probably too much uncertainty in the system to make it useful to include it.

There was a wide range of views on some topics, for example opinions on using probability distributions:

  • Presenting probability distributions would help a lot because regulators often have a background of understanding risk probabilities.
  • Using probability distributions might help with presentation but it might not help with interpretation of outputs, especially if people don't understand how to interpret probability distributions.
  • Distributions are probably more helpful but people need to understand them.
  • Scientists are used to dealing with probabilities but legislation is binary.

This probably stems from uncertainty and inconsistency in how decisions are made (and in how that process is understood), and from the lack of a strategic decision on a standard method for presenting data that is most informative for decision makers.
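To make the contrast between a single deterministic number and a probability distribution concrete, the sketch below propagates assumed uncertainty in a few inputs through a toy collision calculation by Monte Carlo sampling. The formula and the parameter distributions are hypothetical placeholders for illustration only (this is not the Band model, and the report's accompanying R code is the authoritative implementation); the point is simply that sampling uncertain inputs yields a distribution of estimates, and hence percentiles, rather than one apparently precise value.

```python
import random
import statistics

random.seed(1)  # reproducible illustration

def toy_collision_estimate(density, prop_at_risk_height, avoidance_rate):
    """Hypothetical, simplified collision estimate: notional transits
    through the rotor-swept area that are not avoided. NOT the Band model."""
    transits = density * prop_at_risk_height * 1000  # notional transits/year
    return transits * (1 - avoidance_rate)

# Sample each uncertain input from an assumed distribution rather than
# fixing it at a single "most precautionary" value.
estimates = []
for _ in range(10_000):
    density = random.lognormvariate(0, 0.5)              # birds/km^2 (assumed)
    height = min(max(random.gauss(0.15, 0.05), 0), 1)    # proportion at rotor height
    avoidance = min(max(random.gauss(0.98, 0.005), 0), 1)
    estimates.append(toy_collision_estimate(density, height, avoidance))

# Summarise the resulting distribution instead of reporting one number.
qs = statistics.quantiles(estimates, n=20)  # 19 cut points: qs[0]=5th, qs[18]=95th
print(f"median collisions/year: {statistics.median(estimates):.1f}")
print(f"90% interval: {qs[0]:.1f} to {qs[18]:.1f}")
```

A summary like this would let a decision maker weigh, for example, the 95th percentile against the median, showing that some scenarios are more likely than others rather than presenting a single black-and-white figure.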
