Keywords:

  • EU fisheries-management advice;
  • expert judgement;
  • management-strategy evaluations;
  • scientific credibility;
  • simulations;
  • uncertainty

Abstract

Scientists feel discomfort when they are asked to create certainty where none exists, for use as an alibi in policy-making. Recently, the scientific literature has drawn attention to some pitfalls of simulation-based fisheries management-strategy evaluation (MSE). For example, while estimates of the central tendencies of distributions of simulation outcomes are usually fairly robust because they are conditioned on ample data, estimates concerning the tails of those distributions (such as the probability of falling below a critical biomass) are conditioned on few data and thus often rely on assumptions without a strong knowledge base. The clients of scientific advice, such as the European Commission, are embracing the mechanization of the evaluation of proposed Harvest Control Rules against the precautionary principle and management objectives. Where fisheries management institutions expect simple answers from the scientists, giving a ‘green/red light’ to a proposed management strategy, the scientists are forced into a split position between satisfying the demands of their advisory role and living up to the standards of scientific rigour. We argue against the mechanization of scientific advice that aims to incorporate all relevant processes into one big model algorithm that, once constructed, can be run without circumspection. We instead encourage fisheries advice to be a dynamic process of expert judgement, drawing on separate, parallel lines of scientific evidence from quantitative and qualitative modelling exercises and from factual knowledge of the biology and the fishery dynamics. This process can be formalized to a certain degree and can easily accommodate stakeholder viewpoints.
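The contrast drawn above between robust central tendencies and fragile tail estimates can be illustrated with a minimal numerical sketch. The following Python snippet is purely illustrative and not part of the article: it generates hypothetical biomass "observations" from an assumed lognormal process, fits both a normal and a lognormal distribution to the same sample, and compares the two fits. All numbers (the simulated distribution, the critical biomass `B_lim`) are invented for the example.

```python
import math
import random

random.seed(1)

# Hypothetical biomass "observations" (arbitrary units), lognormally distributed.
obs = [math.exp(random.gauss(mu=6.0, sigma=0.4)) for _ in range(200)]

# Fit a normal distribution by sample moments.
n = len(obs)
mean = sum(obs) / n
sd = math.sqrt(sum((x - mean) ** 2 for x in obs) / (n - 1))

# Fit a lognormal distribution by moments of log-biomass.
logs = [math.log(x) for x in obs]
lmu = sum(logs) / n
lsd = math.sqrt(sum((v - lmu) ** 2 for v in logs) / (n - 1))

def norm_cdf(x, m, s):
    """Cumulative distribution function of a normal(m, s) variable."""
    return 0.5 * (1.0 + math.erf((x - m) / (s * math.sqrt(2.0))))

B_lim = 150.0  # hypothetical critical biomass, deep in the lower tail

# Central tendency: both fits reproduce the sample mean closely.
mean_normal = mean
mean_lognormal = math.exp(lmu + 0.5 * lsd ** 2)

# Tail probability P(B < B_lim): strongly dependent on the assumed distribution.
p_normal = norm_cdf(B_lim, mean, sd)
p_lognormal = norm_cdf(math.log(B_lim), lmu, lsd)
```

Under these assumptions the two fitted means agree to within a few per cent, while the estimated probability of falling below `B_lim` differs by several-fold between the normal and lognormal fits, even though both were conditioned on exactly the same data: the tail estimate is driven by the distributional assumption, not by the observations.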