Incorporating Uncertainty into Fishery Models
Published by the American Fisheries Society (ISBN 9781888569315)

<em>Abstract.—</em>Spring and summer chinook salmon <em>Oncorhynchus tshawytscha</em> populations of the Snake River basin provide the setting for an application of Bayesian analysis to derive risks of survival and recovery for these endangered populations. The Bayesian approach is appealing because it provides a theoretical framework within which uncertainty about population dynamics is translated directly into the probability of achieving various abundance targets, given certain types of future actions. Uncertainty about the parameters governing the population dynamics is based on applying Bayes' theorem to the likelihood of past recruitment observations, viewed in the context of a generalized Ricker spawner-recruit model. Uncertainty about future dynamics is based on simulations of population abundance over the next 100 years that incorporate both model parameter uncertainty and annual stochastic effects on survival. Results show that substantial reductions in mortality rate (on the order of 0.5–0.7 per year relative to rates in recent years) are required for the populations to meet the recovery and survival standards set for the next 48–100 years. The level of mortality reduction needed to achieve these standards can help guide potential hydropower system management options.
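A minimal sketch of the kind of calculation this abstract describes: posterior draws of generalized Ricker parameters are propagated through 100-year forward simulations to give the probability of reaching an abundance target under an assumed annual mortality rate. All parameter values, the target, and the mortality level below are illustrative placeholders, not the chapter's estimates.

```python
import numpy as np

# Hypothetical posterior draws of generalized Ricker parameters; in the actual
# analysis these would come from applying Bayes' theorem to the historical
# spawner-recruit observations.
rng = np.random.default_rng(1)
n_draws = 2000
a = rng.normal(1.0, 0.3, n_draws)        # productivity (illustrative)
b = rng.lognormal(-7.0, 0.5, n_draws)    # density dependence (illustrative)
sigma = rng.uniform(0.3, 0.8, n_draws)   # annual recruitment variability

def project(spawners0, years, mortality, a, b, sigma, rng):
    """Project spawner abundance forward under a Ricker model with lognormal
    process error and an additional annual mortality rate."""
    s = spawners0
    for _ in range(years):
        recruits = s * np.exp(a - b * s + rng.normal(0.0, sigma))
        s = recruits * np.exp(-mortality)   # survival to spawning
    return s

# Probability of exceeding a recovery target after 100 years, integrating over
# parameter uncertainty (one stochastic trajectory per posterior draw).
target = 2500.0
final = np.array([project(1000.0, 100, 0.5, a[i], b[i], sigma[i], rng)
                  for i in range(n_draws)])
print("Pr(spawners >= target after 100 years):", np.mean(final >= target))
```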


<em>Abstract.—</em>Stock assessments of Atlantic menhaden are conducted annually for the Atlantic States Marine Fisheries Commission, as required by the recently updated Fishery Management Plan adopted in 1992. Uncertainties in the stock assessments have been explored over the years from many perspectives. Two general areas of analysis are considered here. The first area is largely deterministic and concerns the virtual population analysis (VPA), including development and coherence of the catch-at-age matrix; historical retrospective problems; implications of assuming constant <em>M</em> at all ages analyzed; and reliability of recruitment estimates relative to fishery-independent juvenile abundance indices when used for calibrating the VPA. The second area comprises stochastic analyses, including stochastic projections based on biological benchmarks determined from yield-per-recruit and spawning-stock-biomass-per-recruit models; bootstrapped application of a surplus-production model; and projections from that production model. Nonetheless, the largest uncertainty in assessment of the stock stems not from modeling considerations but from a biological question: Can the high stock levels observed in the 1950s be regained by reducing fishing mortality? Projections based on production modeling assume that they can, but if exogenous forces (for example, habitat loss or pollution) have affected the stock, it may be that they cannot. If the recent pattern of lower fishing mortality rates in response to social and economic factors continues, the fishery will in essence conduct an experiment that may answer the question.
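One of the stochastic analyses mentioned above, the bootstrapped surplus-production model, can be sketched roughly as follows: a Schaefer model is fit to a catch series and a relative abundance index, and the log-scale residuals are resampled to produce a bootstrap distribution of MSY. The data series, starting values, and error structure here are placeholders chosen only to make the example run; they are not the menhaden assessment's inputs.

```python
import numpy as np
from scipy.optimize import minimize

# Placeholder catch and abundance-index series (the real assessment uses the
# menhaden data); parameters are r (intrinsic rate), K (carrying capacity),
# and q (index catchability).
catch = np.array([300., 320., 280., 350., 400., 380., 300., 250., 270., 260.])
index = np.array([1.90, 1.80, 1.70, 1.55, 1.30, 1.10, 1.00, 1.05, 1.15, 1.20])

def predicted_index(logpars, catches):
    """Schaefer dynamics B[t+1] = B[t] + r*B[t]*(1 - B[t]/K) - C[t], observed as q*B[t]."""
    r, K, q = np.exp(logpars)            # log-parameterized to keep values positive
    B = K
    pred = np.empty(len(catches))
    for t, c in enumerate(catches):
        pred[t] = q * B
        B = max(B + r * B * (1.0 - B / K) - c, 1e-3)
    return pred

def objective(logpars, obs):
    resid = np.log(obs) - np.log(predicted_index(logpars, catch))
    return np.sum(resid ** 2)            # lognormal observation error

start = np.log([0.5, 4000.0, 5e-4])
fit = minimize(objective, start, args=(index,), method="Nelder-Mead")

# Bootstrap: resample log-scale residuals, rebuild a pseudo index, refit, and
# collect the implied MSY = r*K/4 from each refit.
rng = np.random.default_rng(0)
base = predicted_index(fit.x, catch)
resid = np.log(index) - np.log(base)
msy = []
for _ in range(500):
    pseudo = np.exp(np.log(base) + rng.choice(resid, size=len(resid), replace=True))
    boot = minimize(objective, fit.x, args=(pseudo,), method="Nelder-Mead")
    r, K, _ = np.exp(boot.x)
    msy.append(r * K / 4.0)

print("bootstrap 80% interval for MSY:", np.percentile(msy, [10, 90]))
```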


<em>Abstract.—</em>The stock assessment analyses of the king and Spanish mackerel fisheries of the southeastern United States have a long history of incorporating uncertainty. This philosophy developed from a number of unique circumstances, both biological and historical, that encouraged the incorporation of stochastic approaches and risk evaluation into the assessment and management process. The progression from simple discrete decision-tree analysis to delta methods to Monte Carlo/bootstrap methods was due not only to advances in assessment technology but also to changing requirements for management. The current method for mackerel stock assessment is a tuned virtual population analysis with uncertainty incorporated via a mixed Monte Carlo/bootstrap algorithm. Through this procedure, uncertainty in the tuning indices, catch at age, and natural mortality rate is directly incorporated into the advice provided to management. The management advice is given in terms of probability statements, rather than point estimates, to reflect this uncertainty in the stock assessments. This approach is a result of the evolution of the assessment and management process and provides a pragmatic alternative in the “frequentist versus Bayesian” debate.
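A rough illustration of how a mixed Monte Carlo/bootstrap algorithm turns input uncertainty into a probability statement: natural mortality is drawn from a prior (the Monte Carlo component), the tuning index is resampled with observation noise (a bootstrap-style component), and each paired draw is pushed through a simplified catch-equation step. The catch, index, catchability, and reference point below are hypothetical, and the real assessment works through a full tuned VPA rather than this one-step shortcut.

```python
import numpy as np

rng = np.random.default_rng(42)
n_runs = 1000

catch = 12_000.0                      # current-year landings (illustrative)
index_mean, index_cv = 0.8, 0.25      # relative abundance index and its CV (illustrative)
q = 2.0e-5                            # assumed index catchability

f_current = np.empty(n_runs)
for i in range(n_runs):
    M = rng.uniform(0.2, 0.4)                              # Monte Carlo draw of natural mortality
    idx = index_mean * np.exp(rng.normal(0.0, index_cv))   # resampled index observation
    biomass = idx / q
    # Solve the Baranov catch equation C = F/(F+M) * (1 - exp(-(F+M))) * B for F by grid search.
    F = np.linspace(0.01, 3.0, 3000)
    pred = F / (F + M) * (1.0 - np.exp(-(F + M))) * biomass
    f_current[i] = F[np.argmin(np.abs(pred - catch))]

f_target = 0.30                       # hypothetical management reference point
print("Pr(F_current > F_target) =", np.mean(f_current > f_target))
```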


<em>Abstract.—</em>Stock assessment methodology has increasingly employed statistical procedures as a means of incorporating uncertainty into assessment advice. Deterministic values of fishing mortality rates (<em>F<sub>t</sub></em>) estimated from assessment models have been replaced by empirical distributions that can be compared with an appropriate biological reference point (<em>F</em><sub>BRP</sub>) to generate statements of probability (e.g., Pr[<em>F<sub>t</sub></em> ≥ <em>F</em><sub>BRP</sub>]) regarding the status of the resource. It must be recognized, however, that terminal-year fishing mortality rates and the biological reference points to which they are compared are both estimated with error, which will affect the quality of decisions regarding the status of the stock. We propose a two-tier stochastic decision-based framework for a recently conducted stock assessment of the Delaware Bay blue crab stock that specifies not only the probability for the condition Pr(<em>F<sub>t</sub></em> ≥ <em>F</em><sub>BRP</sub>), but also the statistical level of confidence (e.g., 90%) in that decision. The approach uses a mixed Monte Carlo-bootstrap procedure to estimate probability distributions for both the terminal-year fishing mortality rate (<em>F<sub>t</sub></em>) and the replacement fishing mortality rate, approximated by <em>F</em><sub>MED</sub> as an overfishing definition. Probability density functions (PDFs) for <em>F<sub>t</sub></em> and <em>F</em><sub>MED</sub>, generated using the mixed Monte Carlo-bootstrap procedure, show that recent fishing mortality rates (80% CI from 0.6 to 1.2) are generally below the <em>F</em><sub>MED</sub> overfishing definition (80% CI from 0.9 to 1.6), with significant overlap in the PDFs. Using the PDFs, the stochastic decision-based approach then generates a probability profile by integrating the area under the <em>F<sub>t</sub></em> PDF for different decision confidence levels (e.g., 90%, 80%, 70%), which can be thought of as the one-tailed <em>α</em>-probability from standard statistical hypothesis testing. For example, at the 80% decision confidence level (the value of <em>F</em> corresponding to the upper 20% of the <em>F</em><sub>MED</sub> PDF), Pr(<em>F<sub>t</sub></em> > <em>F</em><sub>MED</sub>) is about 0.03. Thus, with high confidence (80%), we can state that the blue crab stock is not currently being overfished. This approach can be extended to decisions regarding control laws that specify both maximum fishing rate and minimum biomass thresholds.
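The two-tier decision rule described above can be written compactly: at a given decision confidence level, take the F value at the corresponding upper tail of the F_MED distribution, then integrate the F_t PDF above that value. The lognormal draws below merely stand in for the assessment's Monte Carlo-bootstrap output; their means and spreads are placeholders, so the printed probabilities will not match the abstract's.

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative draws standing in for the PDFs of the terminal-year fishing
# mortality F_t and the F_MED overfishing threshold.
f_t   = rng.lognormal(mean=np.log(0.9), sigma=0.25, size=5000)
f_med = rng.lognormal(mean=np.log(1.2), sigma=0.20, size=5000)

def overfishing_probability(f_t, f_med, decision_confidence):
    """Two-tier check: take the F value at the upper (1 - confidence) tail of the
    F_MED distribution, then integrate the F_t PDF above that threshold."""
    threshold = np.quantile(f_med, decision_confidence)   # e.g., upper 20% at 80% confidence
    return np.mean(f_t > threshold)

for conf in (0.9, 0.8, 0.7):
    p = overfishing_probability(f_t, f_med, conf)
    print(f"decision confidence {conf:.0%}: Pr(F_t > threshold) = {p:.3f}")
```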


<em>Abstract.—</em>Management schemes for marine mammals developed by the United States and the International Whaling Commission (IWC) have sought to achieve their management objectives by developing control laws designed to calculate acceptable levels of human-caused mortality, while explicitly incorporating some types of uncertainty and remaining robust to other types of uncertainty. The United States developed the “potential biological removal” control law for managing commercial fisheries to reduce incidental catches of marine mammals. The IWC developed, but has not yet used, the “catch limit algorithm” control law for managing commercial harvests of baleen whales. In both cases, to develop and test the control law, quantitative management objectives were specified and only reliably and easily collected data were required. Then, given these specifications and requirements, simulations were used to define the control law and test its robustness to uncertainties in assumptions and data. Finally, to identify unforeseen uncertainties, the management schemes include rules and guidelines on reviews, monitoring programs, and data collection and analyses.
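For reference, the potential biological removal control law is commonly written as PBR = N_min × ½R_max × F_R, where N_min is a minimum abundance estimate, R_max the maximum net productivity rate, and F_R a recovery factor between 0.1 and 1.0; the sketch below simply states that formula. The example inputs are illustrative, not values from any particular stock assessment report.

```python
def potential_biological_removal(n_min, r_max, recovery_factor):
    """PBR control law: PBR = N_min * (1/2 * R_max) * F_R.
    n_min           -- minimum (lower-percentile) population abundance estimate
    r_max           -- maximum net productivity rate
    recovery_factor -- F_R, between 0.1 and 1.0 depending on stock status
    """
    return n_min * 0.5 * r_max * recovery_factor

# Illustrative inputs: N_min = 10,000 animals, R_max = 0.04, F_R = 0.5.
print(potential_biological_removal(10_000, 0.04, 0.5))   # -> 100.0 animals per year
```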


<em>Abstract.—</em>This paper focuses on greater amberjack management in the South Atlantic region, with an examination of deficiencies in data collection and the resulting impacts on the amberjack stock assessment and management decisions. This is a case study of uncertainty leading to more conservative, risk-averse management. Amberjack management in the South Atlantic illustrates the debate over whether to do what you can with what you have or to do nothing until you have good information. Ultimately, one has to ask what the consequences are for resource management. While National Standard #2 says you must base your decisions on the best scientific information, scientific information entails more than just the stock assessment results; it entails other biological information as well as social and economic information. The South Atlantic Fishery Management Council went as far as we could without opening ourselves up to a legal challenge. We attempted to balance concern for the resource, Florida’s desire to be more conservative, skepticism about the data, and the fishermen’s desire to fish.


<em>Abstract.—</em>The uncertainty associated with estimates of stock size is increasingly acknowledged in the provision of management advice. Estimated variances, however, are usually small compared with the variability of abundance estimates for any given year produced by successive assessments, especially when changes in the assessment methods are introduced. Of all the different kinds of uncertainty, uncertainty in the specification of the model structure is often the most significant source of assessment errors in some closely monitored fisheries. Recent changes in Pacific halibut assessments illustrate this problem. A separable catch-at-age model used since the mid-1980s performed very poorly in retrospective analyses, initially overestimating biomass and then underestimating it in the 1990s. The latter has been attributed to trends in catchability at age associated with a remarkable decrease in halibut growth rate over the last 15 years. A new model was developed that replaced the assumptions of constant catchability and selectivity made in the old model with a more flexible and realistic treatment of observation and process variability. The change in model structure resulted in estimates of present biomass more than double the previous estimates. While the retrospective performance of the new model is much improved, major uncertainties remain. In particular, the relative importance of size and age effects in determining catchability of the setline surveys is difficult to discern from the data. Two extreme models, one based on the assumption that survey selectivity is a function of size and the other based on the assumption that survey selectivity <em>at age</em> is constant, have been formulated to incorporate this uncertainty. In this paper, Bayesian methods are used to evaluate the uncertainty around abundance estimates and short-term forward projections, including the uncertainty due to alternative possible model structures. The posterior distributions of parameters of interest under each of the two models are approximated by Markov chain Monte Carlo methods, and the support given to the models by the data is evaluated by computing their integrated likelihoods. While far from a complete representation of all sources of model uncertainty, the analysis illustrates how uncertainty in model choice, in addition to standard parameter uncertainty, can be incorporated in risk computations.
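A compact sketch of the final step described above: integrated (marginal) likelihoods for the two selectivity models are converted into model weights, and posterior draws from each model's MCMC run are mixed in proportion to those weights to give a model-averaged projection interval. The draws and likelihood values below are invented placeholders; the halibut analysis supplies its own MCMC output and integrated likelihoods.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical MCMC output: posterior draws of projected biomass under the two
# competing survey-selectivity models (size-based vs constant-at-age).
biomass_size_based   = rng.lognormal(np.log(250.0), 0.20, size=4000)
biomass_constant_age = rng.lognormal(np.log(190.0), 0.25, size=4000)
integrated_like = np.array([2.1e-12, 0.9e-12])   # placeholder support for each model

# Convert integrated likelihoods to model weights (equal prior model probabilities).
weights = integrated_like / integrated_like.sum()

# Model-averaged posterior: resample draws from each model in proportion to its weight.
n = 4000
pick = rng.choice([0, 1], size=n, p=weights)
mixed = np.where(pick == 0,
                 rng.choice(biomass_size_based, n),
                 rng.choice(biomass_constant_age, n))

print("model weights:", np.round(weights, 2))
print("model-averaged 80% interval for projected biomass:",
      np.percentile(mixed, [10, 90]))
```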

