
"Review of major types of uncertainty in fisheries modeling and how to deal with them"

"Review of major types of uncertainty in fisheries modeling and how to deal with them". Randall M. Peterman School of Resource and Environmental Management (REM) Simon Fraser University, Burnaby, British Columbia, Canada. National Ecosystem Modeling Workshop II,


Presentation Transcript


  1. "Review of major types of uncertainty in fisheries modeling and how to deal with them" Randall M. Peterman School of Resource and Environmental Management (REM) Simon Fraser University, Burnaby, British Columbia, Canada National Ecosystem Modeling Workshop II, Annapolis, Maryland, 25-27 August 2009

  2. Outline • Five sources of uncertainty - Problems they create - What scientists have done • Adapting those approaches for ecosystem modelling • Recommendations

  3. Single-species stock assessments

  4. Single-species stock assessments Uncertainties considered General risk assessment methods

  5. My background Single-species stock assessments Uncertainties considered General risk assessment methods

  6. Single-species stock assessments Risk management Uncertainties considered Scientific advice: including risk communication General risk assessment methods Decision makers, stakeholders

  7. Single-species stock assessments Risk management Uncertainties considered Scientific advice: including risk communication General risk assessment methods Multi-species ecosystem models Decision makers, stakeholders Impressive!!

  8. Single-species stock assessments Risk management Uncertainties considered Scientific advice: including risk communication General risk assessment methods Uncertainties considered Multi-species ecosystem models Decision makers, stakeholders

  9. Single-species stock assessments Risk management Uncertainties considered Scientific advice: including risk communication General risk assessment methods Uncertainties considered Multi-species ecosystem models Decision makers, stakeholders

  10. Purposes of ecosystem models from NEMoW 1 1. Improve conceptual understanding 2. Provide broad strategic advice 3. Provide specific tactical advice Uncertainties are pervasive ...

  11. Sources of uncertainty 1. Natural variability

  12. Sources of uncertainty 1. Natural variability 2. Observation error (bias and imprecision)

  13. Sources of uncertainty 1. Natural variability 2. Observation error (bias and imprecision) 3. Structural complexity

  14. Sources of uncertainty 1. Natural variability 2. Observation error (bias and imprecision) 3. Structural complexity Result: Parameter uncertainty

  15. Sources of uncertainty 1. Natural variability 2. Observation error (bias and imprecision) 3. Structural complexity Result: Parameter uncertainty 4. Outcome uncertainty (deviation from target)

  16. Sources of uncertainty 1. Natural variability 2. Observation error (bias and imprecision) 3. Structural complexity Result: Parameter uncertainty 4. Outcome uncertainty (deviation from target) Result: Imperfect forecasts of system's dynamics

  17. Sources of uncertainty 1. Natural variability 2. Observation error (bias and imprecision) 3. Structural complexity Result: Parameter uncertainty 4. Outcome uncertainty (deviation from target) Result: Imperfect forecasts of system's dynamics 5. Inadequate communication among scientists, decision makers, and stakeholders

  18. Sources of uncertainty 1. Natural variability 2. Observation error (bias and imprecision) 3. Structural complexity Result: Parameter uncertainty 4. Outcome uncertainty (deviation from target) Result: Imperfect forecasts of system's dynamics 5. Inadequate communication among scientists, decision makers, and stakeholders Result: Poorly informed decisions

  19. Uncertainties lead to economic risks (industry), social risks (coastal communities), and biological risks (ecosystems). Risk: magnitude of a variable/event and the probability of that magnitude occurring

  20. Sensitivity analyses across: 1. Which components to include 2. Structural forms of relationships 3. Parameter values 4. Management objectives 5. Environmental conditions 6. Management options • Focus: - Which parts most affect management decisions? - Which parts are highest priority for more data?
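A minimal sketch of one axis of such a sensitivity analysis, parameter values, using a hypothetical Ricker stock and an equilibrium-yield indicator. The model, parameter names, and ranges are illustrative assumptions, not material from the presentation:

```python
# One-at-a-time sensitivity analysis: vary each parameter across an assumed
# range and record the effect on a management quantity of interest.
import numpy as np

def equilibrium_yield(a=1.5, b=0.001, u=0.3):
    """Equilibrium catch for a Ricker stock R = S*exp(a - b*S) harvested at rate u."""
    s_eq = (a + np.log(1.0 - u)) / b          # spawners where replacement balances harvest
    return max(s_eq, 0.0) * u / (1.0 - u)     # equilibrium catch

base = dict(a=1.5, b=0.001, u=0.3)            # hypothetical baseline values
ranges = dict(a=(1.2, 1.8), b=(0.0008, 0.0012), u=(0.2, 0.4))

# Which parameter most changes the indicator (and so deserves more data)?
for name, (lo, hi) in ranges.items():
    y_lo = equilibrium_yield(**{**base, name: lo})
    y_hi = equilibrium_yield(**{**base, name: hi})
    print(f"{name}: equilibrium yield ranges from {y_lo:.0f} to {y_hi:.0f}")
```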

  21. [Figure: 2008 stock-status plot for mutton snapper, U.S. South Atlantic & Gulf of Mexico: overfishing axis F / F30% versus overfished axis SSB / SSBF30%]

  22. Sources of uncertainty (problems and resolutions): 1. Natural variability 2. Observation error 3. Unclear structure of fishery system 4. Outcome uncertainty 5. Inadequate communication

  23. What scientists have done to deal with ... 1. Natural variability 1. Simulate stochastically 2. Make parameters a function of age, size, density, ... 3. Include other components (static or dynamic) - Predators, prey, competitors - Bycatch/discards - Environmental variables ...
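A short sketch of point 1 above, "simulate stochastically": a Ricker spawner-recruit model with lognormal process error standing in for natural variability. All parameter values here are illustrative assumptions:

```python
# Stochastic simulation of natural variability in recruitment.
import numpy as np

rng = np.random.default_rng(1)
a, b, sigma = 1.5, 0.001, 0.6      # Ricker productivity, density dependence, process SD
u = 0.3                            # harvest rate
S = 500.0                          # initial spawners

spawners = []
for _ in range(50):
    R = S * np.exp(a - b * S + rng.normal(0.0, sigma))   # stochastic recruitment
    S = (1.0 - u) * R                                    # escapement after harvest
    spawners.append(S)

print(f"mean spawners: {np.mean(spawners):.0f}, CV: {np.std(spawners)/np.mean(spawners):.2f}")
```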

  24. Sources of uncertainty 1. Natural variability 2. Observation error 3. Unclear structure of fishery system 4. Outcome uncertainty 5. Inadequate communication

  25. What scientists have done to deal with ... 2. Observation error 1. Assume % of total variance due to observation error 2. Conduct sensitivity analyses 3. Use hierarchical models that "pool" information to help "average out" annual observation error - Jerome Fiechter et al. using hierarchical Bayesian models on NEMURO (NPZD-based)
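The "pooling" idea in point 3 can be illustrated with a simple normal-normal shrinkage: stock-specific estimates are pulled toward the across-stock mean in proportion to their observation-error variance. This is only a crude empirical-Bayes stand-in for a full hierarchical Bayesian model, and all numbers are invented:

```python
# Empirical-Bayes shrinkage: noisier stock estimates are shrunk harder
# toward the pooled mean, "averaging out" annual observation error.
import numpy as np

est = np.array([2.4, 0.8, 1.9, 0.4, 1.5])        # noisy per-stock productivity estimates
obs_var = np.array([0.25, 0.1, 0.4, 0.15, 0.2])  # observation-error variance of each

mu = np.average(est, weights=1.0 / obs_var)      # precision-weighted across-stock mean
tau2 = max(np.var(est) - obs_var.mean(), 1e-6)   # crude between-stock variance estimate

shrunk = (tau2 * est + obs_var * mu) / (tau2 + obs_var)  # posterior means per stock
print("pooled mean:", round(mu, 2))
print("shrunken estimates:", np.round(shrunk, 2))
```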

  26. [Figure: change in salmon productivity, loge(R/S), per °C increase in summer sea-surface temperature for pink salmon stocks ordered from Alaska (north) to B.C. and Washington (south), comparing separate single-stock analyses with a multi-stock, mixed-effects model. Mueter et al. (2002a)]

  27. 2. Observation error ... (continued) 4. Separately estimate natural variation and observation error -- Errors-in-variables models -- State-space models -- Kalman filter Example 1: tracking nonstationary productivity parameter (Ricker a value)

  28. [Figure: simulated trajectories of the productivity parameter over 100 years under low, high, and decreasing scenarios]

  29. Simulation test [Figure: "true" productivity (Ricker a parameter) over 100 years compared with estimates from the standard method and the Kalman filter] • Kalman filter with random-walk system equation was best across all types of nonstationarity Peterman et al. (2000)
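A bare-bones version of that winning approach: a scalar Kalman filter whose system equation is a random walk in the productivity parameter. Here the observation is treated directly as a noisy index of Ricker a; the variances and data are illustrative assumptions, not values from Peterman et al. (2000):

```python
# Scalar Kalman filter with a random-walk system equation, tracking a
# time-varying productivity parameter through observation noise.
import numpy as np

def kalman_random_walk(y, sigma_w2=0.01, sigma_v2=0.25, a0=1.5, p0=1.0):
    """Return filtered estimates of a time-varying productivity parameter."""
    a_hat, p = a0, p0
    out = []
    for obs in y:
        p += sigma_w2                  # predict: a[t] = a[t-1] + w[t]
        k = p / (p + sigma_v2)         # Kalman gain for y[t] = a[t] + v[t]
        a_hat += k * (obs - a_hat)     # update state estimate
        p *= (1.0 - k)                 # update state variance
        out.append(a_hat)
    return np.array(out)

rng = np.random.default_rng(2)
true_a = np.linspace(2.0, 1.0, 60)               # slowly declining productivity
y = true_a + rng.normal(0.0, 0.5, size=60)       # noisy observations
print(np.round(kalman_random_walk(y)[-5:], 2))   # estimates track the decline
```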

  30. 2. Observation error ... (continued) Example 2 of observation error and natural variation Simplest possible model: spawner-recruit relationship Su and Peterman (2009, in prep.) - Used operating model to determine statistical properties of various parameter-estimation schemes: -- Bias -- Precision -- Coverage probabilities (accuracy of estimated width of probability interval for a parameter)

  31. Test performance of an estimator: user-specified "true" underlying parameter values ("What if ...?") → operating model (simulator to test methods)

  32. Test performance of an estimator: user-specified "true" underlying parameter values ("What if ...?") → operating model (simulator to test methods) → generate "observed data" from natural variation and observation error → parameters estimated

  33. Test performance of an estimator: user-specified "true" underlying parameter values ("What if ...?") → operating model (simulator to test methods) → generate "observed data" from natural variation and observation error → parameters estimated → compare "true" and estimated values

  34. Test performance of an estimator: user-specified "true" underlying parameter values ("What if ...?") → operating model (simulator to test methods) → generate "observed data" from natural variation and observation error → parameters estimated → compare "true" and estimated values, repeated over 200 trials
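A skeletal version of this operating-model loop, deliberately simplified: generate spawner-recruit data with known "true" Ricker parameters plus natural variation and observation error, fit naively by linear regression of log(R/S) on observed spawners, and summarize relative bias over 200 trials. The real tests (e.g., Su and Peterman) are far more elaborate; everything here is assumed for illustration:

```python
# Operating-model test of a simple estimator: known truth in, bias out.
import numpy as np

rng = np.random.default_rng(3)
true_a, true_b, sigma = 2.0, 0.001, 0.5
n_years, n_trials = 30, 200

a_hats = []
for _ in range(n_trials):
    S_true = rng.uniform(100, 1500, size=n_years)                      # true spawners
    logRS = true_a - true_b * S_true + rng.normal(0, sigma, n_years)   # natural variation
    S_obs = S_true * rng.lognormal(0.0, 0.3, n_years)                  # observation error
    slope, a_hat = np.polyfit(S_obs, logRS, 1)    # naive fit ignores observation error
    a_hats.append(a_hat)

rel_bias = 100 * (np.mean(a_hats) - true_a) / true_a
print(f"% relative bias in a: {rel_bias:+.1f}, SD of estimates: {np.std(a_hats):.2f}")
```

Because the naive regression ignores observation error in spawners, the estimate of a comes out noticeably biased, which is exactly why estimators should be tested this way before use.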

  35. [Figure: % relative bias in estimated a (true a = 2) for four estimation methods (extended Kalman filter, errors-in-variables, Bayesian state-space, standard Ricker) across harvest-rate histories (low, variable, high) and across the proportion of total variance due to measurement error (0.25 to 0.75)] • Results also change with the true parameter values

  36. Results for 95% coverage probabilities - Uncertainty in estimated a is too narrow (overconfident) for all 4 estimation methods - Trade-off between bias and variance (Adkison 2009, Ecol. Applic. 19:198)

  37. Recommendation • Test parameter estimation methods before applying them (Hilborn and Walters 1992) • Use results with humility, caution - Parameter estimates for ecosystem models may inadvertently be quite biased!

  38. Sources of uncertainty 1. Natural variability 2. Observation error 3. Unclear structure of fishery system 4. Outcome uncertainty 5. Inadequate communication

  39. What scientists have done to deal with ... 3. Unclear structure of fishery system 1. Choose single "best" model among alternatives 1a. Informally 1b. Formally using a model selection criterion (AICc) Caution!! - Not appropriate for giving management advice - Asymmetric loss functions (Walters and Martell 2004, p. 101)
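For concreteness, a minimal sketch of formal selection with AICc, where AICc = -2 log L + 2k + 2k(k+1)/(n - k - 1) and the lowest score wins. The candidate models and log-likelihoods are hypothetical, and, as the caution above notes, picking a single winner this way is a poor basis for management advice:

```python
# Small-sample AIC (AICc) model selection among candidate models.
def aicc(log_lik, k, n):
    """AICc for a model with log-likelihood log_lik, k parameters, n data points."""
    return -2.0 * log_lik + 2 * k + 2.0 * k * (k + 1) / (n - k - 1)

n = 30  # years of data
candidates = {                      # model: (log-likelihood, number of parameters)
    "Ricker": (-42.1, 3),
    "Beverton-Holt": (-41.5, 3),
    "Ricker + covariate": (-39.8, 4),
}
scores = {m: aicc(ll, k, n) for m, (ll, k) in candidates.items()}
print("selected:", min(scores, key=scores.get))
print({m: round(s, 1) for m, s in scores.items()})
```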

  40. Asymmetric loss: Which case is preferred? [Figure: SSB / SSBmsy for species A and B under Case 1 and Case 2]

  41. Fraser River Early Stuart sockeye salmon: best "management-adjustment" model (H, T, Q, T+Q) [Figure: preference ratio from spawning favoured (4:1) through symmetric (1:1) to harvest favoured (1:4), comparing asymmetric loss with spawning objective favoured, symmetric loss, and asymmetric loss with harvest objective favoured] Recommendation • To develop appropriate indicators, ecosystem scientists should understand asymmetry in managers' objectives, especially given many species. Cummings (2009)
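A toy asymmetric loss function in the spirit of Walters and Martell (2004): shortfalls below the spawning target are penalized more heavily than equal-sized overshoots, with a preference ratio setting how much more. All numbers are invented:

```python
# Asymmetric loss on SSB relative to target.
def loss(ssb_ratio, pref=2.0):
    """pref > 1 penalizes shortfalls below the target (ssb_ratio < 1) more heavily."""
    if ssb_ratio < 1.0:
        return pref * (1.0 - ssb_ratio)   # conservation shortfall below target
    return ssb_ratio - 1.0                # foregone harvest above target

for pref in (1.0, 2.0):                   # symmetric vs. spawning favoured
    print(f"pref {pref}: undershoot {loss(0.8, pref):.2f}, overshoot {loss(1.2, pref):.2f}")
```

With symmetric loss (pref = 1) a 20% undershoot and a 20% overshoot are equally bad; with spawning favoured (pref = 2) the undershoot is twice as costly, so the preferred management action changes.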

  42. What scientists have done to deal with ... 3. Unclear structure of fishery system 1. Choose single "best" model among alternatives ... ... 1c. Adaptive management experiment - Sainsbury et al. in Australia More commonly, we have to consider a range of alternative models ...

  43. 3. Unclear structure of fishery system ... (cont'd.) 2. Retain multiple models; conduct sensitivity analyses 2a. Analyze separately [Figure: Eastern Scotian Shelf cod (fishery closed in mid-1990s); SSB (thousands of tonnes) estimated separately by VPA, stock-synthesis, and delay-difference models under different M values (R. Mohn 2009)]

  44. 3. Unclear structure of fishery system ... (cont'd.) 2. Retain multiple models; conduct sensitivity analyses 2a. Analyze separately 2b. Combine predictions from alternative models - Unweighted model averaging - Weighted with AIC weights or posterior probabilities, then calculate expected values of indicators • But weighting assumes managers use expected-value objectives - Many use mini-max objectives (i.e., choose action with lowest chance of worst-case outcome)
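A short sketch of option 2b with AIC weights, where each model gets weight proportional to exp(-Δi/2) for its AIC difference Δi. The AIC values and per-model SSB forecasts are hypothetical:

```python
# Model averaging: AIC weights applied to per-model indicator forecasts.
import numpy as np

aic = np.array([89.2, 89.9, 91.1])            # one AIC per retained model
ssb_forecast = np.array([120.0, 95.0, 60.0])  # each model's SSB forecast (kt)

delta = aic - aic.min()        # AIC differences from the best model
w = np.exp(-0.5 * delta)
w /= w.sum()                   # Akaike weights sum to 1

print("weights:", np.round(w, 2))
print("model-averaged SSB:", np.round(w @ ssb_forecast, 1))
```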

  45. [Figure: probability distribution of SSB/SSBtarget under management action A, showing the limit reference point, the expected SSB (weighted average), and the worst-case outcome (unlikely, but choose the action with the lowest probability of it)]
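A toy contrast between the two decision rules just described: pick the action with the highest expected SSB/SSBtarget versus the action with the lowest probability of the worst-case outcome (falling below the limit reference point). The probability tables are invented:

```python
# Expected-value vs. mini-max choice over a discrete outcome distribution.
import numpy as np

outcomes = np.array([0.25, 0.75, 1.25])        # SSB / SSBtarget bins
prob = {                                       # P(outcome) under each action
    "A": np.array([0.05, 0.55, 0.40]),
    "B": np.array([0.00, 0.80, 0.20]),
}
limit = 0.5                                    # limit reference point

best_expected = max(prob, key=lambda a: prob[a] @ outcomes)
best_minimax = min(prob, key=lambda a: prob[a][outcomes < limit].sum())
print("expected-value objective picks:", best_expected)   # A: higher mean SSB
print("mini-max objective picks:", best_minimax)          # B: no chance of worst case
```

Here the two objectives pick different actions, which is why weighting models into a single expected value can mislead managers who actually hold mini-max objectives.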

  46. Recommendation • Ecosystem scientists should work iteratively with managers to find the most useful indicators to reflect management objectives.

  47. 3. Unclear structure of fishery system ... (cont'd.) 2. Retain multiple models; conduct sensitivity analyses ... ... 2c. Evaluate alternative ecosystem assessment models by using an operating model to determine their statistical properties (e.g., Fulton et al. 2005 re: community indicators)
