"Review of major types of uncertainty in fisheries modeling and how to deal with them" Randall M. Peterman School of Resource and Environmental Management (REM) Simon Fraser University, Burnaby, British Columbia, Canada National Ecosystem Modeling Workshop II, Annapolis, Maryland, 25-27 August 2009
Outline • Five sources of uncertainty - Problems they create - What scientists have done • Adapting those approaches for ecosystem modelling • Recommendations
My background (diagram): single-species stock assessments and multi-species ecosystem models, each with uncertainties considered; general risk assessment methods; scientific advice, including risk communication, flowing to decision makers and stakeholders for risk management.
Purposes of ecosystem models from NEMoW 1 1. Improve conceptual understanding 2. Provide broad strategic advice 3. Provide specific tactical advice Uncertainties are pervasive ...
Sources of uncertainty 1. Natural variability 2. Observation error (bias and imprecision) 3. Structural complexity → Result: parameter uncertainty 4. Outcome uncertainty (deviation from target) → Result: imperfect forecasts of the system's dynamics 5. Inadequate communication among scientists, decision makers, and stakeholders → Result: poorly informed decisions
Uncertainties lead to economic risks (industry), social risks (coastal communities), and biological risks (ecosystems). Risk: magnitude of a variable/event and the probability of that magnitude occurring.
Sensitivity analyses across: 1. Which components to include 2. Structural forms of relationships 3. Parameter values 4. Management objectives 5. Environmental conditions 6. Management options • Focus: - Which parts most affect management decisions? - Which parts are highest priority for more data?
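A minimal sketch of the kind of one-at-a-time sensitivity analysis listed above, using a hypothetical Schaefer surplus-production projection. The function names, parameter ranges, and the indicator (terminal biomass relative to carrying capacity) are illustrative assumptions, not from the talk.

```python
# One-at-a-time sensitivity of a management indicator (terminal relative
# biomass) to parameter values and a harvest option, using a hypothetical
# Schaefer surplus-production model with lognormal process error.
import numpy as np

def project_biomass(r, K, u, b0=0.5, years=30, sigma=0.2, seed=1):
    """Project relative biomass under harvest rate u; return B_final / K."""
    rng = np.random.default_rng(seed)
    b = b0 * K
    for _ in range(years):
        b = max(b + r * b * (1 - b / K) - u * b, 1e-6)
        b *= rng.lognormal(mean=-sigma**2 / 2, sigma=sigma)
    return b / K

base = dict(r=0.4, K=1.0, u=0.2)          # illustrative baseline values
indicator_base = project_biomass(**base)

# Vary each input over a plausible range; record the change in the indicator.
ranges = {"r": (0.2, 0.6), "K": (0.8, 1.2), "u": (0.1, 0.3)}
for name, (lo, hi) in ranges.items():
    vals = [project_biomass(**{**base, name: v}) for v in (lo, hi)]
    print(f"{name}: indicator spans {min(vals):.2f}-{max(vals):.2f} "
          f"(base {indicator_base:.2f})")
```

Ranking inputs by how much they move the indicator is one simple way to identify which parts most affect management decisions and which are highest priority for more data.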
[Figure: 2008 mutton snapper assessment, U.S. South Atlantic & Gulf of Mexico — overfishing status (F / F30%) and overfished status (SSB / SSBF30%)]
Sources of uncertainty (problems and resolutions) 1. Natural variability 2. Observation error 3. Unclear structure of fishery system 4. Outcome uncertainty 5. Inadequate communication
What scientists have done to deal with ... 1. Natural variability 1. Simulate stochastically 2. Make parameters a function of age, size, density, ... 3. Include other components (static or dynamic) - Predators, prey, competitors - Bycatch/discards - Environmental variables ...
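A minimal sketch of point 1 (simulate stochastically): Ricker spawner-recruit dynamics with lognormal process error, run as Monte Carlo replicates. All parameter values and the fraction of recruits that spawn are illustrative assumptions, not from the talk.

```python
# Stochastic simulation of natural variability: Ricker spawner-recruit
# dynamics with lognormal process error, replicated many times.
import numpy as np

def simulate_recruits(a=2.0, b=1.0, sigma=0.6, years=50, n_trials=500, seed=42):
    """Return an (n_trials x years) array of recruitment under process error."""
    rng = np.random.default_rng(seed)
    recruits = np.empty((n_trials, years))
    for t in range(n_trials):
        s = 0.5                                # initial spawners (relative units)
        for y in range(years):
            eps = rng.normal(0.0, sigma)       # interannual process error
            r = s * np.exp(a - b * s + eps)    # Ricker with lognormal error
            recruits[t, y] = r
            s = 0.5 * r                        # assume half of recruits spawn
    return recruits

r = simulate_recruits()
print("median recruitment:", np.median(r),
      "90% interval:", np.percentile(r, [5, 95]))
```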
Sources of uncertainty 1. Natural variability 2. Observation error 3. Unclear structure of fishery system 4. Outcome uncertainty 5. Inadequate communication
What scientists have done to deal with ... 2. Observation error 1. Assume % of total variance due to observation error 2. Conduct sensitivity analyses 3. Use hierarchical models that "pool" information to help "average out" annual observation error - Jerome Fiechter et al. using hierarchical Bayesian models on NEMURO (NPZD-based)
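A minimal sketch of point 1: assume a fixed fraction of total residual variance is observation error and back out the process- and observation-error standard deviations. The 40% share and the synthetic residuals are purely illustrative assumptions.

```python
# Partition total residual variance into process and observation components,
# given an assumed fraction attributable to observation error.
import numpy as np

def partition_variance(residuals, obs_fraction=0.4):
    """Return (process SD, observation SD) from log-scale residuals."""
    total_var = np.var(residuals, ddof=1)
    obs_var = obs_fraction * total_var
    proc_var = total_var - obs_var
    return np.sqrt(proc_var), np.sqrt(obs_var)

# Synthetic residuals standing in for those from a fitted spawner-recruit model.
rng = np.random.default_rng(0)
resid = rng.normal(0.0, 0.8, size=30)
sigma_proc, sigma_obs = partition_variance(resid, obs_fraction=0.4)
print(f"process SD = {sigma_proc:.2f}, observation SD = {sigma_obs:.2f}")
```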
[Figure: change in pink salmon productivity, loge(R/S), per °C increase in summer sea-surface temperature, by stock number from south (B.C., Wash.) to north (Alaska) — separate single-stock analyses vs. a multi-stock mixed-effects model. Mueter et al. (2002a)]
2. Observation error ... (continued) 4. Separately estimate natural variation and observation error -- Errors-in-variables models -- State-space models -- Kalman filter Example 1: tracking a nonstationary productivity parameter (the Ricker a value)
[Figure: simulated trajectories of the productivity parameter over 100 years — low, high, and decreasing scenarios]
[Figure: simulation test — "true" productivity (Ricker parameter) over 100 years vs. estimates from the standard method and the Kalman filter] • Kalman filter with a random-walk system equation was best across all types of nonstationarity. Peterman et al. (2000)
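A minimal sketch of the kind of filter described here: a scalar Kalman filter with a random-walk system equation tracking a time-varying Ricker productivity parameter a_t, with observation equation log(R_t/S_t) = a_t − b·S_t + v_t. The density-dependence parameter b and the two error variances are treated as known in the sketch; in practice they would typically be estimated (e.g., by maximum likelihood). All numerical values are assumptions.

```python
# Kalman filter with a random-walk system equation for Ricker productivity:
#   observation: y_t = log(R_t/S_t) = a_t - b*S_t + v_t,  v_t ~ N(0, sig_obs^2)
#   system:      a_t = a_{t-1} + w_t,                      w_t ~ N(0, sig_sys^2)
import numpy as np

def kalman_ricker(spawners, recruits, b, sig_sys, sig_obs, a0=1.0, p0=1.0):
    """Return filtered estimates of the time-varying Ricker a_t."""
    y = np.log(np.asarray(recruits) / np.asarray(spawners))
    a_est, p = a0, p0
    a_filtered = []
    for s_t, y_t in zip(spawners, y):
        a_pred, p_pred = a_est, p + sig_sys**2        # predict (random walk)
        innovation = y_t - (a_pred - b * s_t)         # compare with observation
        k_gain = p_pred / (p_pred + sig_obs**2)
        a_est = a_pred + k_gain * innovation          # update state estimate
        p = (1 - k_gain) * p_pred
        a_filtered.append(a_est)
    return np.array(a_filtered)

# Illustrative use with synthetic data (slowly drifting "true" productivity).
rng = np.random.default_rng(3)
years, b_true = 50, 1.0
a_true = 2.0 + np.cumsum(rng.normal(0, 0.05, years))
spawners = rng.uniform(0.2, 1.5, years)
recruits = spawners * np.exp(a_true - b_true * spawners + rng.normal(0, 0.3, years))
a_hat = kalman_ricker(spawners, recruits, b=1.0, sig_sys=0.05, sig_obs=0.3)
print("final estimated a:", round(a_hat[-1], 2), "true:", round(a_true[-1], 2))
```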
2. Observation error ... (continued) Example 2 of observation error and natural variation Simplest possible model: spawner-recruit relationship Su and Peterman (2009, in prep.) - Used operating model to determine statistical properties of various parameter-estimation schemes: -- Bias -- Precision -- Coverage probabilities (accuracy of estimated width of probability interval for a parameter)
Test performance of an estimator: user-specified "true" underlying parameter values ("What if ...?") → operating model (simulator to test methods) → generate "observed data" from natural variation and observation error → parameters estimated → compare "true" and estimated values; repeat for 200 trials.
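A minimal sketch of this simulation-test loop: generate "observed" spawner-recruit data from known "true" parameters with both process and observation error, re-estimate, and tally relative bias and coverage over 200 trials. The ordinary least-squares Ricker fit is a stand-in estimator for illustration, not one of the methods compared by Su and Peterman, and all parameter values are assumptions.

```python
# Operating-model test of an estimator: 200 trials of simulate -> observe ->
# estimate -> compare, summarizing relative bias and 95% interval coverage.
import numpy as np

def one_trial(rng, a_true=2.0, b_true=1.0, sig_proc=0.4, sig_obs=0.3, years=40):
    s_true = rng.uniform(0.2, 1.5, years)
    r_true = s_true * np.exp(a_true - b_true * s_true + rng.normal(0, sig_proc, years))
    # Observation error on both spawners and recruits
    s_obs = s_true * np.exp(rng.normal(0, sig_obs, years))
    r_obs = r_true * np.exp(rng.normal(0, sig_obs, years))
    # Standard Ricker fit: regress log(R/S) on S
    y = np.log(r_obs / s_obs)
    X = np.column_stack([np.ones(years), -s_obs])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    a_hat = coef[0]
    # Approximate 95% interval for a_hat from the regression
    resid = y - X @ coef
    se = np.sqrt(resid @ resid / (years - 2) * np.linalg.inv(X.T @ X)[0, 0])
    covered = abs(a_hat - a_true) <= 1.96 * se
    return a_hat, covered

rng = np.random.default_rng(7)
trials = [one_trial(rng) for _ in range(200)]
a_hats = np.array([t[0] for t in trials])
coverage = np.mean([t[1] for t in trials])
print(f"relative bias in a: {100 * (a_hats.mean() - 2.0) / 2.0:.1f}%")
print(f"actual coverage of nominal 95% intervals: {coverage:.2f}")
```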
[Figure: % relative bias in the estimated Ricker productivity parameter (true value = 2) for four estimation methods — extended Kalman filter, errors-in-variables, Bayesian state-space, standard Ricker — under low, variable, and high harvest-rate histories, plotted against the proportion of total variance due to measurement error] • Results also change with the true parameter value
Results for 95% coverage probabilities - Uncertainty in the estimated Ricker productivity parameter is too narrow (overconfident) for all 4 estimation methods - Trade-off between bias and variance (Adkison 2009, Ecol. Applic. 19:198)
Recommendation • Test parameter estimation methods before applying them (Hilborn and Walters 1992) • Use results with humility, caution - Parameter estimates for ecosystem models may inadvertently be quite biased!
Sources of uncertainty 1. Natural variability 2. Observation error 3. Unclear structure of fishery system 4. Outcome uncertainty 5. Inadequate communication
What scientists have done to deal with ... 3. Unclear structure of fishery system 1. Choose single "best" model among alternatives 1a. Informally 1b. Formally using a model selection criterion (AICc) Caution!! - Not appropriate for giving management advice - Asymmetric loss functions (Walters and Martell 2004, p. 101)
Asymmetric loss: Which case is preferred? [Figure: SSB / SSBmsy for species A and B under Case 1 and Case 2]
[Figure: Fraser River Early Stuart sockeye salmon — best "management-adjustment" model (H, T, Q, T+Q) as a function of the preference ratio, ranging from asymmetric with the spawning objective favoured, through symmetric, to asymmetric with the harvest objective favoured. Cummings (2009)] Recommendation • To develop appropriate indicators, ecosystem scientists should understand asymmetry in managers' objectives, especially given many species.
What scientists have done to deal with ... 3. Unclear structure of fishery system 1. Choose single "best" model among alternatives ... ... 1c. Adaptive management experiment - Sainsbury et al. in Australia More commonly, we have to consider a range of alternative models ...
3. Unclear structure of fishery system ... (cont'd.) 2. Retain multiple models; conduct sensitivity analyses 2a. Analyze separately [Figure: Eastern Scotian Shelf cod (fishery closed in mid-1990s) — SSB (thousands of tonnes) estimated under alternative M values and model structures: VPA, stock synthesis, delay-difference. (R. Mohn 2009)]
3. Unclear structure of fishery system ... (cont'd.) 2. Retain multiple models; conduct sensitivity analyses 2a. Analyze separately 2b. Combine predictions from alternative models - Unweighted model averaging - Weighted with AIC weights or posterior probabilities, then calculate expected values of indicators • But weighting assumes managers use expected-value objectives - Many use mini-max objectives (i.e., choose action with lowest chance of worst-case outcome); a sketch follows the next figure
[Figure: probability distribution of SSB/SSBtarget under management action A, showing the limit reference point, the expected SSB (weighted average), and the worst-case outcome (unlikely, but choose the action with the lowest probability of it)]
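A minimal sketch of combining predictions from alternative models, as referenced above: Akaike (AICc) weights, a weighted-average (expected-value) indicator, and a mini-max style comparison across management actions. All AICc values and predicted SSB ratios are illustrative assumptions.

```python
# Combine predictions from alternative models: AICc weights, an expected-value
# (weighted average) indicator, and a mini-max style worst-case comparison.
import numpy as np

def aicc_weights(aicc):
    """Akaike weights from a vector of AICc values."""
    delta = np.asarray(aicc) - np.min(aicc)
    w = np.exp(-0.5 * delta)
    return w / w.sum()

# Three alternative models, two candidate management actions (A and B).
aicc = [210.3, 212.1, 215.8]
w = aicc_weights(aicc)

# Predicted SSB / SSB_target under each model for each action.
ssb_pred = np.array([[0.9, 1.2],    # model 1: action A, action B
                     [0.6, 1.0],    # model 2
                     [0.3, 0.8]])   # model 3

expected = w @ ssb_pred             # expected-value criterion (weighted average)
worst_case = ssb_pred.min(axis=0)   # mini-max style: worst outcome per action

print("AICc weights:", np.round(w, 2))
print("expected SSB ratio by action:", np.round(expected, 2))
print("worst-case SSB ratio by action:", np.round(worst_case, 2))
# An expected-value manager compares `expected`; a mini-max manager picks the
# action whose worst-case value is least bad.
```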
Recommendation • Ecosystem scientists should work iteratively with managers to find the most useful indicators to reflect management objectives.
3. Unclear structure of fishery system ... (cont'd.) 2. Retain multiple models; conduct sensitivity analyses ... ... 2c. Evaluate alternative ecosystem assessment models by using an operating model to determine their statistical properties (e.g., Fulton et al. 2005 re: community indicators)