Adaptive management – motivation and principles An overview for the Minnesota Grasslands Management Workshop (FWS-USGS Adaptive Management Consultancy) Clint Moore, USGS Patuxent Wildlife Research Center
Organization of presentation • Wildlife management is decision making • A management case study: prairie restoration • Customary approaches to management • An adaptive approach • How do the approaches compare? • Criteria of all AM applications
Wildlife management is decision making • Populations, habitats, people • Almost always under uncertainty • But whose wildlife training included principles of formal decision making?
Uncertainty doesn't make life (decision making) easier! • Partial Observability – inability to accurately see or measure the system • Partial Controllability – indirect control; the realized action differs from the intended one • Environmental Variation – "randomness" around the expected mean response • Structural Uncertainty – system behavior is unknown or disputed • [Slide graphic labels: Structured Decision Making; Adaptive Management]
Adaptive Management – a management / science partnership • The usual relationship: science provides information; management acts on it • No further interaction beyond this transfer of information • … and why is this a problem? • When the decision is not properly structured, it often leads to misdirection and management paralysis • Displacement behavior: a perpetual call for "more information", "more research", "more monitoring", which bolsters the public view of science as a never-ending and mostly useless exercise • AM integrates science and management • Science helps predict how the system will respond to actions • The information focus is on what is needed to reduce uncertainty
Prairie restoration case study • Objective: achieve as much annual growth as possible for a target forb (through reduction of competition) on a restoration area • Do this in a cost-effective way each spring • Two decision alternatives: mowing or burning • [Diagram: timeline of annual decisions – this year (0), next year (+1), +2, +3, +4, …]
Prairie restoration case study • Uncertainty about treatment • Both treatments are known to be effective, but is burning any more effective than mowing? (Structural Uncertainty) • Is the average effect 10% better than mowing? 20%? 0%? • Whatever the average difference, the actual difference in any one year is unpredictable (Environmental Variation) • What decision should be made? • If cost were not an issue, we would prefer to burn every time • Burning provides additional ecological benefits not provided by mowing • But cost is an issue • Burning is far more expensive than mowing
Customary approaches to management under uncertainty • Strategies that sidestep uncertainty • Assertion: uncertainty doesn't exist • Uncertainty judged inconsequential: uncertainty exists, but we decide it isn't meaningful in the context of the decision • Risk-averse decision making: uncertainty exists, but the decision is chosen to minimize the chance of the worst possible outcome • Risks: really bad decisions; controversy about the decision process (inquiries, litigation)
Customary approaches to management under uncertainty • Trial and error: try something and see how it works • Outcome is… • Favorable – repeat the decision next time • Unfavorable – try something else
[Diagram: trial and error as a decision tree. This year: try both treatments. If the outcome is that burning is better than mowing, burning is established as the "best" option for next year; if burning is not better than mowing, mowing is established as the "best" option for next year.]
Customary approaches to management under uncertainty • Trial and error: try something and see how it works • Outcome is… • Favorable – repeat the decision next time • Unfavorable – try something else • Learning is informal and accidental, even illusory: • Hard to make sense of chance events that obscure outcome • No contingency for ever challenging a “best” (traditional) decision • No means of reconciling contradictory experiences
Customary approaches to management under uncertainty • Experimentation • Actions designed to resolve uncertainty as quickly as possible
[Diagram: experimental comparison of treatments across Year 1, Year 2, and Year 3]
Customary approaches to management under uncertainty • Experimentation • Actions designed to resolve uncertainty as quickly as possible • Direct focus is on resolving uncertainty, not improving management • Experimentation may be costly, impractical, or infeasible • Management returns may be put on hold while experiment is conducted • Quick reduction of uncertainty may impose too much risk to resource
An adaptive approach • Designed to… • Indicate good decisions in face of uncertainty • Make use of decision outcomes to reduce uncertainty • Requires… • Objective statement • Set of decision alternatives • Competing, predictive models of decision outcome • Models link decisions, outcomes, and objective • To describe uncertainty & provide basis for reducing it • Measures of confidence on each model • Reflects current degree of influence on decision by each model • Program to monitor response • To update confidence measures (reduce uncertainty)
What do we want out of management?The objective statement • A subjective value placed on each outcome of each decision (e.g., scale of 0 – 10)
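As a concrete illustration of what such an objective statement might look like when written down, here is a minimal sketch in Python; the two actions come from the case study, but every outcome label and value is hypothetical.

```python
# Hypothetical objective statement for the prairie restoration example:
# each (action, outcome) pair is assigned a subjective value on a 0-10 scale,
# reflecting both the forb response and the cost of the treatment.
objective_values = {
    ("mow",  "low forb growth"):  2,   # cheap, but little benefit
    ("mow",  "high forb growth"): 8,   # cheap and effective
    ("burn", "low forb growth"):  0,   # expensive and ineffective
    ("burn", "high forb growth"): 6,   # effective, but the extra cost lowers its score
}
```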
What are we uncertain about? The role of competing predictive models
What are we uncertain about? The role of competing predictive models • AM requires specifying alternative, plausible models • They serve as alternative hypotheses about management • They reflect the breadth of uncertainty about management among decision makers and stakeholders (i.e., "bounding uncertainty") • Inclusive feature of AM: stakeholder beliefs are admitted and then evaluated on a level, transparent playing field • But AM doesn't require that one model be selected or declared a "winner" • Instead, models gain and lose influence over time
How do we measure uncertainty? Model confidence weights • Numbers (proportions adding to 1.0) are assigned to each model • Example: • We believe that chances are about 50/50 that burning is any better than mowing • If burning is better than mowing, we suppose chances are 2:1 that the improvement is only moderate (i.e., 10% better) • Possible initial assignment of confidence weights: • Model 1 (no difference) 0.50 • Model 2 (burning 10% better) 0.33 • Model 3 (burning 20% better) 0.17
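A minimal sketch of the arithmetic behind those three weights, using only the beliefs stated above (50/50 that burning is better at all, and 2:1 that any improvement is only moderate):

```python
p_burning_better = 0.5           # 50/50 that burning is better than mowing at all
p_moderate_if_better = 2 / 3     # 2:1 odds that any improvement is only moderate (~10%)

w_model1 = 1 - p_burning_better                           # no difference:      0.50
w_model2 = p_burning_better * p_moderate_if_better        # burning 10% better: ~0.33
w_model3 = p_burning_better * (1 - p_moderate_if_better)  # burning 20% better: ~0.17

print(round(w_model1, 2), round(w_model2, 2), round(w_model3, 2))  # 0.5 0.33 0.17
```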
How do we measure uncertainty? Model confidence weights • The best decision under uncertainty emerges when confidence weights are combined with objective values, as sketched below • Weights of (0.50, 0.33, 0.17) favor the mowing decision but do not exclude the burning decision • Other weight assignments could be chosen • Each choice influences how likely each action is to be chosen, or how often each action is represented
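A minimal sketch of how weights and objective values combine into a decision: each model implies an objective value for each action, and the preferred action is the one with the highest weight-averaged value. The weights are the ones above; the model-specific values are hypothetical illustration numbers.

```python
# Model confidence weights (proportions summing to 1.0).
weights = {"no_diff": 0.50, "burn_10pct": 0.33, "burn_20pct": 0.17}

# Hypothetical objective value (0-10 scale) of each action under each model,
# combining predicted forb response with treatment cost.
values = {
    "mow":  {"no_diff": 6.0, "burn_10pct": 6.0, "burn_20pct": 6.0},
    "burn": {"no_diff": 4.0, "burn_10pct": 6.5, "burn_20pct": 9.0},
}

def expected_value(action):
    """Objective value of an action, averaged over the models by their weights."""
    return sum(weights[m] * values[action][m] for m in weights)

for action in values:
    print(action, round(expected_value(action), 2))   # mow: 6.0, burn: roughly 5.7
print("preferred:", max(values, key=expected_value))  # mowing favored, as stated above
```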
How do we gain knowledge and adapt? The monitoring program • Following application of treatments, collect data on the response
How do we gain knowledge and adapt? The monitoring program • Using a simple probability formula (Bayes' rule), model weights are updated based on the support the data provide for each alternative model • Observed difference in means: burning 12% greater than mowing (95% CI: −6% to 25%)
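The updating step works by multiplying each model's prior weight by the likelihood of the observed response under that model's prediction, then renormalizing the products to sum to 1. A minimal sketch, assuming a normal likelihood; the standard deviation here is a hypothetical value chosen to be roughly consistent with the width of the confidence interval above.

```python
import math

def normal_pdf(x, mean, sd):
    """Normal density, used as the likelihood of the observed response under a model."""
    return math.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

# Prior weights and each model's predicted burning-vs-mowing difference (in %).
prior     = {"no_diff": 0.50, "burn_10pct": 0.33, "burn_20pct": 0.17}
predicted = {"no_diff": 0.0,  "burn_10pct": 10.0, "burn_20pct": 20.0}

observed_diff = 12.0   # burning observed 12% greater than mowing this year
obs_sd = 8.0           # hypothetical standard error of the observed difference

# Bayes' rule: posterior weight is proportional to prior weight x likelihood.
unnormalized = {m: prior[m] * normal_pdf(observed_diff, predicted[m], obs_sd) for m in prior}
total = sum(unnormalized.values())
posterior = {m: round(unnormalized[m] / total, 2) for m in prior}

print(posterior)   # weight shifts away from the "no difference" model
```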
How do we gain knowledge and adapt? The monitoring program • [Diagram: this year vs. next year following the monitoring update]
What happens next? • The cycle of decision making, prediction, data collection, and updating is continued • Management "adapts" as information is collected and knowledge is gained (see the sketch below) • Possible improvements for this example • Incorporate measurement of a "state variable" (e.g., soil moisture) to make smarter judgments about use of fire vs. mowing (greater control over environmental variation) • Implement at multiple sites to increase experience with treatments over broader conditions (greater control over environmental variation) • Incorporate objectives other than vegetation growth and cost
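A schematic sketch of that cycle, repeated over several years: note which model currently carries the most weight, monitor a noisy burning-vs-mowing response, and update the weights with the same Bayes'-rule step as above. The "true" difference, the noise level, and the function names are hypothetical illustration choices, not part of the original example.

```python
import math
import random

def update_weights(weights, predicted, observed, obs_sd):
    """One Bayes'-rule update of the model confidence weights (normal likelihood)."""
    unnorm = {m: w * math.exp(-0.5 * ((observed - predicted[m]) / obs_sd) ** 2)
              for m, w in weights.items()}
    total = sum(unnorm.values())
    return {m: v / total for m, v in unnorm.items()}

def run_cycle(n_years=5, true_diff=12.0, obs_sd=8.0, seed=1):
    """Schematic adaptive-management cycle: decide, monitor, update, repeat."""
    random.seed(seed)
    weights   = {"no_diff": 0.50, "burn_10pct": 0.33, "burn_20pct": 0.17}
    predicted = {"no_diff": 0.0,  "burn_10pct": 10.0, "burn_20pct": 20.0}
    for year in range(1, n_years + 1):
        favored = max(weights, key=weights.get)     # model currently carrying most weight
        observed = random.gauss(true_diff, obs_sd)  # monitored burning-vs-mowing difference
        weights = update_weights(weights, predicted, observed, obs_sd)
        print(f"Year {year}: favored model = {favored}, observed diff = {observed:5.1f}, "
              + ", ".join(f"{m}: {w:.2f}" for m, w in weights.items()))
    return weights

run_cycle()
```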
AM compared to customary management • Trial-and-error • AM puts in place a decision and learning structure that • Is transparent • Resolves ambiguous or contradictory decision outcomes • Accommodates unexpected outcomes, surprises • Provides a formal record of management • Experimentation • AM maintains focus on management objectives • Decisions chosen to maximize objectives, not merely to return information • Arbitrary "significance" thresholds are not required (nor are they desired) under AM: AM can proceed in cases where the experiment returns an ambiguous "not significant" outcome • But, most effective when combined with good science design: • Randomization, replication, control
Some applications of adaptive management • Adaptive Harvest Management of waterfowl • Objective: Maximize cumulative harvest • Principal uncertainties: Population response to harvest; relationship between regulations and harvest rates • Monitoring data: Numbers of breeding waterfowl and habitat condition in spring • Pine harvest management for the red-cockaded woodpecker (RCW) • Objective: Maintain a supply of old-growth forest through timber harvest • Principal uncertainty: Rates of pine succession to hardwood • Monitoring data: Forest composition in pine age classes and in hardwood • R5 Impoundment Study • Objective: Create seasonal wetland habitat for migrating shorebirds • Principal uncertainty: Effects of drawdown timing and rate of drying on bird use • Monitoring data: Pond hydrography, vegetation, bird abundance
Criteria of all AM applications • A sequential decision must be made • Affecting a single resource or applied to multiple units • Series of one-time decisions, e.g., restoration projects
Making a sequential decision • Situation 1: Control of a dynamic resource • Single population: harvests of deer, releases of condors • Multiple units: prescribed burning of forest compartments • [Diagram: repeating Population → Decision cycle through time]
Making a sequential decision • Situation 2: Series of replicated, one-time decisions • Examples: dam removals, mine restorations • [Diagram: one-time decisions applied in sequence at Sites A through G through time]
Criteria of all AM applications • A sequential decision must be made • A clear, measurable objective is (or can be) stated • Manager is faced with real decision alternatives • None that are politically or practically implausible • Decisions aren't just "tweaks" of a default action • A key uncertainty stands in the way • Litmus test: If I knew the true state of things, would it make a difference in the action I take? • A way to predict outcomes for different actions • Each hypothesis represented by a unique model • A way to test those predictions • A focused monitoring program can be put in place
A few references • Adaptive Management Guidebook for the Department of Interior (2007) • Nichols and Williams (2006) Trends in Ecology and Evolution 21:668-673 • Gregory et al. (2006) Ecological Applications 16:2411-2425 • Schreiber et al. (2004) Ecological Management and Restoration 5:177-182 • Williams et al. (2002) Analysis and Management of Animal Populations (Academic Press) • Walters (1986) Adaptive Management of Renewable Resources (McGraw-Hill)