Environmental Management: Risk Assessment, Multi-Criteria Decision Analysis, and Adaptive Management Techniques for Addressing Model Uncertainty and Reliability. Igor Linkov & Pat Deliman US Army Engineer Research and Development Center Environmental Laboratory

Presentation Transcript


  1. Environmental Management: Risk Assessment, Multi-Criteria Decision Analysis, and Adaptive Management Techniques for Addressing Model Uncertainty and Reliability Igor Linkov & Pat Deliman US Army Engineer Research and Development Center Environmental Laboratory Igor.Linkov@usace.army.mil, Phone: 617-233-9869 Patrick.N.Deliman@usace.army.mil, Phone: 601-634-3623

  2. Adaptive Risk-Based Planning

  3. Risk-based Planning: Top-down Drivers. Agencies need to relate their responses to mission goals and track progress and performance. • The Government Performance and Results Act (GPRA) • “provide for the establishment of strategic planning and performance measurement in the Federal Government” (OMB, 1993) • embodied a push for better planning, greater accountability, and straightforward performance evaluation in government • OMB Program Assessment Rating Tool (PART) • rates the performance of a program through a series of yes/no questions • scores in four primary areas: program purpose & design, strategic planning, management, and results & accountability • the performance metrics used by the program are essential to PART.

  4. Risk-based Planning: Bottom-up Drivers. Local communities need to understand actions by the Agencies and would like to see their values accounted for. • For stakeholders, the root issue is fear of becoming a victim of (uncompensated) loss • Layperson: Risk = Hazard x Perception • Expert: Risk = Hazard x Exposure x Consequence • Core concerns tend to be trust, control, process, information, and timing.

  5. Coastal Louisiana Restoration Planning: What Questions Are We Trying to Answer? • For taking action at varying scales, what is the cost and risk reduction? • For taking action at varying scales, what are the adverse impacts to significant resources? • How do you decide what actions to take? • How much information is necessary to make decisions? • What are the flood and storm threats to coastal Louisiana/Mississippi? • What do we have to lose, and how vulnerable are we? • What should be our planning timeframe? • What can be done to reduce risks for our planning timeframe? • How are the public, agencies, and others involved? Risk and/or uncertainty elements are present in almost every question. After E. Russo

  6. Information and Planning/Decision Cycles. Information gathering and decision-making are two separate cycles in environmental management. Modeling, software, GIS, and similar tools are a technology-based fix of the Information Age (after Roman, 1996). Integration of the two cycles calls for revolutionary changes.

  7. Main Points • Risks and benefits associated with alternative management strategies are difficult to quantify. • Model, parameter, and scenario uncertainty and variability associated with predicting the efficiency of management options, as well as stakeholder value judgments, are important to consider. • The challenges of risk assessment and planning for situations with a limited knowledge base and high uncertainty and variability require coupling traditional risk assessment and planning with multi-criteria decision analysis (MCDA) to support regulatory decision-making.

  8. Presentation Overview • Risk-based Planning: Top-down and Bottom-up Drivers • Risk and Uncertainty • Traditional Way of Dealing with Uncertainty • Need for Formal Decision Analysis • MCDA Summary • Examples: • MCDA Use to Select Performance Metrics for Oil Spill Response Planning • RA/MCDA Application for Sediment Management • Conclusion • References

  9. Current Decision-Making Processes (diagram). Tools such as risk analysis, modeling/monitoring, cost or benefit estimates, and stakeholders’ opinions feed the decision-maker(s) through an ad hoc process. Open questions: Quantitative or qualitative? Include or exclude? Detailed or vague? Certain or uncertain? Consensus or fragmented? Iterative? Rigid or unstructured? Challenge: multiple and uncertain criteria.

  10. Challenges to Complex Decision-Making • “Humans are quite bad at making complex, unaided decisions” (Slovic et al., 1977). • Individuals respond to complex challenges by using intuition and/or personal experience to find the easiest solution. • At best, groups can do about as well as a well-informed individual if the group has some natural systems thinkers within it. • Groups can devolve into entrenched positions resistant to compromise. • “There is a temptation to think that honesty and common sense will suffice” (IWR Drought Study, p. vi).

  11. Problem: Model Uncertainty • Differences in model structure resulting from: • model objectives • computational capabilities • data availability • knowledge and technical expertise of the group • Can be addressed by: • considering alternative model structures • weighting and combining models • eliciting expert judgment. Mechanistic models for environmental risk assessment are very uncertain, and expert judgment is required.
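As a minimal illustration of the "weighting and combining models" option above, the sketch below averages predictions from three hypothetical alternative model structures using assumed weights (for example, from expert elicitation). The model functions and the weights are placeholders, not anything taken from the slides.

```python
# Minimal sketch: combine predictions from alternative model structures with
# weights. The three "models" and the weights are hypothetical placeholders.

def linear_uptake(dose):          # hypothetical model structure 1
    return 0.8 * dose

def saturating_uptake(dose):      # hypothetical model structure 2
    return 5.0 * dose / (1.0 + dose)

def threshold_uptake(dose):       # hypothetical model structure 3
    return max(0.0, 0.9 * (dose - 0.5))

models = [linear_uptake, saturating_uptake, threshold_uptake]
weights = [0.5, 0.3, 0.2]         # assumed expert weights (sum to 1)

def combined_prediction(dose):
    """Weighted average of the alternative models' predictions."""
    return sum(w * m(dose) for w, m in zip(weights, models))

for dose in (0.1, 1.0, 10.0):
    print(dose, round(combined_prediction(dose), 3))
```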

  12. Problem: Parameter Uncertainty • Uncertainty and variability in model parameters resulting from: • data availability • expert judgment • empirical distributions • Can be addressed by: • probabilistic simulation (Monte Carlo) • analytical techniques (uncertainty propagation) • expert estimates. Many parameters and factors important for risk assessment are not well known; reported ranges are large and often unquantifiable.
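A minimal sketch of the Monte Carlo option listed above: assumed distributions for the parameters of a simple, hypothetical exposure model (dose = concentration x intake rate / body weight) are sampled and propagated to a dose distribution. All distributions and values are illustrative, not taken from the slides.

```python
import random

# Monte Carlo sketch for parameter uncertainty: propagate assumed parameter
# distributions through a simple hypothetical exposure model
#   dose = concentration * intake_rate / body_weight
# All distributions and values below are illustrative.

random.seed(1)
N = 10_000

def sample_dose():
    concentration = random.lognormvariate(mu=0.0, sigma=0.5)      # mg/kg
    intake_rate = random.triangular(low=0.1, high=0.5, mode=0.2)  # kg/day
    body_weight = random.gauss(mu=70.0, sigma=10.0)               # kg
    return concentration * intake_rate / body_weight              # mg/kg-day

doses = sorted(sample_dose() for _ in range(N))
mean = sum(doses) / N
p05, p95 = doses[int(0.05 * N)], doses[int(0.95 * N)]
print(f"mean dose = {mean:.4f} mg/kg-day, 5th-95th percentile = {p05:.4f}-{p95:.4f}")
```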

  13. Problem: “Modeler/Scenario Uncertainty” (subjective interpretation of the problem at hand). What is the relative influence of modeler perception on model predictions?


  15. Multi-Criteria Decision Analysis and Tools • Multi-Criteria Decision Analysis (MCDA) methods: • Evolved as a response to the observed inability of people to effectively analyze multiple streams of dissimilar information • Many different MCDA approaches based on different theoretical foundations (or combinations) • MCDA methods provide a means of integrating various inputs with stakeholder/technical expert values • MCDA methods provide a means of communicating model/monitoring outputs for regulation, planning and stakeholder understanding • Risk-based MCDA offers an approach for organizing and integrating varied types of information to perform rankings and to better inform decisions

  16. Evolving Decision-Making Processes: Decision Analytical Frameworks • Agency-relevant / stakeholder-selected • Currently available software • Variety of structuring techniques • Iteration/reflection encouraged • Identify areas for discussion/compromise. Diagram: risk analysis, modeling/monitoring, cost, and stakeholders’ opinions are brought together through a decision integration tool, sharing data, concepts, and opinions with the decision-maker(s).

  17. Simplified Decision Matrix

  18. Example Decision Matrix. How to combine these criteria? How to compare these alternatives? How to interpret these results?
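One common answer to "how to combine these criteria?" is a weighted sum over a normalized decision matrix. The sketch below uses hypothetical alternatives, criterion values, and weights; it is only one of the MCDA aggregation schemes referred to in this deck (others, such as outranking methods, aggregate differently).

```python
# Weighted-sum sketch for combining criteria in a decision matrix.
# Alternatives, criterion values, and weights are hypothetical placeholders.
# Cost-type criteria (lower is better) are flipped during normalization.

alternatives = {
    "Option A": {"cost": 30.0, "risk_reduction": 0.7, "habitat_impact": 12.0},
    "Option B": {"cost": 55.0, "risk_reduction": 0.9, "habitat_impact": 4.0},
    "Option C": {"cost": 10.0, "risk_reduction": 0.4, "habitat_impact": 20.0},
}
weights = {"cost": 0.4, "risk_reduction": 0.4, "habitat_impact": 0.2}
lower_is_better = {"cost", "habitat_impact"}

def normalize(criterion, value):
    """Scale a criterion value to 0-1, with 1 always the preferred end."""
    vals = [a[criterion] for a in alternatives.values()]
    lo, hi = min(vals), max(vals)
    score = (value - lo) / (hi - lo) if hi > lo else 1.0
    return 1.0 - score if criterion in lower_is_better else score

scores = {
    name: sum(weights[c] * normalize(c, v) for c, v in criteria.items())
    for name, criteria in alternatives.items()
}
for name, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {s:.2f}")
```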

  19. Decision Analysis Methods and Tools

  20. Linking RA, AM, and MCDA (diagram): an adaptive management cycle (problems, alternatives, criteria, evaluation, decision) in which risk assessment and MCDA feed each other through the decision matrix, criteria weights, and synthesis of results that support the decision.
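A pseudocode-style sketch of that loop, under the assumption of a simple weighted-sum MCDA step; every function body and value below is a hypothetical placeholder, not the framework itself.

```python
# Sketch of the RA / MCDA / adaptive-management loop from the diagram above.
# All data and function bodies are hypothetical placeholders.

def risk_assessment(alternative):
    """Placeholder RA: return criterion scores for one management alternative."""
    table = {
        "no action":          {"risk": 9.0, "cost": 1.0},
        "capping":            {"risk": 4.0, "cost": 5.0},
        "dredge and dispose": {"risk": 2.0, "cost": 9.0},
    }
    return table[alternative]

def mcda_rank(decision_matrix, weights):
    """Placeholder MCDA: weighted sum where lower risk and cost are better."""
    def score(alt):
        return -sum(weights[c] * v for c, v in decision_matrix[alt].items())
    return sorted(decision_matrix, key=score, reverse=True)

alternatives = ["no action", "capping", "dredge and dispose"]
weights = {"risk": 0.6, "cost": 0.4}

for cycle in range(3):  # placeholder for "repeat until monitoring says stop"
    matrix = {alt: risk_assessment(alt) for alt in alternatives}  # RA fills the decision matrix
    ranking = mcda_rank(matrix, weights)                          # MCDA feeds the decision
    print(f"cycle {cycle}: preferred alternative = {ranking[0]}")
    # In practice, monitoring of the implemented decision would revise the
    # models, criteria, and weights before the next cycle.
```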

  21. Risk Informed Decision Framework: Restoration Planning for Coastal LA and MS

  22. Example 1: Performance Metrics for Oil Spill Response Planning* • Framework for selecting metrics • Multiple stakeholders • Agencies (federal, state, local) • Responsible parties • Local residents • NGOs (business, environmental, etc.) • Integrate deliberation and science to link goals, objectives, metrics, and measures • Compatible with existing planning, decision-making, and assessment processes • Completed as part of preparedness planning *Based on Linkov, Seager, Figueira, Tkachuk, Levchenko, Trevonnen (2007); funding provided by NOAA through CRRC, UNH.

  23. Examples (oil spill response) • Endpoint • Miles of shoreline impacted or cleaned vs. areas protected (e.g., by redirecting or containing oil). • Number of fish, birds, or other wildlife killed or injured (per unit search area). • Number of “appropriate” (not exotic) animals rehabilitated and released. • Degree of change to beaches and sandbars from clean-up actions. • Types of animals and vegetation present after spill cleanup. • Process • Did getting required permits delay response action? • Rate of bird handling at the rehabilitation center. • Time to deploy booming and double-booming in sensitive areas. • Resource • Amount of oil containment boom deployed. • Number of volunteers deployed. • Number of sandbags deployed.

  24. Challenges • Challenges to defining “good” metrics • What is most relevant may be very difficult to measure. • Metrics may be indirect measures of what people really care about. • What is easy to measure may not be relevant to what people care about. • There can be disagreements about thresholds to differentiate “good” versus “bad.” • Accuracy and reliability of data recording is a challenge. • Paucity of baseline data. • Timing of measurement can affect assessment of performance. • Difficult to communicate to the public – or at least that is managers’ perception. • Weightings and aggregation.

  25. Characteristics of Good Measures • scientifically verifiable • cost-effective • easy to communicate to a wide audience • relevant to what people care about • decision or action relevant • credible • scalable over an appropriate time period and geographic region • sensitive to change

  26. Oil Spill Response Metrics Taxonomy by Type of Information Measured

  27. Assessment Criteria

  28. Metric Assessment by Criteria

  29. Criteria Weight

  30. Rank Acceptability Analysis
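Rank acceptability analysis (the core output of SMAA-type methods) asks how often each alternative attains a given rank when the criterion weights are treated as uncertain. A minimal sketch, using hypothetical normalized scores for three of the alternatives and uniformly random weights:

```python
import random

# Sketch of rank acceptability analysis: sample criterion weights at random,
# rank the alternatives under each weight sample, and tally how often each
# alternative comes out first. The alternatives' normalized, higher-is-better
# criterion scores below are hypothetical placeholders.

random.seed(7)
scores = {
    "CAD":        {"cost": 0.8, "eco": 0.6, "human": 0.7},
    "Upland CDF": {"cost": 0.6, "eco": 0.5, "human": 0.6},
    "Landfill":   {"cost": 0.2, "eco": 1.0, "human": 0.4},
}
criteria = ["cost", "eco", "human"]
first_rank_hits = {alt: 0 for alt in scores}

N = 10_000
for _ in range(N):
    raw = [random.random() for _ in criteria]
    total = sum(raw)
    weights = {c: r / total for c, r in zip(criteria, raw)}  # random normalized weights
    best = max(scores, key=lambda alt: sum(weights[c] * scores[alt][c] for c in criteria))
    first_rank_hits[best] += 1

for alt, hits in first_rank_hits.items():
    print(f"{alt}: first-rank acceptability = {hits / N:.2f}")
```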

  31. Pairwise Metrics Domination

  32. Sensitivity Analysis (charts comparing two weighting cases: environmental relevance most important vs. cost most important).

  33. Example 2: NY/NJ Harbor

  34. Example: NY/NJ Harbor Issues • Harbor among the most polluted in the U.S. • >10⁶ yd³ fail regional criteria for ocean disposal • Existing disposal site closed 1 Sep 1997 • Proposed deepening

  35. Example Decision Methodology: Proof of Concept Study. Objectives: integrate comparative risk assessment results with cost and stakeholder decision criteria; use decision criteria/performance measures from published data and proposed costs; test decision tools, methodology, and results. Methodology: set contaminated sediment management options; set decision criteria/performance measures; software: Criterium DecisionPlus; stakeholder values / expert surveys from USACE/EPA dredged material managers meetings (New Orleans, 2004) and the SRA/USACE Contaminated Sediments Meeting (Palm Beach, 2004).

  36. Conceptual Illustration of Disposal Alternatives (diagram): No-Action, CAD Pit, Nearshore CDF, Upland CDF, Island CDF, Landfill, Cement-Lock, and Manufactured Soil. Legend: dredged material, effluent, manufactured liner, cap, standard landfill waste, water line, in-place soil, in-place sediment, dike wall. Source: Kane Driscoll, S.B., W.T. Wickwire, J.J. Cura, D.J. Vorhees, C.L. Butler, D.W. Moore, and T.S. Bridges. 2002. A comparative screening-level ecological and human health risk assessment for dredged material management alternatives in New York/New Jersey Harbor. International Journal of Human and Ecological Risk Assessment 8:603-626.

  37. Decision Criteria: NY/NJ Harbor Contaminated Sediment Management Decision. Cost: $ per cubic yard. Public Acceptance: impacted area / capacity. Ecological Health: number of complete ecological exposure pathways; largest ecological hazard quotient (HQ) calculated for any one pathway. Human Health: number of complete human exposure pathways; largest cancer risk calculated for any one pathway; estimated fish COC concentration / hazard level. Sources: NY/NJ Dredged Material Management Plan and expert opinion; Kane Driscoll et al. (2002).

  38. NY/NJ Harbor in Criterium DecisionPlus (hierarchy: goal, criteria, sub-criteria).

  39. NY/NJ Harbor in Criterium DecisionPlus (hierarchy: goal, criteria, sub-criteria, alternatives). Alternatives rating technique: SMART with linear value functions. Hierarchy rating technique: weights.

  40. Criteria Levels for Each DM Alternative (criteria grouped as Cost: $/CY; Public Acceptability: impacted area/capacity; Ecological Risk: ecological exposure pathways, ecological HQ; Human Health Risk: human exposure pathways, maximum cancer risk, fish COC/risk level). In the original slide, blue text marked the most acceptable and red text the least acceptable value for each criterion.

  Alternative        | Cost ($/CY) | Impacted Area/Capacity (acres/MCY) | Ecological Exposure Pathways | Ecological HQ | Human Exposure Pathways | Max Cancer Risk | Fish COC / Risk Level
  CAD                | 5-29        | 4400 | 23 | 680     | 18 | 2.8E-5 | 28
  Island CDF         | 25-35       | 980  | 38 | 2100    | 24 | 9.2E-5 | 92
  Near-shore CDF     | 15-25       | 6500 | 38 | 900     | 24 | 3.8E-5 | 38
  Upland CDF         | 20-25       | 6500 | 38 | 900     | 24 | 3.8E-5 | 38
  Landfill           | 29-70       | 0    | 0  | 0       | 21 | 3.2E-4 | 0
  No Action          | 0-5         | 0    | 41 | 5200    | 12 | 2.2E-4 | 220
  Cement-Lock        | 54-75       | 0    | 14 | 0.00002 | 25 | 2.0E-5 | 0
  Manufactured Soil  | 54-60       | 750  | 18 | 8.7     | 22 | 1.0E-3 | 0
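Slide 39 names the rating technique (SMART with linear value functions). Below is a rough sketch of that kind of scoring applied to three columns of the table above (cost midpoint, ecological HQ, maximum cancer risk). The equal weights are an assumption for illustration only, not the elicited USACE/EPA weights, so the resulting ranking should not be read as the study's result.

```python
# Sketch of SMART scoring with linear value functions, applied to three of the
# criteria from the table above (midpoint cost, ecological HQ, max cancer risk).
# The equal weights are an illustrative assumption, not the elicited weights.

data = {  # alternative: (cost $/CY midpoint, ecological HQ, max cancer risk)
    "CAD":               (17.0,  680,   2.8e-5),
    "Island CDF":        (30.0,  2100,  9.2e-5),
    "Near-shore CDF":    (20.0,  900,   3.8e-5),
    "Upland CDF":        (22.5,  900,   3.8e-5),
    "Landfill":          (49.5,  0,     3.2e-4),
    "No Action":         (2.5,   5200,  2.2e-4),
    "Cement-Lock":       (64.5,  2e-5,  2.0e-5),
    "Manufactured Soil": (57.0,  8.7,   1.0e-3),
}
weights = (1/3, 1/3, 1/3)  # assumed equal weights for illustration

def linear_value(x, worst, best):
    """Linear value function: 0 at the worst level, 1 at the best level."""
    return (worst - x) / (worst - best) if worst != best else 1.0

columns = list(zip(*data.values()))
bounds = [(max(col), min(col)) for col in columns]  # lower is better for all three

scores = {
    alt: sum(w * linear_value(x, worst, best)
             for w, x, (worst, best) in zip(weights, row, bounds))
    for alt, row in data.items()
}
for alt, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{alt}: {s:.2f}")
```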

  41. USACE/EPA DM Managers Meeting: NY/NJ Harbor Weighting Form. For each attribute, respondents compare a swing from its worst to its best level against a benchmark (worst case on everything) and assign a rank (1-9) and a rate (0-100); the benchmark itself is ranked 9 and rated 0.

  Benchmark (worst case on everything): impacted area/capacity of facility = 6500 acres per 10⁶ cubic yards; magnitude of ecological hazard quotient (maximum exposure) = 5200; number of complete ecological exposure pathways = 41; number of complete human exposure pathways = 25; magnitude of maximum cancer probability (non-barge worker) = 1 x 10⁻³; ratio of estimated COC concentration in fish to risk-based concentrations = 220; cost = 54-75 $/CY.

  Attribute swings to compare: impacted area/capacity from 6500 to 0 acres per 10⁶ cubic yards; ecological hazard quotient from 5200 to 0; complete ecological exposure pathways from 41 to 0; complete human exposure pathways from 25 to 12; maximum cancer probability from 1 x 10⁻³ to 0.028 x 10⁻³; fish COC ratio from 220 to 0; cost from 54-75 $/CY to 0-5 $/CY.
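In swing weighting, the 0-100 rates elicited on the form above are typically normalized to sum to one to obtain criterion weights. A minimal sketch with hypothetical rates (the real rates came from the USACE/EPA surveys summarized on the next slide):

```python
# Sketch: turn swing-weighting rates (0-100) into normalized criterion weights.
# The rates below are hypothetical; the actual values came from the USACE/EPA
# manager surveys.

swing_rates = {
    "Cost": 100,
    "Magnitude of Maximum Cancer Probability": 80,
    "Magnitude of Ecological Hazard Quotient": 70,
    "Fish COC Concentration / Risk-Based Concentration": 60,
    "Complete Human Exposure Pathways": 50,
    "Complete Ecological Exposure Pathways": 40,
    "Impacted Area / Facility Capacity": 30,
}
total = sum(swing_rates.values())
weights = {criterion: rate / total for criterion, rate in swing_rates.items()}

for criterion, w in sorted(weights.items(), key=lambda kv: -kv[1]):
    print(f"{criterion}: {w:.2f}")
```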

  42. USACE/EPA Survey Results: Criteria Weights (%)

  43. Criteria Contributions to Decision Score (charts, one panel for USACE weighting and one for EPA weighting; vertical axis 0.0-0.8; criteria shown: cost, maximum cancer probability (non-barge worker), ecological hazard quotient, estimated COC concentration in fish / risk-based concentration, complete human health exposure pathways, complete ecological exposure pathways, and ratio of impacted area to facility capacity).

  44. Solution vs. MCDA Method: Does it Matter?

  45. Risk Assessment, Adaptive Management, and MCDA: Implementation Framework

  46. Key Take-Away Points. Risk-based MCDA offers planners: reproducible and defensible management of complex, multiple criteria; a means to define and gauge what is important; balancing of expert opinion and stakeholder values; and better responses and better reporting, with opportunities to more clearly get it “out on the table.”

  47. References
  MCDA workshop sites with posted lectures:
  http://www.risktrace.com/sediments
  http://www.risktrace.com/nato
  http://www.risk-trace.com/ports/index.php
  Papers:
  Yatsalo, B., Kiker, G., Kim, J., Bridges, T., Seager, T., Gardner, K., Satterstrom, K., and Linkov, I. 2006. Application of Multi-Criteria Decision Analysis Tools for Management of Contaminated Sediments. Integrated Environmental Assessment and Management.
  Seager, T., Satterstrom, K., Linkov, I., Tuler, S., and Kay, R. 2006. Typological Review of Environmental Performance Metrics (with Illustrative Examples for Oil Spill Response). Integrated Environmental Assessment and Management.
  Linkov, I., Satterstrom, K., Kiker, G., Bridges, T., Benjamin, S., and Belluck, D. 2006. From Optimization to Adaptation: Shifting Paradigms in Environmental Management and Their Application to Remedial Decisions. Integrated Environmental Assessment and Management 2:92-98.
  Linkov, I., Satterstrom, K., Seager, T.P., Kiker, G., Bridges, T., Belluck, D., and Meyer, A. 2006. Multi-Criteria Decision Analysis: Comprehensive Decision Analysis Tool for Risk Management of Contaminated Sediments. Risk Analysis 26:61-78.
  Linkov, I., Satterstrom, K., Kiker, G., Batchelor, C., and Bridges, T. 2006. From Comparative Risk Assessment to Multi-Criteria Decision Analysis and Adaptive Management: Recent Developments and Applications. Environment International 32:1072-1093.
