
Weight of Evidence Checklist Update AoH Meeting – Seattle, WA April 25, 2006



Presentation Transcript


  1. Weight of Evidence Checklist Update AoH Meeting – Seattle, WA April 25, 2006 Joe Adlhoch - Air Resource Specialists, Inc.

  2. Review of RHR Visibility Goals • Define current conditions at each Class I area using the 2000-04 baseline period • Define “natural conditions” • Improve visibility such that the average Haze Index (measured in deciviews) for the 20% worst days in the baseline period reaches “natural conditions” by 2064 • Ensure that visibility on the 20% best days does not degrade • Periodically assess the improvement in visibility between the baseline period and 2064 and show that “reasonable progress” is being achieved
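The Haze Index used throughout these slides follows the standard deciview definition: dv = 10 ln(b_ext / 10 Mm-1), where b_ext is total light extinction and 10 Mm-1 is the particle-free Rayleigh reference. A minimal sketch (the example extinction value is illustrative, not a site measurement):

```python
import math

def haze_index_dv(b_ext_mm1: float) -> float:
    """Haze Index in deciviews (dv) from total light extinction
    in inverse megameters (Mm-1); 10 Mm-1 is the Rayleigh
    (particle-free) reference, which maps to 0 dv."""
    return 10.0 * math.log(b_ext_mm1 / 10.0)

# Illustrative hazy day with 100 Mm-1 total extinction:
print(round(haze_index_dv(100.0), 1))  # 10 ln(10) ≈ 23.0 dv
```

A useful property of the deciview scale is that equal dv steps correspond to roughly equal perceived changes in visibility, which is why the RHR glide path is drawn as a straight line in deciviews.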

  3. Draft WOE Checklist (Step 1) • Summary of available information • General Class I area information (location, size, topography, discussion of importance, etc.) • Overview summary of basic data sets: • Visibility monitoring • Emission inventories (state and WRAP summaries?) • Modeling results • Will vary according to state (e.g., no CMAQ modeling done for AK; some states have international borders) • Style will be customized by each state

  4. Draft WOE Checklist (Step 2) • Analysis of visibility conditions • What are current (baseline, 2000-04) visibility conditions? • What is the relative importance of each species? • What does the RHR glide path look like? • What are estimated natural visibility conditions? • What does the model predict for 2018?

  5. Baseline Conditions at Agua Tibia • Species contributions: • Sulfate: High • Nitrate: High • Organics: Medium • EC: Medium • CM: Medium • Soil: Low

  6. RHR Glide Path for Agua Tibia • Model results for the 2018 base case do not predict that Agua Tibia’s visibility (in deciviews) will be on or below the glide path
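The glide path itself is just a linear interpolation in deciviews from the baseline (2000-04) value to estimated natural conditions in 2064, evaluated at any intermediate year such as 2018. A minimal sketch; the baseline and natural values below are hypothetical placeholders, not Agua Tibia's actual numbers:

```python
def glide_path_dv(baseline_dv: float, natural_dv: float, year: int,
                  baseline_year: int = 2004, target_year: int = 2064) -> float:
    """Uniform-rate-of-progress glide path value (deciviews) for a
    given year: linear interpolation between the baseline-period
    value and natural conditions in 2064."""
    frac = (year - baseline_year) / (target_year - baseline_year)
    return baseline_dv + frac * (natural_dv - baseline_dv)

# Hypothetical 20%-worst-days baseline of 23.0 dv, natural 7.0 dv:
print(round(glide_path_dv(23.0, 7.0, 2018), 2))
```

Comparing the model's predicted 2018 deciview value against `glide_path_dv(..., 2018)` is the on-or-below-the-glide-path test described in the slide above.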

  7. Draft WOE Checklist (Step 3) • Analysis of visibility conditions by individual species • What do individual species glide paths (measured in extinction) look like? • Need to define natural conditions appropriately (following examples assume “annual average” natural conditions, not 20% worst) • Which species show predicted 2018 values at or below the glide path?

  8. Species Glide Paths for Agua Tibia • Symbol represents the 2018 model prediction • Nitrate, EC, and Soil follow the glide path • Sulfate, OM, and CM do not follow the glide path

  9. Draft WOE Checklist (Step 4) • Review monitoring uncertainties and model performance for each species • What level of monitoring uncertainty is associated with each species? • Lab uncertainties (can be calculated from the IMPROVE data set) • Other uncertainties (flow rate problems, clogged filters) may be difficult to quantify • How well does the model reproduce the monitoring data? • Good model performance is most important for the highest-contributing species • What does performance look like seasonally and overall?

  10. Median Laboratory Uncertainty of IMPROVE Data Across WRAP • Uncertainty is based only on lab-reported uncertainties for daily samples (2000 – 2004) • OC, EC, Soil, and CM uncertainty determined from standard propagation of error analysis on the individual component terms • Uncertainty due to flow/size-cut errors not included
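The "standard propagation of error" mentioned above, for a quantity reported as a sum of independently measured component terms (e.g., OC from its carbon fractions), is the root-sum-of-squares of the component uncertainties. A minimal sketch with made-up component values:

```python
import math

def propagated_uncertainty(component_sigmas) -> float:
    """Propagated 1-sigma uncertainty for a sum of independent
    component measurements: sqrt(sum of sigma_i squared)."""
    return math.sqrt(sum(s * s for s in component_sigmas))

# Hypothetical lab-reported uncertainties (ug/m3) for four
# carbon fractions summed into an OC value:
print(round(propagated_uncertainty([0.02, 0.03, 0.01, 0.02]), 4))
```

Note this captures only the lab-reported terms; as the slide says, flow-rate and size-cut errors are not included and would need separate treatment.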

  11. IMPROVE (top) vs. Model (bottom) How well are the seasonal variations in each species captured, even if the magnitude is off?

  12. 2002 Model Performance, Worst Days • Nitrate and carbon performance is often reasonable • Sulfate is somewhat low • CM shows very poor performance
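One common way to put numbers on a model-vs-IMPROVE comparison like this is the mean fractional bias. The statistic is standard in air-quality model evaluation; the sulfate values below are illustrative only, not the 2002 results:

```python
def mean_fractional_bias(obs, mod) -> float:
    """Mean fractional bias (%) over paired observed and modeled
    concentrations; symmetric about zero and bounded at +/-200%.
    Negative values mean the model underpredicts on average."""
    pairs = [(o, m) for o, m in zip(obs, mod) if (o + m) > 0]
    return 100.0 * sum(2.0 * (m - o) / (m + o) for o, m in pairs) / len(pairs)

# Hypothetical monthly-mean sulfate (ug/m3): model runs low,
# consistent with the "sulfate somewhat low" finding above.
improve_obs = [0.8, 1.1, 1.5, 1.2]
model_2002 = [0.6, 0.9, 1.3, 1.0]
print(round(mean_fractional_bias(improve_obs, model_2002), 1))
```

Computing the statistic by season, as the previous slide suggests, shows whether seasonal variations are captured even when the overall magnitude is off.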

  13. Draft WOE Checklist (Step 5) • Integrate information about each species: monitoring, modeling, and emissions data • Do changes in emissions agree with model predictions for 2018? • How do we know what source region of emissions to compare? • Weight emissions by back trajectory residence times for an estimate of potential emissions that might be expected to impact a given Class I area • Do weighted emissions described above support attribution results derived from PSAT and PMF?

  14. Summary Tables • Baseline extinction with lab uncertainty and variability • PSAT or PMF attribution results (Phase I TSSA shown) • Inter-annual baseline variability (Mm-1) • Baseline measurement uncertainty (Mm-1) • Predicted 2018 extinction • Sum of weighted emissions affecting the site • Natural conditions and glide path • Contributing source regions determined by back trajectory residence times

  15. Chart: Inter-Annual Baseline Variability (Mm-1) and Baseline Measurement Uncertainty (Mm-1)

  16. Chart: Inter-Annual Baseline Variability (Mm-1) and Baseline Measurement Uncertainty (Mm-1)

  17. Agua Tibia, CA • Total SO2 emissions × residence time = weighted emissions • Weighted emissions represent the most probable source-region emissions that contribute to sulfate at the selected monitoring site.
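The weighting operation described on this slide is a per-region multiply-and-sum. A minimal sketch; the region names, emission totals, and residence-time fractions are all made up for illustration:

```python
def weighted_emissions(emissions_by_region: dict, residence_time_by_region: dict) -> dict:
    """Weight each source region's emissions by the back-trajectory
    residence time over that region; regions with no trajectory
    residence contribute zero."""
    return {
        region: emis * residence_time_by_region.get(region, 0.0)
        for region, emis in emissions_by_region.items()
    }

# Hypothetical SO2 totals (tons/yr) and fractional residence times:
emis = {"CA": 100.0, "AZ": 60.0, "MX": 40.0}
rt = {"CA": 0.50, "AZ": 0.30, "MX": 0.20}
w = weighted_emissions(emis, rt)
print(w, round(sum(w.values()), 1))
```

The sum across regions is the "sum of weighted emissions affecting site" quantity that appears in the summary tables, and the per-region values suggest which source regions to compare against PSAT and PMF attribution.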

  18. – 19. DRAFT * These example charts show the sum of all WRAP region emissions

  20. – 23. VERY DRAFT * These example charts show the sum of all WRAP region emissions

  24. Draft WOE Checklist (Step 6) • Investigate specific questions that arise in steps 2 – 5 • Review historical trends (if sufficient data exist) • Review distributions of IMPROVE mass, and expected changes predicted by the model • Review natural, episodic events for their potential impact • Do the results so far make sense? If not, deeper investigation of the data sets may be required • Are there reasonable explanations for which species do and do not show progress along the glide path? • Consider the other factors mandated by the RHR to determine reasonable progress

  25. Draft WOE Checklist (Step 7) • Repeat steps 2 – 6 with emissions and model results from various control strategies • How do specific control strategies affect the outcome?

  26. Draft WOE Checklist (Step 8) • Review available attribution information and determine which states need to consult about which Class I areas • PSAT will be available for sulfate and nitrate (and possibly some portion of organics) • PMF will be available for all species (?), but may be used primarily for carbon (?) • Emissions weighted by residence times will be available for all species (pending certain sensitivity tests and caveats)
