
Evaluating uncertainty predictions using an ensemble of CMAQ model configurations

Robert W. Pinder, Alice B. Gilliland, Robert C. Gilliam, K. Wyat Appel
Atmospheric Modeling Division, NOAA Air Resources Laboratory, in partnership with USEPA National Exposure Research Laboratory


Presentation Transcript


  1. Evaluating uncertainty predictions using an ensemble of CMAQ model configurations. Robert W. Pinder, Alice B. Gilliland, Robert C. Gilliam, K. Wyat Appel. Atmospheric Modeling Division, NOAA Air Resources Laboratory, in partnership with USEPA National Exposure Research Laboratory. 2007 CMAS Conference, October 2, 2007.

  2. Objective. Uncertainty: what is the likelihood that the observed value is within a given range? Applications: exposure studies, diagnostic evaluation tools. One measure of uncertainty is the error when the model is compared with observations. Can we use an ensemble approach to make a better estimate of this range? (Figure: histogram of CMAQ O3 model error; 8-hour max O3 at 228 AQS sites in the SE US.)
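To make the error measure on this slide concrete, here is a minimal sketch of the model-error histogram, assuming arrays of modeled and observed 8-hour max O3 have already been matched at the AQS sites (both file names are hypothetical placeholders):

    import numpy as np
    import matplotlib.pyplot as plt

    # Hypothetical inputs: CMAQ 8-hour max O3 matched to AQS observations,
    # one value per site-day (ppb). Neither file name is from the talk.
    model = np.loadtxt("cmaq_o3_8hr_max.txt")
    obs = np.loadtxt("aqs_o3_8hr_max.txt")

    error = model - obs  # positive values indicate overprediction

    plt.hist(error, bins=40)
    plt.xlabel("CMAQ 8-hour max O3 error (ppb)")
    plt.ylabel("Count")
    plt.title("Histogram of CMAQ O3 model error")
    plt.show()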

  3. Sources of Uncertainty. Structural uncertainty (VOC species lumping, physical processes); approach: vary the representation. Parameter uncertainty (emissions, meteorology, chemical rate constants); approach: Monte Carlo methods. Challenge: Monte Carlo methods are not feasible given CMAQ's computational requirements.

  4. Method • Variety of CMAQ / MM5 model configurations • Direct sensitivity calculations • Use observations to remove spurious ensemble members

  5. Generate Ensemble Members for Structural Uncertainty using Multiple Model Configurations. Planetary boundary layer / land surface model options: ACM, the Asymmetric Convective Model (Pleim and Chang, 1992), with the Pleim-Xiu Land Surface Model; Mellor-Yamada-Janjic (Janjic, 1994) with the NOAH Land Surface Model; Medium Range Forecast (Hong and Pan, 1996) with the NOAH Land Surface Model. Chemical mechanism options: Carbon Bond IV and SAPRC-99. Three meteorological configurations times two chemical mechanisms gives six structural uncertainty cases.
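The six cases are just the cross product of the three meteorological configurations and the two chemical mechanisms; a trivial sketch (the labels are descriptive only, not MM5/CMAQ option names):

    from itertools import product

    # Descriptive labels for the three PBL / land-surface configurations
    # and the two chemical mechanisms; not actual MM5/CMAQ namelist values.
    met_options = ["ACM + Pleim-Xiu", "Mellor-Yamada-Janjic + NOAH", "MRF + NOAH"]
    chem_options = ["Carbon Bond IV", "SAPRC-99"]

    structural_cases = list(product(met_options, chem_options))
    for i, (met, chem) in enumerate(structural_cases, 1):
        print(f"Case {i}: {met} / {chem}")  # prints the six structural cases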

  6. Generate Ensemble Members for Parametric Uncertainty using Direct Sensitivity Calculation. Use the Decoupled Direct Method (DDM) to calculate sensitivity to NOx emissions, VOC emissions, and O3 boundary conditions, including second-order sensitivity. At each grid cell, calculate the ozone response to emissions and boundary concentrations. Compared to a brute-force calculation, errors are 5-10% (Cohan et al., 2005).
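With the DDM coefficients in hand, a perturbed O3 field can be estimated without re-running CMAQ via a Taylor expansion in the sensitivities. A minimal sketch, assuming first-order sensitivity arrays for NOx, VOC, and boundary conditions plus a second-order NOx term on the model grid (the exact set of expansion terms used in the study is an assumption here):

    import numpy as np

    def perturbed_o3(o3_base, s_nox, s_voc, s_bc, s_nox2,
                     eps_nox, eps_voc, eps_bc):
        """Estimate the O3 field for fractional perturbations eps_* using
        DDM sensitivities (all arrays share the model-grid shape). The
        second-order NOx term is included as a sketch of how higher-order
        sensitivities enter; the study's exact expansion is not shown on
        the slide."""
        return (o3_base
                + eps_nox * s_nox               # first-order NOx emissions term
                + eps_voc * s_voc               # first-order VOC emissions term
                + eps_bc * s_bc                 # first-order O3 boundary term
                + 0.5 * eps_nox ** 2 * s_nox2)  # second-order NOx correction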

  7. Direct Calculation of Ozone Sensitivity, July 16, 2002, 2 PM. (Figure: map of the calculated ozone sensitivity; color scale in ppb O3, from -5 to >30.)

  8. Use Observations to Constrain the Ensemble. (Figure: map of AQS O3 monitoring sites; some sites are used to evaluate boundary conditions, others to evaluate ensemble quality.)

  9. Ensemble generation: (1) run the six structural uncertainty simulations; (2) use DDM to calculate O3 sensitivity to NOx, VOC, and boundary conditions; (3) randomly sample from the range of uncertain NOx emissions, VOC emissions, boundary concentrations, and structural uncertainty cases; (4) generate an ensemble member by calculating the O3 field across the SE US domain; (5) repeat 200 times; (6) use observations to remove spurious ensemble members. A sketch of this loop follows below.
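A minimal sketch of steps (3)-(5), building on the hypothetical perturbed_o3 above; the uniform ±30% sampling ranges are illustrative stand-ins, since the actual uncertainty distributions are not given on the slide:

    import numpy as np

    rng = np.random.default_rng(0)
    N_MEMBERS = 200

    # o3_base, s_nox, s_voc, s_bc, s_nox2 are assumed to be lists holding
    # the base O3 field and DDM sensitivities for each of the 6 structural
    # cases (one CMAQ/DDM run per case).
    ensemble = []
    for _ in range(N_MEMBERS):
        case = rng.integers(6)             # pick one of the 6 structural cases
        eps_nox = rng.uniform(-0.3, 0.3)   # NOx emissions perturbation
        eps_voc = rng.uniform(-0.3, 0.3)   # VOC emissions perturbation
        eps_bc = rng.uniform(-0.3, 0.3)    # O3 boundary-condition perturbation
        member = perturbed_o3(o3_base[case], s_nox[case], s_voc[case],
                              s_bc[case], s_nox2[case],
                              eps_nox, eps_voc, eps_bc)
        ensemble.append(member)
    ensemble = np.array(ensemble)  # 200 O3 fields across the SE US domain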

  10. Example: Atlanta, Georgia, July 1-28, 2002.

  11. Structural Uncertainty

  12. Structural + Parametric Uncertainty. The spread is large: can we use the observations to narrow this range?

  13. Prune ensemble members that are not consistent with the observations, and remove ensemble members that do not constrain the range (see the sketch below).
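One plausible reading of the two pruning rules, sketched below: first drop members whose error against the constraint observations exceeds a tolerance, then keep only the few survivors that actually define the envelope (the tolerance, the scoring rule, and the subset size of 10 are assumptions, not stated on the slides):

    import numpy as np

    def prune(ens_at_sites, obs, keep=10, tol_ppb=15.0):
        """ens_at_sites: (n_members, n_obs) predictions matched to obs.
        Rule 1: remove members inconsistent with observations (RMSE above
        tol_ppb). Rule 2: of the survivors, retain the `keep` members that
        most often sit on the envelope (min or max across members); members
        strictly inside the envelope do not constrain the range."""
        rmse = np.sqrt(((ens_at_sites - obs) ** 2).mean(axis=1))
        ok = np.where(rmse <= tol_ppb)[0]          # observation-consistent members
        sub = ens_at_sites[ok]
        on_envelope = (sub == sub.min(axis=0)) | (sub == sub.max(axis=0))
        score = on_envelope.sum(axis=1)            # how often each member is extreme
        return ok[np.argsort(score)[::-1][:keep]]  # indices of the kept members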

  14. The pruned ensemble has a narrow range while still including the observations. (Figure: time series comparing the 200-member ensemble with the 10-member pruned ensemble.)

  15. Compare with ±30%: relative to a ±30% band around the base case CMAQ prediction, the 10-member ensemble range is 40% narrower. (Figure: time series comparing the two ranges.)
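For reference, a sketch of the width comparison implied here (hypothetical helper; a ±30% band around the base case is 0.6 times the base value wide):

    import numpy as np

    def width_reduction(pruned_at_sites, base_at_sites):
        """Mean fractional reduction in range width of the pruned ensemble
        relative to a +/-30% band around the base-case prediction."""
        ens_width = pruned_at_sites.max(axis=0) - pruned_at_sites.min(axis=0)
        rule_width = 0.6 * base_at_sites              # (1.3 - 0.7) * base case
        return 1.0 - (ens_width / rule_width).mean()  # ~0.4 per the slide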

  16. Analysis at all sites. AQS O3 monitoring dataset: 38 locations, 28 days, 1064 observations. Evaluation: randomly reserve 50% of the dataset; derive the ensemble and prune it using the other half of the observations; evaluate using the reserved dataset. Results: the ensemble range includes 85% of the observations, and the range is 40% smaller than ±30% of the base case. A sketch of this cross-validation follows below.
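A sketch of the cross-validation described on this slide, reusing the hypothetical prune() above: half of the observations drive the pruning, and coverage is scored on the reserved half.

    import numpy as np

    def evaluate_coverage(ens_at_sites, obs, rng):
        """Randomly reserve 50% of the observations, prune the ensemble on
        the other 50%, and return the fraction of reserved observations
        falling inside the pruned ensemble's min-max range."""
        n = obs.size
        idx = rng.permutation(n)
        train, test = idx[: n // 2], idx[n // 2 :]
        kept = prune(ens_at_sites[:, train], obs[train])
        sub = ens_at_sites[kept][:, test]
        lo, hi = sub.min(axis=0), sub.max(axis=0)
        inside = (obs[test] >= lo) & (obs[test] <= hi)
        return inside.mean()  # the slide reports 85% coverage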

  17. Trade-off between coverage of observations and spread in range

  18. Conclusions: This ensemble generation and pruning technique provides a more robust uncertainty range: the observed value is within the range 85% of the time, and the spread is reduced 40% compared to the ±30% rule. Simultaneously narrowing these bounds and improving performance depends on reducing structural errors in CMAQ. Locations and times that fall outside the ensemble range should be targeted for uncovering structural errors in the model.

  19. Acknowledgements: Sergey Napelenok, Jenise Swall, Kristen Foley. DISCLAIMER: The research presented here was performed under the Memorandum of Understanding between the U.S. Environmental Protection Agency (EPA) and the U.S. Department of Commerce's National Oceanic and Atmospheric Administration (NOAA) and under agreement number DW13921548. This work constitutes a contribution to the NOAA Air Quality Program. Although it has been reviewed by EPA and NOAA and approved for publication, it does not necessarily reflect their policies or views.
