This study evaluates the performance of air quality modeling utilizing the CMAQ and MM5 models, focusing on emission sources and meteorological impacts. Results highlight model accuracy in various regions, including strengths and areas for improvement.
WRAP 2002 Visibility Modeling: Emission, Meteorology Inputs and CMAQ Performance Evaluation
Gail Tonnesen, Bo Wang, Chao-Jung Chien, Zion Wang, Mohammad Omary (University of California, Riverside); Zac Adelman, Andy Holland (University of North Carolina); Ralph Morris et al. (ENVIRON International Corporation, Novato, CA)
WRAP Attribution of Haze Meeting, Denver, CO, July 22, 2004
Summary of RMC 2002 Modeling
• Annual MM5 simulations run at the RMC
• Emissions processed with SMOKE; preliminary 2002 Scenario C used here
• CMAQ version 4.3 (released October 2003)
• Data summaries, QA, and results are posted on the RMC web page: www.cert.ucr.edu/aqm/308
MM5 Modeling Domain (36 & 12 km) • National RPO grid • Lambert conformal conic projection • Center: -97°, 40° • True latitudes: 33°, 45° • MM5 domains • 36 km: (165, 129, 34) • 12 km: (220, 199, 34) • 24-category USGS data • 36 km: 10 min. (~19 km) • 12 km: 5 min. (~9 km)
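The projection parameters above can be turned into grid coordinates with the standard two-parallel Lambert conformal conic formulas. The sketch below is an illustration only; the spherical earth radius of 6370 km is an assumption (a common MM5 convention), not stated on the slide.

```python
import math

# Lambert conformal conic projection using the grid parameters listed
# above: center (-97, 40), true latitudes 33 and 45.
# ASSUMPTION: spherical earth, radius 6370 km (common MM5 convention).
R = 6_370_000.0
LAT1, LAT2 = math.radians(33.0), math.radians(45.0)
LAT0, LON0 = math.radians(40.0), math.radians(-97.0)

def _t(lat):
    return math.tan(math.pi / 4 + lat / 2)

# Cone constant and scale factor (standard two-parallel LCC formulas)
N = math.log(math.cos(LAT1) / math.cos(LAT2)) / math.log(_t(LAT2) / _t(LAT1))
F = math.cos(LAT1) * _t(LAT1) ** N / N
RHO0 = R * F / _t(LAT0) ** N

def lcc_xy(lon_deg, lat_deg):
    """Project (lon, lat) in degrees to (x, y) in meters on the RPO grid."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    rho = R * F / _t(lat) ** N
    theta = N * (lon - LON0)
    return rho * math.sin(theta), RHO0 - rho * math.cos(theta)

# The projection center maps to the grid origin
print(lcc_xy(-97.0, 40.0))  # -> (0.0, 0.0)
```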
Subdomains for 36/12-km Model Evaluation: 1 = Pacific NW, 2 = SW, 3 = North, 4 = Desert SW, 5 = CenrapN, 6 = CenrapS, 7 = Great Lakes, 8 = Ohio Valley, 9 = SE, 10 = NE, 11 = MidAtlantic
Evaluation Review • Evaluation Methodology • Synoptic Evaluation • Statistical Evaluation using METSTAT and surface data • Wind speed (WS), wind direction (WD), temperature (T), relative humidity (RH) • Evaluation against upper-air obs • Statistics: • Absolute Bias and Error, RMSE, IOA (Index of Agreement) • Evaluation Datasets: • NCAR dataset ds472 airport surface met observations • Twice-daily upper-air profile obs (~120 in US) • Temperature • Moisture
METSTAT Evaluation Package • Statistics: • Absolute Bias and Error, RMSE, IOA • Daily and, where appropriate, hourly evaluation • Statistical Performance Benchmarks • Based on an analysis of > 30 MM5 and RAMS runs • Not meant as a pass/fail test, but to put modeling results into perspective
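As an illustration (not the actual METSTAT code), the four statistics named above can be computed from paired model/observation values like this; the real package applies them per hour or day and per subdomain:

```python
import math

# Illustrative implementations of the METSTAT statistics listed above.
def bias(mod, obs):
    """Mean bias: average of (model - observed)."""
    return sum(m - o for m, o in zip(mod, obs)) / len(obs)

def gross_error(mod, obs):
    """Mean absolute (gross) error."""
    return sum(abs(m - o) for m, o in zip(mod, obs)) / len(obs)

def rmse(mod, obs):
    """Root mean square error."""
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(mod, obs)) / len(obs))

def ioa(mod, obs):
    """Willmott's Index of Agreement: 1 = perfect, 0 = no skill."""
    obar = sum(obs) / len(obs)
    num = sum((m - o) ** 2 for m, o in zip(mod, obs))
    den = sum((abs(m - obar) + abs(o - obar)) ** 2 for m, o in zip(mod, obs))
    return 1.0 - num / den

mod = [3.1, 4.9, 6.2]   # hypothetical modeled wind speeds (m/s)
obs = [3.0, 5.0, 6.0]   # hypothetical observed wind speeds (m/s)
print(bias(mod, obs), rmse(mod, obs), ioa(mod, obs))
```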
Evaluation of 36-km WRAP MM5 Results • Model performed reasonably well for the eastern subdomains, but not the west (WRAP region) • General cool, moist bias in the Western US • Difficulty resolving Western US orography? • May get better performance at higher resolution • Pleim-Xiu scheme optimized more for the eastern US? • More optimization needed for desert and rocky ground? • MM5 performs better in winter than in summer • Weaker forcing in summer • July 2002 Desert SW subdomain exhibits a low temperature and high humidity bias
(Source: "2002 MM5 Model Evaluation: 12 vs. 36 km Results," Chris Emery, Yiqin Jia, Sue Kemball-Cook, and Ralph Morris (ENVIRON International Corporation) and Zion Wang (UCR CE-CERT), WRAP National RPO Meeting, May 25, 2004)
[Scatter plot: WRAP 36-km vs. 12-km July wind performance — wind direction error (degrees, 0–120) vs. wind speed RMSE (m/s, 0–3.5) for the PacNW, SW, DesertSW, and North subdomains at both resolutions, plotted against the MM5/RAMS-run benchmark.]
MM5 Implications for AoH
The RMC is continuing to test alternative MM5 configurations, to be completed at the end of 2004. We expect some reduction in bias and error in the WRAP states; however, even in the best case MM5 will retain error and bias that must be considered when using CMAQ for source attribution.
Emissions Inventory Summary
• Preliminary 2002 Scenario C based on the 1996 NEI, grown to 2002, with many updates by WRAP contractors and other RPOs
• Processed for CMAQ using SMOKE
• Extensive QA plots on the web page: both SMOKE QA and post-SMOKE QA
[Pie chart: WRAP 2002 annual NOx emissions by source category — Area, Biogenic, On-Road, Non-Road, Road Dust, Point, Rx Fire, Ag Fire, Wildfire, Offshore.]
[Stacked bar chart: 2002 WRAP NOx emissions by source and state (tons/yr, 0–1,400,000) — Ag Fire, Rx Fire, Wildfire, Area, Point, Nonroad, Onroad — for Arizona, California, Colorado, Idaho, Montana, Nevada, New Mexico, North Dakota, Oregon, South Dakota, Utah, Washington, Wyoming.]
[Pie chart: WRAP 2002 annual SO2 emissions by source category — Area, Biogenic, On-Road, Non-Road, Road Dust, Point, Rx Fire, Ag Fire, Wildfire, Offshore.]
[Stacked bar chart: 2002 WRAP SO2 emissions by source and state (tons/yr, 0–3.0E+05) — Onroad, Ag Fire, Rx Fire, Wildfire, Area, Nonroad, Point — for the same thirteen WRAP states.]
[Stacked bar chart: 2002 WRAP NH3 emissions by source category and state (tons/yr, 0–2.5E+05) — Nonroad, Ag Fire, Rx Fire, Point, Onroad, Wildfire, Area — for the same thirteen WRAP states.]
Emissions Summary
• Preliminary 2002 EI used here
• Updates for the final 2002 EI will include:
• New EI data from other RPOs and Canada
• 2002 NEI to replace the grown 1996 NEI
• Reprocessing in SMOKE with final MM5
• All final inputs ready now except Canada & MM5
CMAQ Simulations
• CMAQ v4.3
• 36-km grid, 112 x 148 x 19
• Annual run
• CB4 chemistry
• Evaluated using IMPROVE, CASTNet, NADP, STN, and AIRS/AQS data
PM Performance Criteria
• Guidance from EPA is not yet ready, so it is difficult to assert that the model is adequate
• We therefore use a variety of ad hoc performance goals and benchmarks to display CMAQ results
• We completed a variety of analyses:
• Over 20 computed performance metrics
• Scatter plots & time-series plots
• Soccer plots
• Bugle plots
Goal of Model Evaluation
• We completed a variety of analyses:
• Over 20 computed performance metrics
• Scatter plots & time-series plots
• Soccer plots
• Bugle plots
• Goal is to decide whether we have enough confidence to use the model for AoH:
• Is this a valid application of the model?
Soccer Goal Plots
• Plot error as a function of bias
• Ad hoc performance goal: 15% bias, 35% error, based on O3 modeling goals
• Larger error and bias are observed among different PM measurement methods and monitoring networks
• Performance benchmark: 30% bias, 70% error (2x the performance goals); PM models can achieve this level in many cases
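A minimal sketch of the classification a soccer plot encodes, using the thresholds from this slide (the slide does not fix which normalized bias/error metric the percentages refer to):

```python
# Classify a (bias %, error %) point against the ad hoc goal box
# (+/-15% bias, 35% error) and the benchmark box (+/-30% bias, 70% error).
def soccer_zone(bias_pct, error_pct):
    if abs(bias_pct) <= 15 and error_pct <= 35:
        return "goal"
    if abs(bias_pct) <= 30 and error_pct <= 70:
        return "benchmark"
    return "outside"

print(soccer_zone(-10, 30))  # -> goal
print(soccer_zone(25, 60))   # -> benchmark
print(soccer_zone(40, 80))   # -> outside
```

On a soccer plot these two boxes are drawn as nested rectangles, and each (bias, error) point per species/season/network lands inside or outside the "goals".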
[Four slides of seasonal soccer plots: Spring, Summer, Fall, Winter.]
Performance Goals and Criteria — Proposed by Jim Boylan • Based on FE and FB calculations • Vary as a function of species concentrations • Goals: FE ≤ 50% and FB within ±30% • Criteria: FE ≤ 75% and FB within ±60% • Less abundant species should have less stringent performance goals and criteria
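The fractional bias (FB) and fractional error (FE) behind these goals are standard formulas: FB = (2/N) Σ (M−O)/(M+O) and FE = (2/N) Σ |M−O|/(M+O), expressed in percent. The sketch below uses the flat thresholds from this slide; Boylan's full proposal relaxes them as concentrations decrease, which is not reproduced here.

```python
# Fractional bias/error in percent; FB is bounded in [-200, +200].
def fractional_bias(mod, obs):
    n = len(obs)
    return 100.0 * (2.0 / n) * sum((m - o) / (m + o) for m, o in zip(mod, obs))

def fractional_error(mod, obs):
    n = len(obs)
    return 100.0 * (2.0 / n) * sum(abs(m - o) / (m + o) for m, o in zip(mod, obs))

def rating(fb, fe):
    """Check against the slide's flat goal/criteria thresholds."""
    if abs(fb) <= 30.0 and fe <= 50.0:
        return "meets goal"
    if abs(fb) <= 60.0 and fe <= 75.0:
        return "meets criteria"
    return "fails criteria"

mod = [4.0, 6.0]   # hypothetical modeled PM concentrations (ug/m3)
obs = [5.0, 5.0]   # hypothetical observed PM concentrations (ug/m3)
fb, fe = fractional_bias(mod, obs), fractional_error(mod, obs)
print(round(fb, 1), round(fe, 1), rating(fb, fe))
```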
Performance Goals and Criteria — Proposed by Jim Boylan
[Charts: proposed PM performance goals and proposed PM performance criteria as functions of concentration.]
CMAQ & EI Versions
• TSSA results were generated with CMAQ v4.4 using the Preliminary 2002 C emissions
• The performance evaluation used CMAQ v4.3
• Previous CMAQ runs used CMAQ v4.3 with Preliminary 2002 B emissions (no fires)
CMAQ Ozone Performance
• CMAQ v4.3 mean fractional bias (no filter): January ≈ +25% MFB; July ≈ −20% MFB
• Slightly worse January O3 performance in v4.4