PM Model Performance Goals and Criteria James W. Boylan Georgia Department of Natural Resources - VISTAS National RPO Modeling Meeting Denver, CO May 26, 2004
Outline • Standard Bias and Error Calculations • Proposed PM Model Performance Goals and Criteria • Evaluation of Eight PM Modeling Studies Using Proposed Goals and Criteria • Discussion: Should EPA recommend PM Model Performance Goals and Criteria in the PM Modeling Guidance Document?
PM Model Evaluations • Air quality modeling and ambient measurements are two different ways to estimate actual ambient pollutant concentrations in the atmosphere • Both modeling and measurements have some degree of uncertainty • Measurements should not be considered the absolute truth • Large differences exist between monitoring networks due to differing sampling and analysis techniques • Bias and error calculations should be normalized by the average of the modeled and observed concentrations, rather than by the observations alone
Performance Metrics • Mean Normalized Bias and Error • Usually associated with an observation-based minimum threshold • Some components of PM can be very small, making it difficult to set a reasonable minimum threshold value without excluding a majority of the data points • Without a minimum threshold, very large normalized biases and errors can result when observations are close to zero, even though the absolute biases and errors are very small • A few data points can dominate the metric • Overestimations are weighted more heavily than equivalent underestimations • Assumes observations are the absolute truth
Performance Metrics • Normalized Mean Bias and Error • Biased towards overestimations • Assumes observations are absolute truth • Mean Fractional Bias and Error • Bounds maximum bias and error • Symmetric: gives equal weight to underestimations and overestimations • Normalized by average of observation and model
Example Calculations • Mean Normalized Bias and Error • Most biased and least useful of the three metrics • Normalized Mean Bias and Error • Mean Fractional Bias and Error • Least biased and most useful of the three metrics
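The three metric pairs compared above have standard definitions; as a reference, they can be sketched in Python as follows (these are the textbook formulas for each pair, not code from the presentation):

```python
# Standard bias/error metrics for paired modeled (m) and observed (o)
# concentrations, expressed in percent.

def mnb_mne(m, o):
    """Mean Normalized Bias/Error: each pair is normalized by its observation,
    so near-zero observations can dominate the metric."""
    n = len(o)
    mnb = 100.0 / n * sum((mi - oi) / oi for mi, oi in zip(m, o))
    mne = 100.0 / n * sum(abs(mi - oi) / oi for mi, oi in zip(m, o))
    return mnb, mne

def nmb_nme(m, o):
    """Normalized Mean Bias/Error: sums are normalized by the observed sum;
    still biased toward overestimations."""
    nmb = 100.0 * sum(mi - oi for mi, oi in zip(m, o)) / sum(o)
    nme = 100.0 * sum(abs(mi - oi) for mi, oi in zip(m, o)) / sum(o)
    return nmb, nme

def mfb_mfe(m, o):
    """Mean Fractional Bias/Error: normalized by the model/obs average, so
    MFB is bounded to [-200%, +200%], MFE to [0, 200%], and a factor-of-two
    overestimate and underestimate get equal weight."""
    n = len(o)
    mfb = 200.0 / n * sum((mi - oi) / (mi + oi) for mi, oi in zip(m, o))
    mfe = 200.0 / n * sum(abs(mi - oi) / (mi + oi) for mi, oi in zip(m, o))
    return mfb, mfe
```

For example, with a factor-of-two overestimate, MFB is +66.7%, while the matching factor-of-two underestimate gives -66.7%; MNB for the same pair gives +100% versus -50%, which is the asymmetry noted above.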
PM Goals and Criteria • Performance Goals: Level of accuracy that is considered to be close to the best a model can be expected to achieve. • Performance Criteria: Level of accuracy that is considered to be acceptable for regulatory applications. • It has been suggested that we need different performance goals and criteria for: • Different Species • Different Seasons • Different Parts of the Country • 20% Haziest and 20% Cleanest Days • Answer: performance goals and criteria that vary as a function of concentration
PM Modeling Studies Used for Performance Benchmarks • SAMI (GT) • July 1995 (URM/IMPROVE/variable grid) • July 1991 (URM/IMPROVE/variable grid) • May 1995 (URM/IMPROVE/variable grid) • May 1993 (URM/IMPROVE/variable grid) • March 1993 (URM/IMPROVE/variable grid) • February 1994 (URM/IMPROVE/variable grid) • VISTAS (UCR/AG/Environ) • July 1999 (CMAQ/IMPROVE/36 km) • July 1999 (CMAQ/IMPROVE/12 km) • July 2001 (CMAQ/IMPROVE/36 km) • July 2001 (CMAQ/IMPROVE/12 km) • January 2002 (CMAQ/IMPROVE/36 km) • January 2002 (CMAQ/IMPROVE/12 km)
PM Modeling Studies Used for Performance Benchmarks • WRAP 309 (UCR/CEP/Environ) • January 1996 (CMAQ/IMPROVE/36 km) • February 1996 (CMAQ/IMPROVE/36 km) • March 1996 (CMAQ/IMPROVE/36 km) • April 1996 (CMAQ/IMPROVE/36 km) • May 1996 (CMAQ/IMPROVE/36 km) • June 1996 (CMAQ/IMPROVE/36 km) • July 1996 (CMAQ/IMPROVE/36 km) • August 1996 (CMAQ/IMPROVE/36 km) • September 1996 (CMAQ/IMPROVE/36 km) • October 1996 (CMAQ/IMPROVE/36 km) • November 1996 (CMAQ/IMPROVE/36 km) • December 1996 (CMAQ/IMPROVE/36 km)
PM Modeling Studies Used for Performance Benchmarks • WRAP 308 (UCR/CEP/Environ) • Summer 2002 (CMAQ/IMPROVE/36 km/WRAP) • Summer 2002 (CMAQ/IMPROVE/36 km/US) • Winter 2002 (CMAQ/IMPROVE/36 km/WRAP) • Winter 2002 (CMAQ/IMPROVE/36 km/US) • EPA (Clear Skies) • Fall 1996 (REMSAD/IMPROVE/36 km) • Spring 1996 (REMSAD/IMPROVE/36 km) • Summer 1996 (REMSAD/IMPROVE/36 km) • Winter 1996 (REMSAD/IMPROVE/36 km)
PM Modeling Studies Used for Performance Benchmarks • MANE-VU (GT) • July 2001 (CMAQ/IMPROVE/36 km) • July 2001 (CMAQ/SEARCH/36 km) • January 2002 (CMAQ/IMPROVE/36 km) • January 2002 (CMAQ/SEARCH/36 km) • Midwest RPO • August 1999 (CMAQ/IMPROVE/36 km) • August 1999 (CAMx/IMPROVE/36 km) • August 1999 (REMSAD/IMPROVE/36 km) • January 2000 (CMAQ/IMPROVE/36 km) • January 2000 (CAMx/IMPROVE/36 km) • January 2000 (REMSAD/IMPROVE/36 km)
PM Modeling Studies Used for Performance Benchmarks • EPRI (AER/TVA/Environ) • July 1999 (CMAQ/IMPROVE/32 km) • July 1999 (CMAQ/IMPROVE/8 km) • July 1999 (MADRID/IMPROVE/32 km) • July 1999 (MADRID/IMPROVE/8 km) • July 1999 (CAMx/IMPROVE/32 km)
Proposed PM Goals and Criteria • Based on MFE and MFB calculations • Vary as a function of species concentrations • Goals: MFE ≤ +50% and MFB within ±30% • Criteria: MFE ≤ +75% and MFB within ±60% • Less abundant species should have less stringent performance goals and criteria • Continuous functions with the following features: • Asymptotically approach the proposed goals and criteria when the mean of the observed and modeled concentrations is greater than 2.5 µg/m3 • Approach +200% MFE and ±200% MFB when the mean of the observed and modeled concentrations is extremely small
Proposed Goals and Criteria • Proposed PM Performance Goals • Proposed PM Performance Criteria
Model Performance Zones • Zone I • Good Model Performance • Level I Diagnostic Evaluation (Minimal) • Zone II • Average Model Performance • Level II Diagnostic Evaluation (Standard) • Zone III • Poor Model Performance • Level III Diagnostic Evaluation (Extended) and Sensitivity Testing