Refinement of the 16-20 Sept ‘00 Modeling Episode: Part II—Performance Improvement. Central California Ozone Study: Modeling Support for 16-20 Sept ’00 Episode. Kickoff Meeting, Sacramento, CA. T. W. Tesche, 24 October 2003
Part II: Presentation Overview • Regulatory Modeling Episode Development Issues • Components of the Model Performance Evaluation (MPE) • Model Performance Improvement Process • Potential Sources of Model Performance Problems • Meteorology • Emissions • Other • Diagnostic Tools and Analyses • Process Analysis • Emissions Inventory Sensitivity/Uncertainty Analyses • Use of Aircraft and other Aloft Air Quality and Met Data • Initial Diagnostic Steps
Regulatory Modeling Episode Development Issues • Modeling should adhere to EPA photochemical modeling guidance • EPA guidance requires modeling protocols and adherence to them • Performance goals are identified by EPA for 1-hr SIP modeling • Schedule and resource constraints limit ‘full-science’ approach to model base case development and evaluation (i.e., limited multispecies, compensating error, and alternative base case analyses) • Weight of evidence analyses more strongly encouraged given growing experience in regulatory applications
Remember… • Making policy decisions based on mathematical models is like marriage… at some point you decide that you can live with certain flaws and trade-offs. (A student’s answer on an exam in one of Prof. Jeffries’ classes)
Components of the Model Performance Evaluation (MPE) • Initial Screening (i.e., the ‘big picture’) • Refined diagnostic evaluations • Sensitivity/uncertainty experiments • Corroborative modeling (e.g., other grid models, observation based models) • ‘Weight of Evidence’ analyses • Overall assessment of episode suitability for use in regulatory modeling • Alternative base case and compensatory error analyses
Components of the Model Evaluation • Operational Evaluation. Tests model ability to estimate 1-hr ground-level ozone concentrations at regulatory monitors. • Diagnostic Evaluation. Tests model ability to estimate ozone precursor and product species (e.g., NO, NO2, VOCs, NOy, NOz), species ratios (e.g., VOC/NOx), associated oxidants (H2O2, HNO3), other ‘tracer’ species (CO), and the temporal and spatial variations and mass budgets of key species. • Mechanistic/Scientific Evaluation. Tests model ability to predict the response of ozone and product species to changes in variables such as meteorology, emissions, and land use. • Probabilistic Evaluation. Accounts for uncertainties associated with model predictions and observations of ozone and precursor species. • Comparative Evaluation. Quantifies differences between alternative model codes (including preprocessors such as MM5), configurations, or operating modes; emphasis is normally on operational evaluation methods.
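The operational evaluation reduces to a handful of statistical measures. Below is a minimal sketch of the three statistics EPA’s 1-hr guidance emphasizes; the array names and the 60 ppb low-observation cutoff are illustrative assumptions, so consult the EPA (1991) guidance for the definitive formulations.

```python
# A minimal sketch of the EPA 1-hr ozone operational statistics.
import numpy as np

def operational_stats(obs, prd, cutoff=60.0):
    """obs, prd: paired hourly ozone (ppb) at the regulatory monitors."""
    mask = obs >= cutoff                       # screen out low observed hours
    o, p = obs[mask], prd[mask]
    mnb  = 100.0 * np.mean((p - o) / o)        # mean normalized bias (%)
    mnge = 100.0 * np.mean(np.abs(p - o) / o)  # mean normalized gross error (%)
    upa  = 100.0 * (prd.max() - obs.max()) / obs.max()  # unpaired peak accuracy (%)
    return mnb, mnge, upa

# Typical 1-hr SIP goals: |MNB| <= 5-15%, MNGE <= 30-35%, |UPA| <= 15-20%.
```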
Principles Governing Model Performance Improvement • Alterations to model inputs, science algorithms, user-specified parameters, core code, and preprocessor algorithms should be technically justified • Alterations to model inputs should be documented • Alterations, where significant, should be vetted with study sponsors and the technical review committee • Diagnostic analyses and alterations to model inputs must fit within the regulatory timeframe and project resources • Process should align with EPA modeling guidelines
Potential Sources of Model Performance Difficulties • Meteorology • Vertical turbulent mixing rates, maxima, and minima • Wind speed and direction errors • Planetary boundary layer (PBL) height predictions • Surface temperature errors (local & regional cool bias, nighttime warm bias, daytime cool bias) • Potential errors in surface and RASS temperature measurements at some stations; uncertainties in measurement heights • Uncertain conformance with aloft measurements (aircraft, fixed profilers, sondes)
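Screening the meteorological fields for the problems listed above typically starts with simple surface statistics. A minimal sketch, assuming paired hourly observation/prediction arrays; the benchmark values in the closing comment are those commonly cited for MM5 evaluations (e.g., Emery et al., 2001), not CCOS-specific criteria.

```python
# A minimal sketch of surface bias, RMSE, and Willmott's index of agreement
# for paired hourly wind or temperature series; names are illustrative.
import numpy as np

def met_stats(obs, prd):
    bias = np.mean(prd - obs)
    rmse = np.sqrt(np.mean((prd - obs) ** 2))
    obar = np.mean(obs)
    ioa  = 1.0 - np.sum((prd - obs) ** 2) / np.sum(
        (np.abs(prd - obar) + np.abs(obs - obar)) ** 2)
    return bias, rmse, ioa

# Benchmarks often cited for MM5 (Emery et al., 2001): wind speed bias within
# +/-0.5 m/s, RMSE <= 2 m/s, IOA >= 0.6; temperature bias within +/-0.5 K,
# gross error <= 2 K, IOA >= 0.8.
```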
Potential Sources of Model Performance Difficulties • Biogenic Emissions • Uncertainty in proper selection of surface temperature for input to biogenic emissions models • Apparently overestimated PAR (i.e., photosynthetically active radiation) inputs to biogenic emissions model • California isoprene emissions likely underestimated by 50% or more (Geron et al., 2001) • Implementation of representative canopy height
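The temperature and PAR sensitivities flagged above follow directly from the Guenther et al. (1993) environmental correction factors used by biogenic emissions models. A minimal sketch with the published constants; the 3 K perturbation at the end is an illustrative assumption.

```python
# Guenther et al. (1993) isoprene environmental correction factors.
import math

R, TS, TM  = 8.314, 303.0, 314.0   # gas constant; standard and optimum temps (K)
CT1, CT2   = 95000.0, 230000.0     # empirical temperature coefficients (J/mol)
ALPHA, CL1 = 0.0027, 1.066         # empirical light-response coefficients

def gamma_t(T):
    """Temperature correction C_T for leaf temperature T (K)."""
    num = math.exp(CT1 * (T - TS) / (R * TS * T))
    den = 1.0 + math.exp(CT2 * (T - TM) / (R * TS * T))
    return num / den

def gamma_par(L):
    """Light correction C_L for PAR flux L (umol m-2 s-1)."""
    return ALPHA * CL1 * L / math.sqrt(1.0 + ALPHA ** 2 * L ** 2)

print(gamma_t(306.0) / gamma_t(303.0))  # ~1.37: a 3 K warm bias in the input
                                        # temperature inflates isoprene ~40%
```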
Potential Sources of Model Performance Difficulties • On-Road Motor Vehicle Emissions • Potential underestimation in motor vehicle VOC and/or NOx emissions inventory (Harley et al., 2003) • Concerns over adequacy of weekend traffic emissions • Alternative temperature and RH inputs (from meteorological processors) to the motor vehicle model yield different motor vehicle emissions estimates
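The last bullet is straightforward to quantify once paired inventory runs exist. A purely illustrative sketch (synthetic numbers, not CCOS results): difference the daily NOx totals from two hypothetical motor vehicle model runs driven by alternative meteorological processors.

```python
# Synthetic comparison of daily on-road NOx totals under two met inputs.
import numpy as np

nox_met_a = np.array([412.0, 388.0, 405.0])  # tons/day, temperature/RH set A
nox_met_b = np.array([431.0, 402.0, 418.0])  # tons/day, temperature/RH set B

pct_diff = 100.0 * (nox_met_b - nox_met_a) / nox_met_a
print(pct_diff)  # day-by-day change attributable solely to the met inputs
```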
Potential Sources of Model Performance Difficulties • Area, Point Source, Non-Road Emissions • Potentially missing or poorly characterized wildfires • Non-road NOx emissions from small engines, pumps, etc. • Potentially outdated or erroneous point source data carried over from SARMAP • Concerns over adequacy of point source defaults based on source type
Potential Sources of Model Performance Difficulties • ICs/BCs/Model Structure • Optimal number of vertical layers and horizontal grid resolution (how determined?) • Adequacy of spin-up period (how determined?) • Reasonableness and impact of BCs (including BCs aloft) on ground-level ozone • Need for fine grid (e.g., 1.33 km) resolution within domain?
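For the spin-up question, one defensible test (an assumption here, not a prescribed CCOS procedure) is to perturb the initial conditions, rerun, and measure how long the paired runs take to converge.

```python
# A minimal sketch of a spin-up adequacy check via paired runs.
import numpy as np

def spinup_hours(run_a, run_b, tol=0.01):
    """run_a, run_b: (hours, ny, nx) ozone fields from runs differing only
    in their initial conditions; returns the first converged hour."""
    for h in range(run_a.shape[0]):
        denom = np.maximum(run_b[h], 1e-6)   # guard against divide-by-zero
        rel = np.max(np.abs(run_a[h] - run_b[h]) / denom)
        if rel < tol:
            return h                         # ICs effectively "forgotten"
    return None                              # spin-up period too short
```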
Diagnostic Tools and Analysesto Be Considered • Sensitivity analysis (e.g., alternative MM5 PBL schemes, boundary conditions, PBL height patches, choice of Kv diffusivities, various emissions inventory sensitivity runs) • Process Analysis (IPR & IRR) • Ozone source apportionment • DDM, brute force methods • Uncertainty analyses • Imputed VOC and/or NOx emissions to address suspected methodological biases in current emissions estimation procedures
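For reference, the brute-force method listed above estimates first-order sensitivities by central-differencing paired runs, while DDM obtains the same derivative within a single run. A minimal sketch, where run_model is a stand-in assumption for a scripted model execution:

```python
# Brute-force (finite-difference) emissions sensitivity of peak ozone.
def brute_force_sensitivity(run_model, delta=0.10):
    """Normalized sensitivity: % change in peak O3 per % change in emissions.

    run_model(scale) -> peak ozone (ppb) with the category scaled by `scale`.
    """
    o3_up   = run_model(1.0 + delta)
    o3_down = run_model(1.0 - delta)
    o3_base = run_model(1.0)
    return ((o3_up - o3_down) / (2.0 * delta)) / o3_base
```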
Elements of Process Analysis • Integrated Reaction Rate Analysis (IRR), a chemical budget analysis of radicals, NOy, Ox, and O3 • Integrated Process Analysis (IPR), a local budget analysis of chemical, transport, deposition, and emissions processes • Example Application of PA to Houston SIP
Integrated Reaction Rate (IRR) Analysis Ox Production [O3 production + NO oxidation] for the 8 Sept ’93 Base Case
Integrated Reaction Rate (IRR) Analysis Impact of Isoprene Emissions on 8 Sept ’93 Base Case Ozone
Integrated Process Rate (IPR) Analysis Ozone Process Analysis for Croquet [Cell 26,31,1] on 8 Sept ‘93
Integrated Process Rate (IPR) Analysis NO2 Process Analysis for Croquet [Cell 26,31,1] on 8 Sept ‘93
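Underlying plots like the two above is simple bookkeeping: the modeled hourly concentration change in a grid cell should close against the sum of the reported process contributions. A minimal sketch with illustrative numbers (not Houston results):

```python
# IPR budget closure check for one grid cell over one hour.
processes = {                     # hourly O3 contributions in one cell (ppb)
    "chemistry":       +12.4,
    "horiz_advection":  -3.1,
    "vert_advection":   +1.8,
    "vert_diffusion":   +4.6,
    "deposition":       -2.2,
    "emissions":         0.0,     # no direct O3 emissions
}
delta_o3 = 13.5                   # modeled change in the cell over the hour
residual = delta_o3 - sum(processes.values())
print(f"closure residual = {residual:+.2f} ppb")  # should be near zero
```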
Supporting Diagnostic Analyses E-W Vertical Ozone Slice Plot on 8 Sept ’93 Near Croquet Cell
Supporting Diagnostic Analyses Level 1 (0–20 m) Level 2 (20–80 m) Stagnant winds in the Croquet region (Level 1) and strong NW winds in the adjacent layer (Level 2) produce sustained vertical advection of precursors aloft: 8 Sept ’93, 0300–0400 CST
Imputed Inventory Factors Used in Houston SIP Modeling
Project Organization
CCOS Technical Committee
Dr. Ajith Kaduwela, ARB Project Officer
T. W. Tesche (Alpine), Project Manager; T. W. Tesche and D. McNally (Alpine), Co-Principal Investigators
Task 1: Evaluate Modeling Assumptions and Procedures. T. Tesche, D. McNally, C. Loomis, G. Stella, G. Wilkinson (Alpine); R. Morris, G. Yarwood, C. Emery, G. Mansell (Environ); H. Jeffries (OThree)
Task 2: Model Performance Improvement Plan. T. Tesche, D. McNally, C. Loomis, G. Stella, G. Wilkinson (Alpine); R. Morris, G. Yarwood (Environ); H. Jeffries (OThree)
Task 3: Establish Refined CAMx Base Case. D. McNally, T. Tesche, C. Loomis (Alpine); R. Morris, G. Yarwood, C. Emery, G. Mansell (Environ); H. Jeffries, Y. Kimura (OThree)
Task 4: Model Performance Evaluation. T. Tesche, D. McNally, C. Loomis (Alpine); R. Morris, G. Yarwood, C. Emery, G. Mansell (Environ); H. Jeffries, Y. Kimura (OThree)
Task 5: Document Refinements, Data Base Transfer. D. McNally, T. Tesche (Alpine); R. Morris, G. Yarwood, G. Mansell (Environ); H. Jeffries (OThree)
Task 6: Management, Meetings, Reporting. T. Tesche, D. McNally, C. Loomis, G. Stella (Alpine); R. Morris, G. Yarwood, C. Emery, G. Mansell (Environ); H. Jeffries (OThree)
Initial Diagnostic Steps • Acquisition of relevant CCOS modeling datasets (e.g., emissions, aerometric, aircraft, profiler, sonde, land use) • Evaluation of pertinent meteorological simulations (MM5: NOAA/ARB/Alpine; CALMET: ARB) • Evaluation of initial CAMx simulations (e.g., ARB004) • Synthesis of modeling experience with other episodes and domains (e.g., ENVIRON Bay Area, UCR CCOS modeling) • Development of a Model Performance Improvement Plan (MPIP)