SG4 – Presentations on the Questionnaire
Short presentations (max. 10 min) by:
• Ana Miranda (PT)
• David Carruthers (UK)
• Hans Backström (SE)
• Helge Olesen (DK)
• Marcus Hirtl (AT)
• Mihaela Mircea, Guido Pirovano (IT)
The benchmarking procedure
[Diagram: the USER submits model results through the Data Extraction Facility to the JRC DELTA BENCHMARKING service, which returns model performance evaluation reports]
Since the Oslo meeting (September 2010)
• Document "DELTA concepts" sent to SG4 participants (March 2011)
• Distribution of the DELTA tool & utilities (about 20 users)
• SG4 web page created (http://aqm.jrc.it/DELTA)
• Use of DELTA on different datasets (POMI, Madrid, London)
• User feedback questionnaire
The DELTA tool
• Intended for rapid diagnostics by single users (at home)
• Focus mostly on surface measurement–model pairs ("independence" of scale)
• Focus on AQD-related pollutants over a yearly period (but AQ-related input data also checked)
• Exploration and benchmarking modes
Outline
• Content of the performance report
• Links between Target and more "traditional" indicators (an analysis based on 3 datasets)
• A "first guess" for criteria/goals
• Comparison with RDE
• Observation uncertainty
• Proposed update to the report template
Content of the performance report (1)
Constraints
• Should include a set of statistical indicators and diagrams, complete enough to capture the main aspects of model performance but limited enough to fit in a one-page summary
• Keep a similar template for all pollutants and spatial scales (but with differences in terms of criteria/goals)
• Restricted to AQD needs. Currently proposed for O3 8h daily max, NO2 hourly and PM10 daily
• Developed (at least at first) for assessment purposes
• Should include performance criteria and goals
Content of the performance report (2)
[Target diagram with example markers: MEF < 0, R = 0.7, RMSE/σO, OU]
• Criterion: acceptable performance for a given type of application (e.g. PM: MFE = 75%, MFB = ±60%)
• Goal: best performance a model should aim to reach given its current capabilities (e.g. PM: MFE = 50%, MFB = ±30%)
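For reference, a minimal numpy sketch of the fractional bias/error metrics behind the PM criteria quoted above, as commonly defined following Boylan and Russell. This is illustrative only, not DELTA's code; M and O are paired modelled and observed values.

import numpy as np

def mfb(M, O):
    """Mean Fractional Bias in percent: (2/N) * sum((M-O)/(M+O))."""
    M, O = np.asarray(M, float), np.asarray(O, float)
    return 100.0 * 2.0 * np.mean((M - O) / (M + O))

def mfe(M, O):
    """Mean Fractional Error in percent: (2/N) * sum(|M-O|/(M+O))."""
    M, O = np.asarray(M, float), np.asarray(O, float)
    return 100.0 * 2.0 * np.mean(np.abs(M - O) / (M + O))

# A PM run meets the criteria above if mfe(M, O) <= 75 and |mfb(M, O)| <= 60.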
Content of the performance report (3)
Checks on data availability for each station:
• 75% for time averaging (e.g. at least 18 h per day)
• 90% available in total (e.g. > 328 days/year)
[Example diagram: MFB = 0.67]
90% concept for indicators
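A minimal pandas sketch of these two availability rules (illustrative; DELTA applies its own internal checks, and the function names here are hypothetical):

import pandas as pd

def daily_means(hourly: pd.Series) -> pd.Series:
    """Daily means, kept only where >= 75% of hours (18 of 24) are valid."""
    counts = hourly.resample("D").count()   # valid (non-NaN) hours per day
    means = hourly.resample("D").mean()
    return means.where(counts >= 18)

def passes_yearly_coverage(daily: pd.Series, min_fraction=0.90) -> bool:
    """True if >= 90% of the days in the series are available (> 328 days/year)."""
    return daily.notna().sum() >= min_fraction * len(daily)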
Links between Target and more “traditional” indicators (an analysis based on 3 datasets)
Links between Target and more "traditional" indicators (1)
Target indicator = RMSE / σO
Related indicators: Bias, R, σM/σO, CRMSE, FAC2
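A numpy sketch of this indicator set (illustrative, not DELTA's implementation). σO is the standard deviation of the observations, and Target = RMSE/σO:

import numpy as np

def indicators(M, O):
    M, O = np.asarray(M, float), np.asarray(O, float)
    bias = np.mean(M - O)
    rmse = np.sqrt(np.mean((M - O) ** 2))
    # CRMSE: RMSE of the mean-centred series
    crmse = np.sqrt(np.mean(((M - M.mean()) - (O - O.mean())) ** 2))
    r = np.corrcoef(M, O)[0, 1]
    sig_ratio = M.std() / O.std()                     # sigma_M / sigma_O
    fac2 = np.mean((M >= 0.5 * O) & (M <= 2.0 * O))   # fraction within a factor of 2
    target = rmse / O.std()
    return dict(bias=bias, rmse=rmse, crmse=crmse, r=r,
                sig_ratio=sig_ratio, fac2=fac2, target=target)

Note that RMSE² = Bias² + CRMSE², which is why the Target diagram can separate bias-dominated from variance/correlation-dominated errors.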
Examples on 3 datasets
Po Valley
• 61 monitoring sites (suburban, urban and rural background)
• 5 models: CHIMERE, TCAM, CAMX, RCG, MINNI
• Year: 2005, domain resolution: 6×6 km²
• O3 – PM10
Madrid
• 10 monitoring sites (urban background)
• 1 model: WRF-CMAQ
• Year: 2007, domain resolution: 1×1 km²
• O3 – NO2
London
• 107 monitoring sites (suburban/urban background, kerbside and roadside)
• 1 model: ADMS
• Year: 2008
• NO2 – O3 – PM10
Links between Target and more "traditional" indicators (3)
Methodology to fix "first guess" criteria:
• Based on real datasets, start by analysing how the bias criterion (MFB) proposed by Boylan and Russell (2005) compares to Target
• Fix a criterion for the Target indicator which is consistent with the MFB criterion
• Fix values for the other statistical indicators (R, StdDev ratio, FAC2) to be consistent with the assigned criterion on the Target value
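A hypothetical sketch of this matching step: given per-station MFB and Target values from one of the datasets, pick the Target threshold whose pass/fail split best agrees with the |MFB| ≤ 60% criterion. The 0.5–2.0 search range and the agreement measure are assumptions for illustration only.

import numpy as np

def fit_target_criterion(mfb_vals, target_vals, mfb_crit=60.0):
    """Target threshold whose accept/reject split best matches the MFB criterion."""
    mfb_vals = np.asarray(mfb_vals, float)
    target_vals = np.asarray(target_vals, float)
    pass_mfb = np.abs(mfb_vals) <= mfb_crit
    candidates = np.linspace(0.5, 2.0, 151)          # trial Target thresholds
    agreement = [np.mean((target_vals <= c) == pass_mfb) for c in candidates]
    return candidates[int(np.argmax(agreement))]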
How to connect Target to more accessible indicators?
[Target diagrams for seven examples; the criterion and the shares of stations fulfilling it are given below]
• Example 1 – Po Valley (PM10): criterion Target = 1, T: 58%, RDE: 83%
• Example 2 – Po Valley (PM10): criterion Target = 1, T: 32%, RDE: 95%
• Example 3 – London (PM10): criterion Target = 1, T: 96%, RDE: 100%
• Example 4 – Po Valley (O3): criterion Target = 0.8, T: 70%, RDE: 96%
• Example 5 – Madrid (O3): criterion Target = 0.8, T: 66%, RDE: 100%
• Example 6 – London (NO2): criterion Target = 1, T: 77%, RDE: 94%
• Example 7 – Madrid (NO2): criterion Target = 1, T: 60%, RDE: 100%
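Reading of the percentages above (an assumption about their semantics): "T" is taken as the share of stations whose Target value falls below the criterion, and "RDE" as the share passing the directive-error check. A one-line helper:

import numpy as np

def share_fulfilling(values, criterion):
    """Percentage of stations whose indicator value is within the criterion."""
    return 100.0 * np.mean(np.asarray(values, float) <= criterion)

# e.g. share_fulfilling(target_per_station, 1.0) would yield "T: 58%" for Example 1.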
A "first guess" for criteria/goals (1)
NOTE: the Boylan and Russell MFB criterion
• is proposed based on urban- to regional-scale modelling (4 to 36 km spatial resolution)
• addresses only O3 and PM10
A "first guess" for criteria/goals (2)
• Different criteria are currently proposed for O3 8h, PM10 daily and NO2 hourly. Although spatial-scale and time-average dependencies are possible, they have not been considered so far (point of discussion)
• Scale is intended in terms of spatial resolution, linked to monitoring station type:
  – Regional: rural background
  – Urban: urban & suburban background
  – Local: all urban stations (incl. roadside & kerbside)
• Criteria probably need to be developed for yearly averaged values
• Performance goals have arbitrarily been fixed to values 20% more stringent than the criteria (see the sketch below)
• 3 datasets are NOT ENOUGH!
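A sketch of the "goal = 20% more stringent than criterion" rule. The direction of tightening depends on the indicator (Target and MFB shrink, FAC2 and R grow); the numeric values below are placeholders, not the proposed criteria.

CRITERIA = {"target": 1.0, "mfb": 60.0, "fac2": 0.50, "r": 0.60}  # placeholder values
TIGHTEN_DOWN = {"target", "mfb"}   # indicators where smaller is better
GOALS = {k: (0.8 * v if k in TIGHTEN_DOWN else 1.2 * v)
         for k, v in CRITERIA.items()}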
[Tables of proposed criteria/goals per pollutant (PM10, O3, NO2) for the indicators Target, R, MFB, FAC2, RDE and σM/σO]
How do these criteria compare to RDE? (2)
Station Osio Sotto, POMI (NO2 – RCG)
• Target: 2.19, MFB: 73%, FAC2: 41%, R: 0.39, σM/σO: 1.49
• RDE = 11%
How do these criteria compare to RDE? (3)
Station EA1, London (NO2 – ADMS)
• Target: 0.82, MFB: 8%, FAC2: 89%, R: 0.73, σM/σO: 1.14
• RDE = 56%
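For reference, a sketch of the Relative Directive Error as described in FAIRMODE documents: the observed value closest to the limit value is compared with the equally ranked modelled value. DELTA's exact implementation may differ; lv is the AQD limit value, e.g. 50 µg/m³ for daily PM10 or 200 µg/m³ for hourly NO2.

import numpy as np

def rde(O, M, lv):
    """Relative Directive Error in percent, for paired series O (obs) and M (model)."""
    O_desc = np.sort(np.asarray(O, float))[::-1]     # observations, highest first
    M_desc = np.sort(np.asarray(M, float))[::-1]     # model values, highest first
    k = int(np.argmin(np.abs(O_desc - lv)))          # rank of the obs value closest to the LV
    return 100.0 * abs(O_desc[k] - M_desc[k]) / lv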
About observation uncertainty (1)
[Figure: RMSE/σO plotted against σO, with the AQD observation uncertainty indicated]
About observation uncertainty (2)
[Panels for O3, NO2 and PM10; ADMS, London, 2008]
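A sketch of the underlying comparison: is the model RMSE within the AQD maximum measurement uncertainty (Annex I of Directive 2008/50/EC: 15% for O3 and NO2, 25% for PM10)? Expressing the uncertainty relative to the mean observed level is a simplifying assumption made here for illustration.

import numpy as np

AQD_UNCERTAINTY = {"O3": 0.15, "NO2": 0.15, "PM10": 0.25}  # AQD data-quality objectives

def within_observation_uncertainty(O, M, pollutant):
    """True where model-obs differences are within the measurement noise."""
    O, M = np.asarray(O, float), np.asarray(M, float)
    rmse = np.sqrt(np.mean((M - O) ** 2))
    ou = AQD_UNCERTAINTY[pollutant] * O.mean()   # simplifying assumption (see above)
    return rmse <= ou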
Proposed update to the report template
[Template mock-up: four elements marked for removal (✘); σO/σM marked (✓), distinguishing the σO > σM and σO < σM cases]
DELTA: Exploration mode (1)
Exploration:
• Time selection (period, averaging time, season, day/night-time, max/min/mean)
• Information overlay (models, scenarios, variables, stations)
• Spatial analysis (colour codes vs. 2D maps)
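As an example of the time operations listed above, a pandas sketch of the AQD O3 metric "maximum daily 8-hour running mean" (names are illustrative; the min_periods value applies the 75% rule, 6 of 8 hours):

import pandas as pd

def o3_daily_max_8h(hourly: pd.Series) -> pd.Series:
    """Maximum of the 8-hour running means for each day (>= 6 valid hours per window)."""
    run8 = hourly.rolling(window=8, min_periods=6).mean()
    return run8.resample("D").max()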
DELTA: Exploration mode (3–4)
[Screenshots of the upgraded tool: Model V1 vs. Model V2 comparison]
DELTA developments
Short term (autumn 2011):
• Flexible use of benchmarking mode and production of PDF or PostScript reports
• On-click mouse information
• Windows/Linux portability
• Station grouping mode
Longer term (2011–2012):
• Inclusion of planning applications
• Extension of benchmarking to annual averages (?)
• Inclusion of PM2.5
DELTA developments: planning applications
Model responses to emission reductions depend on the geographical location, the model scale, the meteorological year…
• Requires a series of simulations with fixed emission reductions for the main precursors (NOx, VOC, NH3, SO2, PPM), analysing the differences in behaviour
Problems:
• No observations available
• Reference model?
Possible approaches:
• Joint exercises
• Analysis of spatio-temporal emission patterns in provided data (e.g. weekday vs. weekend day, DEFRA 2011)
• DELTA exploration mode (links with SG3)
DELTA Benchmark
[Diagram: the USER submits model results and model info through the Data Extraction Facility to the JRC DELTA BENCHMARKING service, backed by a Case & Report DB; the service returns model performance evaluation reports]
Deadline: end 2012