
Model results


Presentation Transcript


1. [Diagram: DELTA BENCHMARKING service] The USER supplies model info and model results through a Data Extraction Facility to the JRC, where DELTA, backed by a Case & Report DB, produces model performance evaluation reports. Deadline: end 2012.

2. What has been done since the Oslo meeting
   • The DELTA tool has been developed and released (about 20 users), together with the document "Delta concepts and user's guide" (March 2011).
   • Within this document, a template for reporting model performance has been proposed.
   • A questionnaire was sent to SG4 participants to collect feedback (10 replies) on:
     • DELTA
     • the template for reporting performance

3. Objectives of the SG4 meeting
   • Get feedback on, and discuss, the template for reporting model performance:
     • content of the performance report
     • links between the Target and more "traditional" indicators (an analysis based on 3 datasets)
     • a "first guess" for criteria/goals
     • comparison with the RDE
     • observation uncertainty
   • Future plans
   • Deliverables for the TSAP review

4. About the template
   • Agreed:
     • the template format
     • the use of the Target as the main indicator, complemented by additional statistical indicators
     • the relation between the Target indicator and the other indicators has been clarified (see the sketch below)
     • the use of quality bounds for the indicators is fine, but more datasets are required to fix values for these bounds
     • performance criteria should not be geographically dependent
     • the RDE does not provide much insight, is not sufficient, and can be misleading
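
As context for the agreed link between the Target and the traditional indicators, here is a minimal sketch assuming the FAIRMODE-style definitions of this period: the Target is the RMSE normalised by the standard deviation of the observations, and it decomposes exactly into a bias term and a centred-RMSE term (RMSE² = BIAS² + CRMSE²). All function and variable names are illustrative, not DELTA's actual API.

```python
import numpy as np

def traditional_indicators(obs, mod):
    """Classical statistics linking modelled values (mod) to observations (obs)."""
    obs, mod = np.asarray(obs, float), np.asarray(mod, float)
    bias = np.mean(mod - obs)                                   # mean bias
    crmse = np.sqrt(np.mean(((mod - mod.mean())
                             - (obs - obs.mean())) ** 2))       # centred RMSE
    rmse = np.sqrt(np.mean((mod - obs) ** 2))                   # total RMSE
    return bias, crmse, rmse

def target_indicator(obs, mod):
    """Target = RMSE / sigma_obs (normalisation by the observed standard
    deviation, whose suitability slide 5 lists as a point to assess)."""
    bias, crmse, rmse = traditional_indicators(obs, mod)
    sigma_obs = np.std(np.asarray(obs, float))
    # Exact decomposition: target**2 == (bias/sigma_obs)**2 + (crmse/sigma_obs)**2
    return rmse / sigma_obs

# Usage on synthetic hourly concentrations: a biased, noisy model
rng = np.random.default_rng(0)
obs = 30 + 10 * rng.standard_normal(1000)
mod = obs + 5 + 3 * rng.standard_normal(1000)
print(round(target_indicator(obs, mod), 2))
```

Under this definition, a Target below 1 means the model's RMSE is smaller than the spread of the observations themselves, i.e. the model beats a trivial "predict the observed mean" benchmark; this gives one natural reading for the quality bounds discussed above.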

5. About the template
   • Agreed:
     • exploration mode is extremely important before producing the report
     • selection of the stations to be used is crucial (consistency between resolution and station type → SG1)
   • To do:
     • extend the template to annual averages for NO2 and PM10
     • add info on AOT, SOMO and exceedances (a sketch follows this list)
     • assess the sensitivity of criteria/goals to scale, time averaging, etc.
     • assess the sensitivity of the Target indicator to normalisation by the standard deviation
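
For the planned AOT/SOMO/exceedance additions, the sketch below assumes the standard EU metric definitions: AOT40 accumulates hourly ozone above 40 ppb (restricted here to a simplified 08:00-20:00 daylight window), SOMO35 sums the daily-maximum 8-hour ozone means above 35 ppb, and exceedances count the days above a limit value. The helper names and the daylight simplification are assumptions, not DELTA code.

```python
import numpy as np

def aot40(hourly_o3_ppb, hour_of_day):
    """AOT40 (ppb·h): hourly ozone excess over 40 ppb, summed over a
    simplified 08:00-20:00 daylight window (assumed here)."""
    o3 = np.asarray(hourly_o3_ppb, float)
    hrs = np.asarray(hour_of_day)
    daylight = (hrs >= 8) & (hrs < 20)
    return np.sum(np.clip(o3[daylight] - 40.0, 0.0, None))

def somo35(daily_max8h_o3_ppb):
    """SOMO35 (ppb·d): sum of daily maximum 8-hour means above 35 ppb."""
    return np.sum(np.clip(np.asarray(daily_max8h_o3_ppb, float) - 35.0, 0.0, None))

def exceedance_days(daily_values, limit_value):
    """Number of days on which the daily value exceeds the limit value."""
    return int(np.sum(np.asarray(daily_values, float) > limit_value))

# Usage: one synthetic year of daily PM10 against the 50 µg/m³ daily limit
rng = np.random.default_rng(1)
pm10_daily = rng.gamma(shape=4.0, scale=8.0, size=365)
print(exceedance_days(pm10_daily, 50.0))
```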

6. About the template
   • To do as SG4 participants:
     • use DELTA on your own dataset and provide it to the JRC
   • Main concern:
     • the use of the quality objectives for policy

7. Future plans
   • Short term (2011):
     • DELTA updates
     • templates for annual averages
   • Medium term (2012):
     • development of the benchmarking service (completion of the procedure)
     • SG4 report on the interpretation of the performance report and on fixing criteria and goals, with examples on different datasets
     • extension of the analysis to other datasets

8. Deliverables for the TSAP review
   • Benchmarking procedure:
     • DELTA → Benchmarking service → ENSEMBLE (+ data preparation facility)
   • Recommendations on:
     • a common template for reporting model performance, as a complement to or substitute for the current RDE indicator (a hedged sketch of the RDE follows)
     • quality objectives (to be periodically revised)
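
Since the recommendation is framed against the current RDE indicator, here is a minimal sketch under the commonly used definition of the relative directive error: take the observed value closest to the limit value, pair it with the modelled value of the same rank, and normalise their absolute difference by the limit value. The function name is illustrative, and equal-length obs/mod series are assumed.

```python
import numpy as np

def relative_directive_error(obs, mod, limit_value):
    """RDE (assumed definition): |obs value closest to the limit value
    minus the equally ranked modelled value|, divided by the limit value."""
    obs_sorted = np.sort(np.asarray(obs, float))
    mod_sorted = np.sort(np.asarray(mod, float))   # same length as obs assumed
    rank = int(np.argmin(np.abs(obs_sorted - limit_value)))
    return abs(obs_sorted[rank] - mod_sorted[rank]) / limit_value
```

Because the score hinges on a single ranked pair of values, models with very different overall error statistics can share the same RDE, which is one reason slide 4 calls it insufficient and potentially misleading.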
