
WWRP/WGNE Joint Working Group on Forecast Verification Research

This joint working group aims to advance the verification component of weather forecasts by developing and promoting new methods, providing training, and ensuring relevance to users. It promotes collaboration among verification scientists, model developers and forecast providers. Real-time verification systems offer timely feedback on forecast performance and allow inter-comparison of forecast systems. Verified variables range from radar reflectivity and thunderstorm strike probability to precipitation, wind gusts and lightning. Publications and recommendations set out strategies for verifying different types of forecasts, and the group also addresses user-oriented verification of specific events and of parallel model forecasts. Continued innovation in spatial verification methods is central to improving the assessment of forecast accuracy and reliability for a range of weather phenomena.



Presentation Transcript


  1. WWRP/WGNE Joint Working Group on Forecast Verification Research Beth Ebert, JWGFVR co-chair WGNR meeting, Geneva, 8-10 Feb 2011

  2. Aims • Verification component of WWRP • Develop and promote new verification methods • Training on verification methodologies • Ensure forecast verification is relevant to users • Encourage sharing of observational data • Promote importance of verification as a vital part of experiments • Promote collaboration among verification scientists, model developers and forecast providers • Collaborate with WGNE, WCRP, CBS

  3. Working group members • Beth Ebert (BOM, Australia) • Laurie Wilson (CMC, Canada) • Barb Brown (NCAR, USA) • Barbara Casati (Ouranos, Canada) • Caio Coelho (CPTEC, Brazil) • Anna Ghelli (ECMWF, UK) • Martin Göber (DWD, Germany) • Simon Mason (IRI, USA) • Marion Mittermaier (Met Office, UK) • Pertti Nurmi (FMI, Finland) • Joel Stein (Météo-France) • Yuejian Zhu (NCEP, USA)

  4. Sydney 2000 FDP, S-PROG nowcast [plot: skill, MAE and bias (mm) vs forecast lead time (min)]

  5. Sydney 2000 FDP: TITAN and EXTRAP nowcasts [plots]

  6. Sydney 2000 FDP: boundary position error [plot: MAE and bias]

  7. Beijing 2008 FDP Real Time Forecast Verification (RTFV) system • Fast qualitative and quantitative feedback on forecast system performance in real time • Verification products generated whenever new observations arrive • Ability to inter-compare forecast systems • 3 levels of complexity: Visual (quick look), Statistics (quantitative), Diagnostic (more information)
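As a rough, hypothetical sketch of how such an event-driven verification loop can be organized (this is not the RTFV code; the function names, score set, and sample data are invented):

```python
# Minimal sketch of an event-driven real-time verification loop.
# NOT the RTFV implementation; names and sample data are illustrative only.
import numpy as np

def quick_scores(forecast, observed):
    """Fast quantitative feedback: bias and mean absolute error."""
    forecast = np.asarray(forecast, dtype=float)
    observed = np.asarray(observed, dtype=float)
    return {"bias": float(np.mean(forecast - observed)),
            "mae": float(np.mean(np.abs(forecast - observed)))}

def on_new_observation(obs_field, forecast_archive, valid_time):
    """Called whenever a new observation arrives; scores every
    competing forecast system valid at that time (inter-comparison)."""
    results = {}
    for system_name, fcst_field in forecast_archive.get(valid_time, {}).items():
        results[system_name] = quick_scores(fcst_field, obs_field)
    return results

# Hypothetical usage with two competing nowcast systems
archive = {"2008-08-08 06:00": {"system_A": [1.2, 0.0, 3.5],
                                "system_B": [0.8, 0.4, 2.9]}}
print(on_new_observation([1.0, 0.2, 3.0], archive, "2008-08-08 06:00"))
```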

  8. Training In person Online

  9. Domain and variables • Outer Beijing: reflectivity, cell tracks, T-storm strike probability • Beijing: precipitation, probability of precipitation • Urban Beijing: wind gusts, lightning

  10. Precipitation accumulation [maps: forecasts vs observations]

  11. Standard verification
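For concreteness, a minimal sketch of the standard continuous scores (bias, MAE, RMSE) of the kind shown on the Sydney 2000 FDP slides; the sample values below are invented.

```python
# Sketch of standard continuous verification scores for matched
# forecast/observation pairs; the sample data are invented.
import numpy as np

fcst = np.array([2.0, 0.0, 5.5, 1.2])   # e.g. rainfall accumulations (mm)
obs  = np.array([1.5, 0.3, 4.0, 1.0])

bias = np.mean(fcst - obs)              # mean error: positive = over-forecast
mae  = np.mean(np.abs(fcst - obs))      # mean absolute error
rmse = np.sqrt(np.mean((fcst - obs) ** 2))

print(f"bias={bias:.2f} mm  MAE={mae:.2f} mm  RMSE={rmse:.2f} mm")
```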

  12. Diagnostic verification (non-real time) • Power spectral density • Contiguous rain area (CRA) • Intensity-scale • Neighborhood methods • Forecast quality metric (FQM)
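As one illustration of the neighborhood approach listed above, here is a hedged sketch of the Fractions Skill Score (FSS); the grid, threshold, and window size are invented and the code is not tied to any particular operational implementation.

```python
# Sketch of one neighborhood method, the Fractions Skill Score (FSS).
# Grid, threshold and window size are illustrative, not from the slides.
import numpy as np

def neighborhood_fractions(binary_field, window):
    """Fraction of grid points exceeding the threshold in each
    window x window neighborhood (simple moving average)."""
    padded = np.pad(binary_field.astype(float), window // 2, mode="constant")
    out = np.empty(binary_field.shape, dtype=float)
    for i in range(binary_field.shape[0]):
        for j in range(binary_field.shape[1]):
            out[i, j] = padded[i:i + window, j:j + window].mean()
    return out

def fss(forecast, observed, threshold, window):
    pf = neighborhood_fractions(forecast >= threshold, window)
    po = neighborhood_fractions(observed >= threshold, window)
    mse = np.mean((pf - po) ** 2)
    mse_ref = np.mean(pf ** 2) + np.mean(po ** 2)
    return 1.0 - mse / mse_ref if mse_ref > 0 else np.nan

rng = np.random.default_rng(0)
fcst_grid = rng.gamma(2.0, 2.0, size=(50, 50))   # invented rain fields (mm)
obs_grid  = rng.gamma(2.0, 2.0, size=(50, 50))
print(f"FSS(>5 mm, 9x9 window) = {fss(fcst_grid, obs_grid, 5.0, 9):.3f}")
```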

  13. Survey of BMB forecasters and FDP experts • Real time verification considered very useful • Forecasters preferred scatter plots and quantile-quantile plots • Experts wanted capability to drill down to more specific information • Web interface not user-friendly enough

  14. B08FDP lessons for real time verification • Format and standardization of nowcast products was critical to making a robust verification system • Difficult to compare "like" products that were produced with slightly different aims (e.g., QPF for warning vs hydrological applications) • Verification system design could be improved • Better display of forecasts with observations • User control of verification parameters for exploring results

  15. SNOW-V10 • Verification plan • User-oriented verification for Olympic period of all forecasts, tuned to decision points of VANOC • Verification of parallel model forecasts for Jan to August 2010 • Nowcast and regional model verification • Rich dataset for user-oriented verification and research • Processing started

  16. SNOW-V10 observations

  17. Suggested categories for SNOW-V10 analysis

  18. Visibility verification

  19. Wind verification

  20. Sochi 2014 Standard verification Possible verification innovations: • Road weather forecasts • Real-time verification • Timing of events – onset, duration, cessation • Verification in the presence of observation uncertainty • Neighborhood verification of high-resolution NWP, including in time-height plane • Spatial verification of ensembles • User-oriented probability forecast verification
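The "timing of events" item can be made concrete with a small sketch that measures onset, cessation, and duration errors for a single event; the timestamps below are invented examples, not Sochi data.

```python
# Sketch of event-timing verification: onset, cessation and duration errors
# for one weather event; the timestamps are invented examples.
from datetime import datetime

fcst_onset = datetime(2014, 2, 10, 6, 0)
fcst_end   = datetime(2014, 2, 10, 15, 0)
obs_onset  = datetime(2014, 2, 10, 8, 0)
obs_end    = datetime(2014, 2, 10, 14, 0)

onset_error_h     = (fcst_onset - obs_onset).total_seconds() / 3600.0  # negative = forecast too early
cessation_error_h = (fcst_end - obs_end).total_seconds() / 3600.0
duration_error_h  = ((fcst_end - fcst_onset) - (obs_end - obs_onset)).total_seconds() / 3600.0

print(f"onset error {onset_error_h:+.1f} h, "
      f"cessation error {cessation_error_h:+.1f} h, "
      f"duration error {duration_error_h:+.1f} h")
```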

  21. Publications • Recommendations for verifying deterministic and probabilistic quantitative precipitation forecasts (available on WWRP web site) • Coming soon: Recommendations for verifying cloud forecasts • Later this year (?): Recommendations for verifying tropical cyclone forecasts

  22. Publications January 2008 special issue of Meteorological Applications on forecast verification • Verification review paper authored by JWGFVR members • 18 contributed papers from participants in 2007 International Verification Methods workshop

  23. Publications DVD from 2009 Helsinki Verification Tutorial Video and PowerPoint lectures • Verification basics • Continuous verification • Categorical verification • Warnings verification • Probabilistic verification • Spatial verification • Statistical inference Available from WMO

  24. Spatial Verification Method Intercomparison Project http://www.rap.ucar.edu/projects/icp International comparison of many new spatial verification methods Methods applied by researchers to same datasets (precipitation; perturbed cases; idealized cases) Subjective forecast evaluations Workshops: 2007, 2008, 2009 Weather and Forecasting special collection 13 papers on specific methods, 2 overview papers

  25. Spatial Verification Method Intercomparison Project

  26. Facial verification?

  27. Spatial Verification Method Intercomparison Project • Future variables • "Messy" precipitation • Wind • Cloud • Timing errors • Future datasets • MAP D-PHASE • SRNWP / European data • Nowcast dataset(s)?? • A model for verification test bed

  28. Outreach http://www.cawcr.gov.au/projects/verification/ • Strong focus of the WG • EUMETCAL training modules completed (Nurmi, Wilson) • Verification web page • Sharing of tools • Tutorials (traveling and on-site) • ECMWF (Jan 2007) • South Africa (Sept 2008) • Helsinki (June 2009)

  29. International Verification Methods Workshops 4th Workshop – Helsinki 2009 Tutorial • 26 students from 24 countries • 3 days • Lectures, hands-on (took tools home) • Group projects - presented at workshop Workshop • ~100 participants • Topics: • User-oriented verification • Verification tools & systems • Coping with obs uncertainty • Weather warning verification • Spatial & scale-sensitive methods • Ensembles • Evaluation of seasonal and climate predictions

  30. Workshop: New verification research Spatial methods applied to: Wind fields Ensemble forecasts http://www.space.fmi.fi/Verification2009/

  31. Workshop: New verification research • Diagnostics • Extremes [plot: hit rate vs false alarm ratio] http://www.space.fmi.fi/Verification2009/
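For reference, the hit rate and false alarm ratio on this slide come from the standard 2x2 contingency table; a minimal sketch with invented counts follows (the false alarm rate used on ROC diagrams is included for contrast).

```python
# Sketch of categorical scores from a 2x2 contingency table;
# the counts are invented for illustration.
hits, misses, false_alarms, correct_negatives = 82, 23, 38, 857

hit_rate = hits / (hits + misses)                        # a.k.a. probability of detection
false_alarm_ratio = false_alarms / (hits + false_alarms)
false_alarm_rate = false_alarms / (false_alarms + correct_negatives)  # used on ROC curves

print(f"hit rate          = {hit_rate:.2f}")
print(f"false alarm ratio = {false_alarm_ratio:.2f}")
print(f"false alarm rate  = {false_alarm_rate:.2f}")
```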

  32. 5th International Verification Methods Workshop • Melbourne, December 2011 • 3-day tutorial + 3-day scientific workshop • Additional tutorial foci • Verifying seasonal predictions • Brief intro to operational verification systems • Capacity building for FDPs/RDPs, SWFDP, etc.

  33. New focus areas for JWGFVR research • Seamless verification – crossing space/time scales • Ensemble predictions • Warnings / extreme events, especially timing • Aviation • Multivariate verification – joint distributions

  34. Future collaboration with WGNR • SNOW-V10 • Sochi 2014 • WENS? • SWFDP (Africa, SW Pacific, SE Asia) • Lake Victoria FDP • Intend to establish collaborations with SERA on • verification of tropical cyclone forecasts and other high impact weather warnings • Lake Victoria FDP
