Forecast Verification Research
Barbara Brown, NCAR
With thanks to Beth Ebert and Laurie Wilson
S2S Workshop, 5-7 Feb 2013, Met Office
Verification working group members
• Beth Ebert (BOM, Australia)
• Laurie Wilson (CMC, Canada)
• Barb Brown (NCAR, USA)
• Barbara Casati (Ouranos, Canada)
• Caio Coelho (CPTEC, Brazil)
• Anna Ghelli (ECMWF, UK)
• Martin Göber (DWD, Germany)
• Simon Mason (IRI, USA)
• Marion Mittermaier (Met Office, UK)
• Pertti Nurmi (FMI, Finland)
• Joel Stein (Météo-France)
• Yuejian Zhu (NCEP, USA)
Aims
Verification component of WWRP, in collaboration with WGNE, WCRP and CBS (“Joint” between WWRP and WGNE)
• Develop and promote new verification methods
• Provide training on verification methodologies
• Ensure forecast verification is relevant to users
• Encourage sharing of observational data
• Promote the importance of verification as a vital part of experiments
• Promote collaboration among verification scientists, model developers and forecast providers
Relationships / collaboration
[Diagram of collaborations: WGCM, WGNE, TIGGE, SDS-WAS, HyMeX, Polar Prediction, SWFDP, YOTC, CG-FV, Subseasonal to Seasonal Prediction, WGSIP, SRNWP, COST-731]
FDPs and RDPs
• Sydney 2000 FDP
• Beijing 2008 FDP/RDP
• SNOW-V10 RDP
• FROST-14 FDP/RDP
• MAP D-PHASE
• Typhoon Landfall FDP
• Severe Weather FDP
SNOW-V10
• Nowcast and regional model verification at obs sites
• User-oriented verification
  • Tuned to decision thresholds of VANOC, whole Olympic period (see the threshold-based sketch below)
• Model-oriented verification
  • Model forecasts verified in parallel, January to August 2010
• Status
  • Significant effort to process and quality-control observations
  • Multiple observations at some sites allow estimation of observation error
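The decision-threshold verification above is essentially categorical. A minimal sketch (not SNOW-V10 code; the variable, threshold and data are illustrative) of building a 2x2 contingency table at a user's decision threshold and computing a few standard scores:

```python
# Threshold-based (user-oriented) verification sketch: 2x2 contingency
# table for the event "value >= threshold" and common categorical scores.
# Illustrative only; not the SNOW-V10 processing code.
import numpy as np

def contingency_scores(forecast, observed, threshold):
    """Return hit rate (POD), false alarm ratio (FAR) and frequency bias."""
    f_event = np.asarray(forecast) >= threshold
    o_event = np.asarray(observed) >= threshold

    hits = np.sum(f_event & o_event)
    false_alarms = np.sum(f_event & ~o_event)
    misses = np.sum(~f_event & o_event)

    pod = hits / (hits + misses) if (hits + misses) else np.nan
    far = false_alarms / (hits + false_alarms) if (hits + false_alarms) else np.nan
    bias = (hits + false_alarms) / (hits + misses) if (hits + misses) else np.nan
    return pod, far, bias

# Example: wind speed (m/s) against a hypothetical 8 m/s decision threshold
fcst = np.array([3.0, 7.5, 12.0, 6.0, 9.5])
obs = np.array([2.5, 8.0, 10.5, 4.0, 11.0])
print(contingency_scores(fcst, obs, threshold=8.0))
```

The same table supports other scores (threat score, Heidke, etc.); the point is that the event definition comes from the user's decision threshold rather than from the model climatology.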
[Figures: wind speed verification (model-oriented); visibility verification (user-oriented)]
FROST-14
User-focused verification
• Threshold-based, as in SNOW-V10
• Timing of events – onset, duration, cessation (see the timing-error sketch below)
• Real-time verification
• Road weather forecasts?
Model-focused verification
• Neighborhood verification of high-resolution NWP
• Spatial verification of ensembles
Account for observation uncertainty
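A minimal sketch of the timing-of-events idea (onset, cessation and duration errors for a single contiguous event); the hourly series and threshold are illustrative, not FROST-14 data:

```python
# Event-timing verification sketch: compare onset, cessation and duration
# of the first contiguous above-threshold event in forecast and observed
# time series. Illustrative data and threshold.
import numpy as np

def event_window(series, threshold):
    """Return (onset_index, cessation_index, duration) of the first
    contiguous run of values >= threshold, or None if there is no event."""
    idx = np.flatnonzero(np.asarray(series) >= threshold)
    if idx.size == 0:
        return None
    onset = idx[0]
    breaks = np.flatnonzero(np.diff(idx) > 1)      # end of first contiguous run
    cessation = idx[breaks[0]] if breaks.size else idx[-1]
    return onset, cessation, cessation - onset + 1

fcst = [0, 0, 1, 3, 4, 2, 0, 0]   # e.g. hourly precipitation rate (mm/h)
obs = [0, 0, 0, 2, 5, 3, 1, 0]

fo, fc, fd = event_window(fcst, threshold=1)
oo, oc, od = event_window(obs, threshold=1)
print("onset error (h):", fo - oo)        # negative = forecast event starts too early
print("cessation error (h):", fc - oc)
print("duration error (h):", fd - od)
```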
Promotion of best practice
Recommended methods for evaluating cloud and related parameters
Promotion of best practice
Verification of tropical cyclone forecasts:
• Introduction
• Observations and analyses
• Forecasts
• Current practice in TC verification – deterministic forecasts
• Current verification practice – probabilistic forecasts and ensembles
• Verification of monthly and seasonal tropical cyclone forecasts
• Experimental verification methods
• Comparing forecasts
• Presentation of verification results
Beyond track and intensity…
[Figures: track error distribution; TC genesis; wind speed; precipitation (MODE spatial method)]
Promotion of best practice
• Verification of forecasts from mesoscale models (early DRAFT)
  • Purposes of verification
  • Choices to be made
    • Surface and/or upper-air verification?
    • Point-wise and/or spatial verification?
• Proposal for 2nd Spatial Verification Intercomparison Project in collaboration with Short-Range NWP (SRNWP)
Spatial Verification Method Intercomparison Project
• International comparison of many new spatial verification methods (a simple neighborhood example is sketched below)
• Phase 1 (precipitation) completed
  • Methods applied by researchers to the same datasets (precipitation; perturbed cases; idealized cases)
  • Subjective forecast evaluations
  • Weather and Forecasting special collection, 2009-2010
• Phase 2 in planning stage
  • Complex terrain
  • MAP D-PHASE / COPS dataset
  • Wind and precipitation, timing errors
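As one concrete example of the neighborhood class of methods compared in the project, here is a minimal sketch of the Fractions Skill Score (Roberts & Lean 2008) for a single threshold and neighbourhood size; the fields below are synthetic and this is not the intercomparison code:

```python
# Fractions Skill Score (FSS) sketch: compare fractional event coverage of
# forecast and observed fields within square neighbourhoods, rather than
# matching grid points exactly. Synthetic fields; illustrative only.
import numpy as np
from scipy.ndimage import uniform_filter

def fss(forecast, observed, threshold, neighbourhood):
    """FSS for the event 'value >= threshold' using a square
    neighbourhood of width `neighbourhood` grid points."""
    f_frac = uniform_filter((forecast >= threshold).astype(float),
                            size=neighbourhood, mode="constant")
    o_frac = uniform_filter((observed >= threshold).astype(float),
                            size=neighbourhood, mode="constant")
    mse = np.mean((f_frac - o_frac) ** 2)
    mse_ref = np.mean(f_frac ** 2) + np.mean(o_frac ** 2)
    return 1.0 - mse / mse_ref if mse_ref > 0 else np.nan

rng = np.random.default_rng(0)
fcst = rng.gamma(2.0, 2.0, size=(100, 100))   # synthetic rain field (mm)
obs = np.roll(fcst, 5, axis=1)                # same field, displaced by 5 points
print(fss(fcst, obs, threshold=5.0, neighbourhood=11))
```

Scanning the score over thresholds and neighbourhood widths shows the spatial scale at which a displaced but otherwise good forecast becomes skilful.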
Outreach and training
http://www.cawcr.gov.au/projects/verification/
• Verification workshops and tutorials
  • On-site, travelling
  • SWFDP (e.g., east Africa)
• EUMETCAL training modules
• Verification web page
• Sharing of tools
5th International Verification Methods Workshop, Melbourne 2011
Tutorial
• 32 students from 23 countries
• Lectures and exercises (students took tools home)
• Group projects, presented at the workshop
Workshop
• ~120 participants
• Topics:
  • Ensembles and probabilistic forecasts
  • Seasonal and climate
  • Aviation verification
  • User-oriented verification
  • Diagnostic methods and tools
  • Tropical cyclones and high impact weather
  • Weather warning verification
  • Uncertainty
• Special issue of Meteorol. Applications in early 2013
Seamless verification
[Diagram: forecast types arranged by spatial scale (point, local, regional, global) and forecast aggregation time (minutes to decades): nowcasts, very short range, NWP, sub-seasonal prediction, seasonal prediction, decadal prediction, climate change]
Seamless forecasts – consistent across space/time scales
• Single modelling system or blended
• Likely to be probabilistic / ensemble
"Seamless verification" – consistent across space/time scales • Modelling perspective – is my model doing the right thing? • Process approaches • LES-style verification of NWP runs (first few hours) • T-AMIP style verification of coupled / climate runs (first few days) • Single column model • Statistical approaches • Spatial and temporal spectra • Spread-skill • Marginal distributions (histograms, etc.) Perkins et al., J.Clim. 2007
"Seamless verification" – consistent across space/time scales • User perspective – can I use this forecast to help me make a better decision? • Neighborhood approaches - spatial and temporal scales with useful skill • Generalized discrimination score (Mason & Weigel, MWR 2009) • consistent treatment of binary, multi-category, continuous, probabilistic forecasts • Calibration - accounting for space-time dependence of bias and accuracy? • Conditional verification based on larger scale regime • Extreme Forecast Index (EFI) approach for extremes • JWGFVR activity • Proposal for research in verifying forecasts in weather-climate interface • Assessment component of UK INTEGRATE project
Questions
• What should be the role of JWGFVR in S2S?
  • Defining protocols? Metrics?
  • Guidance on methods?
  • Participation in activities?
  • Linking forecasting and applications?
• What should be the interaction with other WMO verification activities, e.g., the Standardized Verification System for Long-range Forecasts (SVS-LRF) and the WGNE/WGCM Climate Metrics Panel?
• How do metrics need to change for S2S?
• How do we cope with small sample sizes? (one common approach is sketched below)
• Is a common set of metrics required for S2S?
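On small sample sizes, one common approach (an assumption here, not a JWGFVR recommendation) is to attach resampling confidence intervals to whatever score is chosen, so sampling uncertainty is always visible. A minimal bootstrap sketch for a generic score; a block bootstrap would be needed if the cases are serially correlated, and the data below are synthetic:

```python
# Bootstrap confidence interval sketch for an arbitrary verification score,
# applied to a short sample of forecast/observed pairs. Synthetic data.
import numpy as np

def bootstrap_ci(forecast, observed, score, n_boot=2000, alpha=0.05, seed=0):
    rng = np.random.default_rng(seed)
    n = len(observed)
    stats = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)          # resample cases with replacement
        stats.append(score(forecast[idx], observed[idx]))
    lo, hi = np.percentile(stats, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return score(forecast, observed), (lo, hi)

mean_error = lambda f, o: np.mean(f - o)
fcst = np.array([1.1, 0.4, -0.2, 0.9, 1.5, 0.3, -0.1, 0.8])   # e.g. weekly anomalies
obs = np.array([0.7, 0.1, -0.5, 1.2, 1.0, 0.6, -0.3, 0.2])
print(bootstrap_ci(fcst, obs, mean_error))
```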
Database comments
• The database should be designed to allow easy access for
  • Applications
  • Verification
• Observations will be needed for evaluations and applications
  • Will these (or links to these) be included in the database?
  • Lack of obs can be a big challenge / detriment to use of the database
• Access to data
  • For applications and verification, users often will not want a whole field or set of fields
  • They may also want to examine time series of forecasts at points (see the access sketch below)
  • Data formats and access can limit uses
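To illustrate the point-time-series access pattern mentioned above, a short sketch using xarray; the file name, variable name ("tp") and coordinate names are hypothetical and would depend on how the S2S database is actually served:

```python
# Point-extraction sketch: pull a time series of forecasts at one location
# out of a gridded file instead of downloading whole fields.
# File, variable and coordinate names are hypothetical.
import xarray as xr

ds = xr.open_dataset("s2s_forecast.nc")                               # hypothetical file
point_series = ds["tp"].sel(lat=-1.29, lon=36.82, method="nearest")   # e.g. Nairobi
print(point_series.to_series().head())
```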
Opportunities!
• New challenges
  • Methods for evaluating extremes
  • Sorting out some of the thorny problems (small sample sizes, limited observations, etc.)
  • Defining meaningful metrics associated with research questions
  • Making a useful connection between forecast performance and forecast usefulness/value
  • Application areas (e.g., precipitation onset in Africa)
• A new research area
  • Using spatial methods for evaluation of S2S forecast patterns