
Verification of C&V Forecasts Jennifer Mahoney and Barbara Brown 19 April 2001



Presentation Transcript


  1. Verification of C&V Forecasts. Jennifer Mahoney and Barbara Brown, 19 April 2001

  2. Outline • Philosophy/background • Role of QAG • General principles • General issues • Issues related to verification of C&V forecasts • Background • What have we been doing? • Mechanics • Verification approaches • Point vs. grid methods • Verification measures • Current RTVS work • RTVS: How the system can be used for the C&V PDT • Plans

  3. AWRP Product Quality Assessment [organizational chart relating AWC/NWS, QAG, PDTs, and RTVS]

  4. Functions • PDTs: • Developmental verification feeds back into product development • Identify appropriate verification observations • Work with QAG to develop and test verification methods • Provide data/results to QAG for TEG/O process • RTVS: • Long-term verification • Evaluation of operational products • Development of baseline • Provide results for NWS reports • Real-time evaluation of experimental products • Example: Algorithm intercomparison exercises • Implement new/improved methods and data, as available • Provide data/results to QAG for TEG/O process • Develop operational verification system for AWC

  5. Functions (cont.) • AWC/NWS: • Provide guidance to QAG on RTVS development • Monitor RTVS results • Coordinate some aspects of TEG/O • Provide feedback and guidance to forecasters • QAG: • Development and testing of verification methods • Provide independent verification of products and forecasts (for PDTs, operational products, etc.) • TEG/O Quality Assessment plans and reports • Advise PDTs on verification approaches

  6. Summary: QAG • QAG works with product development teams and users (AWC, others) to develop, implement and test scientifically valid and statistically meaningful verification methods. • Includes research to identify appropriate observations and to develop methods • QAG provides independent testing of products and produces assessment plans and reports for the TEG/O process, using statistically and scientifically valid methods.

  7. General principles of verification • Forecast quality is only one aspect of forecast “goodness”: Quality ≠ Usefulness • Scientific and statistical validity • Dimensionality: one number won’t do the job!! • By nature, forecast verification is a multi-dimensional problem • Different measures are concerned with different attributes (sometimes different measures even give contradictory results) • It isn’t easy to show true improvements in forecasting systems - trade-offs between scores (e.g., POD, FAR)

  8. General issues • Matching forecasts to appropriate observations • Need to match the events, including the spatial and temporal characteristics • Observation must be clearly defined, and not dependent on the forecast • Selection of appropriate measures • What are the important characteristics to be evaluated? • Ex: Bias vs. Accuracy; Discrimination vs. Threat • Relative vs. absolute verification • Selection of appropriate standard of comparison • Day-to-day variability in results • Grid-based methods are stringent • Don’t account for small errors in timing or location • Not diagnostic (i.e., don’t provide information to help “fix” the forecasts)

  9. Issues Related to Verification of National-scale C&V Forecasts • Spatial continuity of observations • Accounting for temporal and spatial errors • Need for more diagnostic approaches • Use of “other” observations for verification (e.g., PIREPs, satellite) • Verification of probabilistic forecasts • Develop appropriate methods for spatial forecasts • Others?

  10. Current C&V work on RTVS • Evaluating IFR AIRMETs issued by AWC • On-going evaluation using RTVS since 1997 • http://www-ad.fsl.noaa.gov/afra/rtvs • Link to: Real-Time Verification System • Using METAR reports to verify the forecasts • Enhancing the verification methods and approaches

  11. Mechanics • Forecast/observation matching approaches • Point • Grid • Time windows • At valid time • Over verification period • Stratifications • With amendments • Without amendments
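The time-window matching mentioned above pairs each forecast with observations that fall inside its verification period. A minimal sketch of that idea, assuming observations arrive as (time, value) pairs and using an illustrative 30-minute pad around the valid period (neither detail is specified in the slides):

```python
from datetime import datetime, timedelta

def match_obs_to_forecast(observations, valid_start, valid_end,
                          window=timedelta(minutes=30)):
    """Return observations falling within the forecast valid period,
    padded by a time window on each side (window size is illustrative)."""
    lo = valid_start - window
    hi = valid_end + window
    return [(t, v) for t, v in observations if lo <= t <= hi]

# Hypothetical METAR-like ceiling/visibility flight categories:
obs = [
    (datetime(2001, 4, 19, 11, 45), "IFR"),
    (datetime(2001, 4, 19, 12, 10), "VFR"),
    (datetime(2001, 4, 19, 15, 0), "IFR"),
]
matched = match_obs_to_forecast(obs,
                                datetime(2001, 4, 19, 12, 0),
                                datetime(2001, 4, 19, 14, 0))
# 11:45 and 12:10 fall inside the padded window; 15:00 does not
```

A matching step over the whole verification period (rather than only at valid time) would simply widen `valid_start`/`valid_end`.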

  12. Verification Approaches Point Method • PODy = 0.40 • PODn = 0.76 • FAR = 0.42 • Bias = 0.70 • CSI = 0.30

  13. Verification Approaches Grid Method • 1 observation in box = Yes • Grid box touches polygon = In
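The two grid-method rules above (a box counts as an observed "Yes" if at least one observation falls in it; a box counts as forecast "In" if it touches the forecast polygon) can be sketched as follows. For simplicity the forecast polygon is approximated here by an axis-aligned rectangle; real AIRMET polygons would need a general polygon-overlap test:

```python
def grid_contingency(obs_points, fcst_rect, x_edges, y_edges):
    """Tally hits/misses/false alarms/correct negatives on a verification grid.

    obs_points: list of (x, y) locations where the event was observed
    fcst_rect:  (xmin, ymin, xmax, ymax) - simplified stand-in for the polygon
    """
    hits = misses = false_alarms = correct_neg = 0
    fx0, fy0, fx1, fy1 = fcst_rect
    for i in range(len(x_edges) - 1):
        for j in range(len(y_edges) - 1):
            x0, x1 = x_edges[i], x_edges[i + 1]
            y0, y1 = y_edges[j], y_edges[j + 1]
            # Rule 1: one observation in the box makes it an observed "Yes"
            observed = any(x0 <= x < x1 and y0 <= y < y1 for x, y in obs_points)
            # Rule 2: the box is forecast "In" if it overlaps the forecast area
            forecast = x0 < fx1 and fx0 < x1 and y0 < fy1 and fy0 < y1
            if observed and forecast:
                hits += 1
            elif observed:
                misses += 1
            elif forecast:
                false_alarms += 1
            else:
                correct_neg += 1
    return hits, misses, false_alarms, correct_neg

# 2x2 grid; forecast rectangle covers the lower-left box only
counts = grid_contingency([(0.5, 0.5), (1.5, 1.5)], (0, 0, 1, 1),
                          [0, 1, 2], [0, 1, 2])
# one hit, one miss, no false alarms, two correct negatives
```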

  14. Verification Approaches Grid Method • PODy = 0.54 • PODn = 0.60 • FAR = 0.62 • Bias = 1.45 • CSI = 0.28 [grid diagram: boxes labeled H (hit), M (miss), F (false alarm)]

  15. “Standard” Verification Measures • H = Hits, M = Misses, F = False Alarms • PODy = H / (H + M): measures the proportion of observed area that is correctly forecast to be “Yes” • PODn: measures the proportion of “No” area that was correctly forecast to be “No” • FAR = F / (H + F): measures the proportion of forecast convective area that is incorrect • Bias = (F + H) / (M + H): measures the extent of over- or under-forecasting • CSI = H / (M + H + F): measures “relative accuracy” • TSS = PODy + PODn - 1: measures “discrimination” between Yes and No observations • Skill scores (Heidke, Gilbert): measure the improvement in percent correct and CSI, respectively, over what’s expected by chance • [Diagram: overlapping forecast and observed storm areas, with the intersection labeled H, the observed-only region M, and the forecast-only region F]
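All of these measures derive from the same 2x2 contingency counts. A small helper, assuming the counts (including correct negatives, which PODn requires) have already been tallied; the example counts are illustrative, not from the slides:

```python
def verification_measures(hits, misses, false_alarms, correct_negatives):
    """Standard yes/no verification measures from 2x2 contingency counts."""
    pody = hits / (hits + misses)                    # prop. of obs "Yes" hit
    podn = correct_negatives / (correct_negatives + false_alarms)
    far = false_alarms / (hits + false_alarms)       # prop. of forecast wrong
    bias = (hits + false_alarms) / (hits + misses)   # over-/under-forecasting
    csi = hits / (hits + misses + false_alarms)      # "relative accuracy"
    tss = pody + podn - 1                            # discrimination
    return {"PODy": pody, "PODn": podn, "FAR": far,
            "Bias": bias, "CSI": csi, "TSS": tss}

m = verification_measures(hits=40, misses=60,
                          false_alarms=30, correct_negatives=100)
# m["PODy"] = 0.40, m["Bias"] = 0.70
```

Note the trade-off mentioned earlier: shrinking the forecast area lowers FAR but also lowers PODy, which is why no single number suffices.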

  16. The Real-Time Verification System (RTVS) For the purposes of… • Long-term assessment of NWS forecasts • Algorithm development and forecast improvement • NWS-forecaster assistance

  17. The Real-Time Verification System (RTVS) RTVS Components • Real-Time Continuous Data Ingest • Observation and grid ingest • Grid-to-observation interpolation • Local storage of forecast/observation pairs • User-Specified Statistical Analysis and Display via the web-based interactive graphical user interface

  18. Plans • Develop, test, and implement diagnostic and operationally meaningful verification methods • First: Determine what the relevant questions are • Enhance methods as needed (e.g., as new observations become available or new types of forecasts emerge) • Work closely with the rest of the PDT on this development • Develop infrastructure so that forecasts can be verified • Enable RTVS and post-analysis software to handle PDT-developed algorithms and enhancements to verification methods • Set up real-time processing and graphical user interface

  19. Plans (cont.) • Provide on-going, independent, comprehensive assessment(s) • Begin an intercomparison exercise for C&V components and final forecasts in Fall 2002 • Real-time verification (RTVS) • In-depth post-analysis • Incorporate new verification methods, observations, and forecasting systems as they are available • Leverage with other verification work (including operational C&V forecasts)
