NWS TAF Verification Brandi Richardson NWS Shreveport, LA
Do we care how our forecasts verify? • Yes! • The NWS measures verification by many means • Probability of Detection (POD) • False Alarm Ratio (FAR) • Critical Success Index (CSI) • Percent Improvement • The NWS sets goals for verification • Local offices add their own flavor • Total IFR (IFR, LIFR, VLIFR)
Why is verification important? • Need to know what to improve • Lose credibility if too many forecasts are wrong • Lose customers • Lose jobs • Additional training • New techniques • Improved model guidance • Need to know what we are doing well
NWS TAF Verification • TAFs evaluated 12 times per hour (every five minutes), or 288 times per 24-hour period • TAFs compared to ASOS five-minute observations • ASOS = Automated Surface Observing System, located at TAF airports • Stats calculated by flight category • i.e., VFR, MVFR, IFR, LIFR, VLIFR
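For illustration, the sketch below shows how a single ASOS five-minute observation might be binned into a flight category from ceiling and visibility. This is not the operational NWS verification code: the function name is made up for this example and the VLIFR cutoff is an assumed value, while the other breakpoints follow the standard FAA flight-category definitions.

```python
def flight_category(ceiling_ft: float, visibility_sm: float) -> str:
    """Assign a flight category from ceiling (ft AGL) and visibility (statute miles).

    Illustrative only: the VLIFR cutoff is an assumed value; the remaining
    breakpoints follow the standard FAA flight-category definitions.
    """
    if ceiling_ft < 200 or visibility_sm < 0.5:
        return "VLIFR"   # assumed sub-LIFR threshold for this example
    if ceiling_ft < 500 or visibility_sm < 1.0:
        return "LIFR"
    if ceiling_ft < 1000 or visibility_sm < 3.0:
        return "IFR"
    if ceiling_ft <= 3000 or visibility_sm <= 5.0:
        return "MVFR"
    return "VFR"


# Example: a 400 ft ceiling with 2 mi visibility falls in the LIFR category.
print(flight_category(400, 2.0))  # -> "LIFR"
```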
Probability of Detection • How often did we correctly forecast a particular flight category to occur? • Also known as “Accuracy” • POD = V/(V+M) • V = forecasted and verified events • Ex: IFR conditions forecasted…IFR conditions occurred • M = missed events • Ex: VFR conditions forecasted…IFR conditions occurred • Ranges from 0 – 1, 1 being perfect
False Alarm Ratio • How often did we forecast a particular flight category that did not occur? • i.e., how often did we “cry wolf”? • FAR = U/(U+V) • U = forecasted and unverified • Ex: IFR forecasted…VFR occurred • V = forecasted and verified events • Ex: IFR conditions forecasted…IFR conditions occurred • Ranges from 0 – 1, 0 being perfect
Critical Success Index • CSI = V/(V+M+U) • V = forecasted and verified events • Ex: IFR conditions forecasted…IFR conditions occurred • M = missed events • Ex: VFR conditions forecasted…IFR conditions occurred • U = forecasted and unverified • Ex: IFR forecasted…VFR occurred • Ranges from 0 – 1, 1 being perfect • Incorporates both POD and FAR • Overall score of performance
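Because POD, FAR, and CSI all come from the same V/M/U counts, a single pass over the forecast–observation pairs can produce all three. The sketch below is a minimal illustration, assuming each TAF–ASOS comparison has already been reduced to a (forecast category, observed category) pair; the function and variable names are invented for this example.

```python
def verification_stats(pairs, category="IFR"):
    """Compute POD, FAR, and CSI for one flight category.

    pairs: iterable of (forecast_category, observed_category) tuples,
    one per five-minute TAF/ASOS comparison (an assumed input format).
    """
    V = M = U = 0
    for fcst, obs in pairs:
        if fcst == category and obs == category:
            V += 1          # forecasted and verified
        elif fcst != category and obs == category:
            M += 1          # missed event
        elif fcst == category and obs != category:
            U += 1          # forecasted but unverified (false alarm)
    pod = V / (V + M) if (V + M) else None
    far = U / (U + V) if (U + V) else None
    csi = V / (V + M + U) if (V + M + U) else None
    return pod, far, csi


# Example: 3 hits, 1 miss, 1 false alarm -> POD 0.75, FAR 0.25, CSI 0.60
sample = [("IFR", "IFR"), ("IFR", "IFR"), ("IFR", "IFR"),
          ("VFR", "IFR"), ("IFR", "VFR")]
print(verification_stats(sample))
```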
Percent Improvement • Forecaster CSI vs. Model Guidance CSI • Did we beat the model? • [Cartoon: GFS: “IFR will prevail…” Forecaster: “IFR?! It’s July and dew points are in the 20s! Take that!”]
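The slides do not spell out the percent-improvement formula; a common formulation, used here only as an assumption, is the forecaster’s CSI gain relative to the guidance CSI:

```python
def percent_improvement(csi_forecaster: float, csi_guidance: float) -> float:
    """Assumed formulation: relative improvement of forecaster CSI over guidance CSI."""
    return 100.0 * (csi_forecaster - csi_guidance) / csi_guidance


# Example: forecaster CSI 0.45 vs. guidance CSI 0.36 -> roughly +25% improvement
print(percent_improvement(0.45, 0.36))
```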
2009 NWS Goals • The NWS has set goals for TAF forecasts • For total IFR (includes IFR, LIFR, and VLIFR) • POD ≥ 0.640 (64%) • FAR ≤ 0.430 (43%) • How do we measure up?...
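As a quick illustration of how a site’s total-IFR statistics might be checked against these goals (the statistics below are placeholders, not actual verification numbers):

```python
POD_GOAL = 0.640   # NWS 2009 goal: total-IFR POD at or above this value
FAR_GOAL = 0.430   # NWS 2009 goal: total-IFR FAR at or below this value

# Placeholder total-IFR statistics for one TAF site (not real data)
pod, far = 0.68, 0.41

print("POD goal met" if pod >= POD_GOAL else "POD goal missed")
print("FAR goal met" if far <= FAR_GOAL else "FAR goal missed")
```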
The Bottom Line • Sometimes we do get the forecast wrong. • Examination of TAF verification statistics helps to find our weaknesses and allows us to find ways to improve our forecasts. • The NWS strives to provide quality products and services to our aviation customers and partners.