Enhanced User and Forecaster Oriented TAF Quality Assessment
CAeM-XIV TECO, 6 February 2010
Queenie CC Lam
World Meteorological Organization (WMO): Working together in weather, climate and water
TAF Quality Assessment

Needs, from the aspects of quality management:
• user focus
• use of meteorological information for decision making in air navigation

Issues:
• mismatches between the criteria for the inclusion of change groups and amendment (ICAO Annex 3, Appendix 5) and those for an accurate forecast (ICAO Annex 3, Attachment B)
• shortfalls in the existing TAF performance measure in ICAO Annex 3, Attachment B
TAF Accuracy... Who Cares? Why?

Stakeholders:
• Users
  - pilots and airlines: pre-flight planning and in-flight re-planning
  - air traffic control: better management of air traffic flow and control
• Forecasters
  - better understanding of strengths and weaknesses
  - continuous professional development (learn from mistakes, remove bias)
• Developers
  - forecast technique development
  - validate changes to the underpinning model or nowcasting system, to improve overall forecast performance
• Management
  - identification of areas for improvement and resource allocation
  - provide evidence of forecast quality to users, to increase their confidence in using the forecast for better decision making
TAF Verification

• Quality management: forecast performance as a key quality objective; the basis for continuous improvement of service
• ICAO Annex 3: operationally desirable accuracy (Attachment B)
Mismatches

• Remark: the number of change and probability groups should be kept to a minimum and should not normally exceed five groups
TAF Accuracy (% of cases within range, or % of accurate forecasts: the existing performance measure in ICAO Annex 3, Attachment B)

Merits:
• simple and easy to understand
• convenient for setting a globally applicable level of desirable accuracy

Shortfalls:
• may not reflect skill properly: the target percentage of accurate forecasts could be met at most airports by never forecasting significant weather
• fails to delineate performance in forecasting high-operational-impact events
• needs special treatment to cater for change groups
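The first shortfall above can be made concrete with a few lines of arithmetic. The numbers below are hypothetical, but they show how, for a rare event, a forecast that never predicts the event still scores a high "% of accurate forecasts" while detecting nothing:

```python
# A minimal sketch (hypothetical numbers) of why "% of accurate forecasts"
# can hide a lack of skill: at an airport where low visibility occurs in
# only 2% of hours, a forecaster who NEVER forecasts low visibility still
# scores 98% "accurate".

hours = 10_000
event_hours = 200            # low-visibility hours (2% climatological frequency)

# Strategy: always forecast "no low visibility"
correct = hours - event_hours
accuracy = correct / hours
print(f"Accuracy of the no-event forecast: {accuracy:.1%}")   # 98.0%

# Yet every actual event is missed:
hits, misses = 0, event_hours
pod = hits / (hits + misses)  # probability of detection
print(f"Probability of detection: {pod:.0%}")                 # 0%
```

This is exactly the gap that the additional skill-oriented metrics proposed later are meant to close.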
Way Forward to Address the Issues

• Align the change-group criteria with the desirable forecast accuracy
• Enhance the existing TAF verification system with additional performance metrics
Proposal for an Enhanced TAF Verification System

Objectives:
• reflect forecast skill (with respect to a no-skill method such as climatology or persistence)
• generate useful verification information for use by different stakeholders

Approach:
• supplement with additional performance metrics, e.g. using the contingency table method, to reflect forecast skill, especially in forecasting high-operational-impact events
• take climatology and the frequency of events into account
• adapt to local forecasting practice
• consult users, e.g. local ATC
Use of Contingency Tables

Delineate performance in forecasting high-operational-impact events:
• low visibility
• low cloud ceiling
• moderate/heavy precipitation, thunderstorms

Categorization is based on the change-group criteria.
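Categorization based on the change-group criteria can be sketched as a simple threshold lookup. The visibility thresholds below follow the ICAO Annex 3 change-group criteria for visibility (150, 350, 600, 800, 1500, 3000 and 5000 m); verify them against the current Annex 3 text before any operational use, and note that the category numbering itself is an assumption for illustration:

```python
# Map a visibility value (metres) to a category index, where category
# boundaries are the ICAO Annex 3 change-group criteria for visibility.
# Higher index = better visibility. The numbering scheme is illustrative.
import bisect

VIS_THRESHOLDS = [150, 350, 600, 800, 1500, 3000, 5000]  # metres

def vis_category(vis_m: float) -> int:
    """Return 0 for the lowest band (< 150 m), up to 7 for >= 5000 m."""
    return bisect.bisect_right(VIS_THRESHOLDS, vis_m)

print(vis_category(100))    # 0  (below 150 m)
print(vis_category(800))    # 4  (band 800-1499 m)
print(vis_category(9999))   # 7  (5000 m or more)
```

Forecast and observed values are mapped to the same categories, so a forecast counts as correct when both fall in the same band.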
An Example: Austro Control's Use of the High/Low Contingency Table Method

(Reference: Mahringer, G., 2008: Terminal aerodrome forecast verification in Austro Control using time windows and ranges of forecast conditions. Meteorol. Appl., 15, 113-123.)

• Decompose the TAF into hourly forecasts; each change group covers a range of forecast conditions
• Two contingency tables for each weather element:
  - H-table: HIGHEST forecast vs HIGHEST observed condition in each hour
  - L-table: LOWEST forecast vs LOWEST observed condition in each hour
• Source of observational data: METAR/SPECI
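The H/L decomposition above can be sketched in a few lines. This is a simplified reading of the Mahringer (2008) idea, not Austro Control's implementation: each TAF hour carries a range of forecast categories (base conditions plus any change or probability groups valid that hour), and the highest and lowest of that range are verified separately. Category codes and data are hypothetical, with higher codes meaning better conditions:

```python
# Minimal sketch of building H- and L-contingency tables from hourly
# ranges of forecast and observed categories (after Mahringer, 2008).
from collections import Counter

def build_hl_tables(forecast_ranges, observed_ranges):
    """forecast_ranges / observed_ranges: per-hour lists of category codes."""
    h_table, l_table = Counter(), Counter()
    for fcst, obs in zip(forecast_ranges, observed_ranges):
        h_table[(max(fcst), max(obs))] += 1   # highest forecast vs highest observed
        l_table[(min(fcst), min(obs))] += 1   # lowest forecast vs lowest observed
    return h_table, l_table

# Three example hours: in hour 3 a TEMPO-like group widens the forecast range.
fcst = [[4], [4], [2, 4]]
obs  = [[4], [3, 4], [2]]
h, l = build_hl_tables(fcst, obs)
print(h[(4, 4)])   # 2: hours where the highest forecast and observed category were both 4
print(l[(2, 2)])   # 1: the TEMPO hour, where the lowest forecast matched the lowest observed
```

Verifying the two ends of the range separately credits a TAF both for covering deteriorations and for not over-forecasting them.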
H/L-Contingency Table Method for Verification of Visibility Forecasts
Performance Metrics

Generated from the two-dimensional contingency tables.
Examples of Performance Metrics

(Reference: Jolliffe, I.T. and Stephenson, D.B. (Eds.), 2003: Forecast Verification: A Practitioner's Guide in Atmospheric Science. John Wiley & Sons, Chichester, UK.)
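As a concrete illustration, the standard scores discussed in Jolliffe and Stephenson (2003) can all be computed from a 2x2 contingency table. Here a = hits, b = false alarms, c = misses, d = correct rejections; the counts below are illustrative only:

```python
# Standard 2x2 contingency-table verification scores.
def scores(a, b, c, d):
    n = a + b + c + d
    pod = a / (a + c)                       # probability of detection
    far = b / (a + b)                       # false alarm ratio
    csi = a / (a + b + c)                   # critical success index (threat score)
    # Heidke skill score: improvement over the number of correct
    # forecasts expected from random (chance) agreement.
    expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n
    hss = (a + d - expected) / (n - expected)
    return pod, far, csi, hss

pod, far, csi, hss = scores(a=30, b=20, c=10, d=940)
print(f"POD={pod:.2f} FAR={far:.2f} CSI={csi:.2f} HSS={hss:.2f}")
```

Unlike raw percentage accuracy, the HSS is anchored to a no-skill reference, which is exactly the property the proposal asks for.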
Result Presentation for Users (2): Forecast Value

• costs of airline operations: weather-related delays, safety aspects
• planning ahead reduces costs
• the economic value of the forecasts can be quantified if the cost of precautionary measures and the loss due to a weather event are known
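The last point is the classic cost/loss framing. A minimal sketch, with entirely hypothetical costs and verification counts: precautions cost C each time they are taken, and every unprotected event (a miss) incurs a loss L, so the expense with the forecast can be compared against a baseline of never protecting:

```python
# Cost/loss sketch of forecast value (hypothetical numbers).
def expense(hits, false_alarms, misses, cost, loss):
    """Total expense: pay `cost` whenever precautions are taken
    (hits and false alarms), `loss` for every unprotected event (miss)."""
    return (hits + false_alarms) * cost + misses * loss

C, L = 1_000.0, 20_000.0             # cost of a precaution vs loss per event
with_taf    = expense(hits=30, false_alarms=20, misses=10, cost=C, loss=L)
without_taf = expense(hits=0, false_alarms=0, misses=40, cost=C, loss=L)  # never protect
savings = without_taf - with_taf
print(f"Savings from using the forecast: {savings:,.0f}")
```

Presenting value in monetary terms like this is one way to give users the evidence of forecast quality mentioned earlier.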
Result Presentation for Forecasters

FTOS31 LOWM 281100
TAF LOWL 281130Z 2812/0118 27006KT 9999 SCT012 SCT025
    BECMG 2819/2822 VRB02KT 0500 BCFG FEW012
    BECMG 2822/2824 0200 FG VV001
    BECMG 0107/0110 09005KT 3000 BR OVC005
    TEMPO 1018 6000 SCT010=
Principles of TAF Verification

• From the user perspective: e.g. requiring accuracy within a certain range; forecasts of multiple states of the atmosphere within a single time period
• Normalization of verification scores: the % of accurate forecasts is heavily influenced by the climatology of the location and the frequency of changes in atmospheric conditions; normalization enables comparison between airports
• Alignment between the criteria for the inclusion of change groups and amendment and those for an accurate forecast