Forecasting process, issues and the public
Joe Koval, Senior Software Developer, The Weather Channel, Atlanta, GA
DICast - Dynamic, Integrated ForeCAST System • System that generates forecasts for user-defined sites and times using a host of meteorological data sources and a variety of forecasting techniques (a rough blending sketch follows below) • Initially developed by NCAR/WITI • Significant enhancements by WSI • Further development at TWC
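The slide does not spell out the integration algorithm, so the following is only a rough Python sketch of the general idea behind a DICast-style consensus: blend several model guidance values, weighting each by its recent skill. The function names, weighting scheme, and numbers are hypothetical, not TWC's actual method.

```python
# Hypothetical sketch of a DICast-style dynamic consensus: each model guidance
# value is weighted by its recent skill (lower recent MAE -> higher weight).
# Names, weighting scheme, and data are illustrative only.

def blend_guidance(guidance, recent_mae, eps=0.1):
    """guidance: {model: forecast value}; recent_mae: {model: recent mean absolute error}."""
    weights = {m: 1.0 / (recent_mae[m] + eps) for m in guidance}
    total = sum(weights.values())
    return sum(guidance[m] * weights[m] for m in guidance) / total

# Example: blend 2 m temperature guidance (deg F) for one site and lead time.
temp_guidance = {"GFS_DMOS": 71.2, "NAM_DMOS": 69.8, "ECMWF_DMOS": 70.5}
recent_errors = {"GFS_DMOS": 2.1, "NAM_DMOS": 3.0, "ECMWF_DMOS": 1.6}
print(round(blend_guidance(temp_guidance, recent_errors), 1))
```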
DICast detail - what happens when DICast runs (original slide is a pipeline diagram; its content is summarized below)
• Model guidance inputs: 00Z/12Z NAM-WRF DMOS, 00Z/12Z GFS/AVN DMOS, 00Z/12Z GFS/MRF DMOS, 00Z/12Z ECMWF DMOS, 00Z/12Z NGM MOS, 00/12Z MAV MOS, 06/18Z MAV MOS, 00Z MEX MOS, 00Z/12Z ETA MOS, plus Climatology
• Processing stages: Data Ingest → Spatial Interpolation → Integration (Integrator) → Local Day Max/Min → Final QC → Units Conversion → To Digit
• Supporting tools: Integration Editor, Climo Editor, Station List Manager, Web Viewer, System Monitor
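As a reading aid for the diagram summary above, here is a minimal Python sketch of the stage ordering only. Every stage is an empty stub; this is not TWC's implementation, just the flow the slide depicts.

```python
# Illustrative-only sketch of the stage ordering shown in the slide diagram.
# Each stage is a stub standing in for the real processing step.

def data_ingest(raw):            # read model DMOS/MOS guidance
    return raw

def spatial_interpolation(g):    # interpolate guidance to forecast sites
    return g

def integrate(g):                # weight and blend guidance (see sketch above)
    return g

def local_day_max_min(f):        # derive local-day highs/lows
    return f

def final_qc(f):                 # consistency checks (e.g., max >= min)
    return f

def units_conversion(f):         # convert to customer-facing units
    return f

def run_dicast_cycle(raw_guidance):
    """Run one forecast cycle through the stages in diagram order."""
    forecast = raw_guidance
    for stage in (data_ingest, spatial_interpolation, integrate,
                  local_day_max_min, final_qc, units_conversion):
        forecast = stage(forecast)
    return forecast  # "To Digit": hand off to downstream products
```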
Who are our customers? • All business-to-consumer • Cable subscribers • Online – weather.com • Mobile devices • Portals, etc. – Yahoo, Google Earth, others • USA Today
Factors that affect our forecast process • Our forecast intervention focus is on high-impact weather • Focus on where the weather matters most and guidance typically performs worst • We’ve developed verification and other tools that help forecasters identify locations and weather situations where the guidance is not likely to perform well (a simple flagging sketch follows below)
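The slide does not describe the tools themselves, so this is only a hypothetical sketch of the kind of flagging such a tool might do: surface the sites whose recent guidance error is well above the network average. The data layout, threshold, and site list are made up.

```python
# Hypothetical sketch: flag sites where recent guidance error is unusually high,
# so forecasters know where manual intervention is most likely to pay off.
# Threshold, factor, and data are illustrative, not TWC's actual tooling.

def flag_problem_sites(recent_errors, network_mean, factor=1.2):
    """recent_errors: {site: recent MAE}; flag sites well above the network mean."""
    return sorted(site for site, mae in recent_errors.items()
                  if mae > factor * network_mean)

site_mae = {"KATL": 1.8, "KDEN": 4.2, "KSEA": 2.0, "KBIS": 5.1}
network_avg = sum(site_mae.values()) / len(site_mae)
print(flag_problem_sites(site_mae, network_avg))  # -> ['KBIS', 'KDEN']
```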
Forecast verification is important to TWC • We have a comprehensive, worldwide, six-year archive of forecast verification • Verification serves three purposes • Ongoing forecast quality assurance • Feedback to human forecasters for training purposes (ADVISOR product) • R&D – how can we improve our forecasts through development of new/innovative verification products?
…but from surveys we know that traditional verification measures aren’t everything to the public! • Forecast skill as measured by statistics such as root mean square error, mean absolute error, and contingency tables is only one part of how the public measures the value of a forecast (see the sketch below) • The other three parts: • Reliability – is the forecast always available when the customer expects it? • Low latency – the forecast needs to be available to the customer almost immediately after it is created…no old forecasts sitting around! • Consistency – the forecast isn’t flip-flopping with each update
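As a concrete reference for the skill statistics named above, here is a minimal Python sketch of MAE and RMSE, plus one simple way a run-to-run "flip-flop" measure could be defined for consistency. The MAE/RMSE formulas are standard; the consistency metric is only an illustration, not TWC's definition.

```python
# MAE and RMSE are the standard definitions; flip_flop is an illustrative
# consistency measure (mean absolute change between successive updates).
import math

def mae(forecasts, observations):
    return sum(abs(f - o) for f, o in zip(forecasts, observations)) / len(forecasts)

def rmse(forecasts, observations):
    return math.sqrt(sum((f - o) ** 2 for f, o in zip(forecasts, observations)) / len(forecasts))

def flip_flop(successive_forecasts):
    """Mean absolute change between successive forecast updates for the same valid time."""
    pairs = zip(successive_forecasts, successive_forecasts[1:])
    return sum(abs(b - a) for a, b in pairs) / (len(successive_forecasts) - 1)

fcst, obs = [72, 68, 75], [70, 69, 73]
print(round(mae(fcst, obs), 2), round(rmse(fcst, obs), 2))
print(flip_flop([75, 71, 74, 70]))  # large value -> forecast is flip-flopping
```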
Other societal aspects of forecast verification • Near misses don’t “verify” with traditional verification measures, but can still impact society • Example: heavy rain falls in the mountains, but floods the valley • Sensible weather verification • Verification that captures very high-impact weather • How do we perform in weather situations that are very dangerous to the public – derechos, tornado outbreaks, blizzards? (a near-miss check sketch follows below)
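One common way to give credit for near misses is neighborhood-style verification, where a forecast event counts as a hit if the event was observed within some distance tolerance rather than only at the exact forecast point. The slide does not say TWC uses this approach; the sketch below, with made-up distances, is only an illustration of the idea.

```python
# Hedged sketch of a "near miss"-aware check: count an event forecast as a hit
# if the event was observed anywhere within a distance tolerance, rather than
# only at the exact forecast point. Distances and tolerance are illustrative.

def neighborhood_hit(forecast_point, observed_points, tolerance_km, distance_km):
    """True if any observed event lies within tolerance_km of the forecast point."""
    return any(distance_km(forecast_point, p) <= tolerance_km for p in observed_points)

# Toy example with 1-D "distances" standing in for real great-circle distances.
dist = lambda a, b: abs(a - b)
print(neighborhood_hit(10.0, [32.0, 18.0], tolerance_km=25.0, distance_km=dist))  # True
```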
Probabilistic forecasting and The Weather Channel • Like much of the weather forecasting community, The Weather Channel is exploring ways to include additional measures of uncertainty in the forecast • This is a debated topic at The Weather Channel • One view is that time constraints limit the feasibility of adding probabilistic information to products on the network broadcast • On-camera meteorologists are already constrained by small time slots • Further, the on-camera meteorologist often presents some uncertainty information verbally, so even though it is not in the product itself, it is added when the product is presented on air
Probabilistic forecasting (continued) • Another view holds that The Weather Channel should include additional information about forecast confidence or uncertainty • This view still recognizes the limitations and complexity of adding uncertainty information to the forecast • Maybe the place to begin is with less time-constrained platforms, like weather.com?
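If a platform such as weather.com did present forecast confidence, one simple and purely hypothetical approach would be to translate the spread of the blended model guidance into a plain-language label. The thresholds and wording below are illustrative, not a TWC product definition.

```python
# Illustrative sketch only: derive a plain-language confidence label from the
# spread of the model guidance used in the blend. Thresholds and wording are
# hypothetical, not a TWC product definition.
import statistics

def confidence_label(guidance_values, tight=1.5, loose=4.0):
    spread = statistics.pstdev(guidance_values)
    if spread <= tight:
        return "High confidence"
    if spread <= loose:
        return "Moderate confidence"
    return "Low confidence"

print(confidence_label([70.5, 71.2, 69.8]))   # tightly clustered guidance -> High
print(confidence_label([62.0, 71.0, 78.0]))   # widely scattered guidance -> Low
```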