  1. Global Wind Forecasts from Improved Wind Analysis via the FSU-Superensemble, Suggesting Possible Impacts from Wind-LIDAR • Adam J. O’Shay and T. N. Krishnamurti • FSU Meteorology • 8 February 2001

  2. Agenda • Impetus and motivation • Superensemble (SE) methodology • Data selection and data-assimilation issues • Forecast results • Summary

  3. Motivation • Space-borne LIDAR could provide global modelers with wind data-sets useful for analysis in models • Provided with an additional wind analysis, the SE could greatly improve its global wind forecasts (Image: http://www-lite.larc.nasa.gov/)

  4. Motivation - Con’t • To show the benefit of improved wind analyses, the FSU SE was run using several multi-models’ analyses as the ‘observed’ training data, with the ECMWF analysis serving as ground truth • Typically the FSU SE uses the FSU (ECMWF-based) analyses as the training data-set, as it has been shown to be a superior initial state among global NWP models - instances where ‘superior initial states’ do not give improved forecasts do occur, although they are rare

  5. FSU Superensemble Methodology • The time record is divided into two periods • Training period - each model’s variables are regressed towards the observed data; multiple linear regression provides weights for the individual models and forecast time periods • Forecast period (test phase) - the weights are applied to the forecasts, yielding the SE forecast for each day/time period (Krishnamurti et al. 2000)
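For reference, the construction described above can be summarized, roughly in the notation of Krishnamurti et al. (2000), at each grid point and variable: O(t) is the verifying analysis, O-bar its training-period mean, F_i(t) the i-th member-model forecast, F_i-bar its training-period mean, and the weights a_i come from the least-squares fit over the training period (a sketch of the standard formulation, not a verbatim reproduction of the slides):

```latex
S(t) = \bar{O} + \sum_{i=1}^{N} a_i \,\bigl(F_i(t) - \bar{F}_i\bigr),
\qquad
\{a_i\} = \arg\min_{a} \sum_{t \in \mathrm{training}} \bigl(S(t) - O(t)\bigr)^2
```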

  6. Superensemble Methodology - Con’t • This is a multiple-linear-regression based ‘least-squares minimization’ procedure in which the algorithm performs collective bias removal • Individual-model bias removal assigns a weight of 1.0 to every (bias-removed) model, which results in the inclusion of poor models - collective bias removal yields statistically better forecasts
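A minimal sketch of the weight computation and its application at a single grid point, assuming hypothetical NumPy arrays of training forecasts and verifying analyses (the function names and data layout are illustrative, not from the slides):

```python
import numpy as np

def train_superensemble_weights(forecasts, analysis):
    """Collective bias removal via multiple linear regression.

    forecasts : (T, N) array - N member-model forecasts over T training times
    analysis  : (T,)   array - verifying 'observed' analyses (e.g. ECMWF)
    Returns the per-model weights a_i plus the means needed at forecast time.
    """
    fbar = forecasts.mean(axis=0)      # training-period mean of each model
    obar = analysis.mean()             # training-period mean of the analysis
    F = forecasts - fbar               # model anomalies about their own means
    O = analysis - obar                # observed anomalies
    # Least-squares fit: minimize sum_t (obar + F @ a - analysis)^2
    a, *_ = np.linalg.lstsq(F, O, rcond=None)
    return a, fbar, obar

def superensemble_forecast(new_forecasts, a, fbar, obar):
    """Test phase: apply the trained weights to new multi-model forecasts."""
    return obar + (new_forecasts - fbar) @ a
```

By contrast, the individual bias-removal approach mentioned on the slide would fix the weights (at 1.0 per bias-removed member) rather than fit them, which is the distinction being drawn.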

  7. Superensemble Methodology - Con’t • An SE forecast was computed for each forecast length (48 hr and 72 hr) over the global domain for each of the different training runs

  8. Data Selection • Time period - 1 January 2000 to 9 April 2000 • 80-day training period, with 20 days of forecasts • U- and V-winds at 200 and 850 mb were examined - results at 200 mb are presented here in the interest of time • Several global models, in addition to the ECMWF analysis, were chosen based on continuity of the data record (interpolated to 1º x 1º where necessary) • Error calculations were performed over the global domain
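For illustration, one plausible way to compute the global-domain error on a 1º x 1º grid is an area-weighted RMS error; the cos(latitude) weighting here is an assumption about the error calculation, not something stated on the slides:

```python
import numpy as np

def global_rms_error(forecast, analysis, lats):
    """Area-weighted RMS error of one field (e.g. 200-mb u-wind) on a lat-lon grid.

    forecast, analysis : (nlat, nlon) gridded fields
    lats               : (nlat,) latitudes in degrees
    """
    w = np.cos(np.deg2rad(lats))[:, None] * np.ones_like(forecast)  # area weights
    err2 = (forecast - analysis) ** 2
    return float(np.sqrt(np.sum(w * err2) / np.sum(w)))
```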

  9. Data-Assimilation Point • Not all global NWP models use the same data-assimilation scheme • The multi-models examined here use the following data-assimilation schemes: • Perhaps the variation in each model’s data assimilation accounts largely for the variation in forecasts that the SE exhibits when a model other than ECMWF is used as the training analysis * This is NOT to say it is the only reason!

  10. Data Assimilation - Con’t • Rabier et al. (2000) and Swanson et al. (2000) found that 4-D VAR has shown overall improvement in global modeling, particularly when coupled with a high-resolution model (e.g. ECMWF -- T200+) • Given the above, models using data-assimilation schemes OTHER than 4-D VAR were used as ‘training’ for the FSU SE

  11. RMS ERRORS 48HR

  12. RMS ERRORS 72HR

  13. R(OBS,SE) = 0.70

  14. R(OBS,SE) = 0.78

  15. R(Obs,ETSE) ~ 0.5

  16. R(Obs,ETSE) ~ 0.85
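Slides 13-16 quote correlation coefficients between the observed analysis and the SE/ETSE forecasts. A minimal sketch of how such a correlation over the grid might be computed; a plain Pearson pattern correlation is assumed here, which may differ from the exact statistic used in the presentation:

```python
import numpy as np

def pattern_correlation(forecast, analysis):
    """Pearson correlation R(Obs, Forecast) between two gridded fields."""
    f = forecast.ravel() - forecast.mean()
    o = analysis.ravel() - analysis.mean()
    return float(f @ o / np.sqrt((f @ f) * (o @ o)))
```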

  17. What is going on here? • Each of the multi-models uses a different data-assimilation scheme -- training the SE with each and obtaining the final SE product results in forecasts that differ from those obtained when the ECMWF-based analysis is used as the training data

  18. RMS Errors of Various Models (Courtesy of http://sgi62.wwb.noaa.gov:8080/STATS/STATS.html)

  19. Summary • The SE shows a degradation in skill when the training data-set originates from a model with a poorer initial state (i.e. the multi-models with their various data-assimilation schemes) • The heart of the SE is the training data-set - the introduction of LIDAR winds into the data-assimilation scheme at FSU may lead to global wind forecast skill far exceeding that of the present day
