HURRICANE RESEARCH IN THE ROCKIES: AN OVERVIEW OF RESEARCH TO OPERATIONS ACTIVITIES AT CIRA FOR 2008 / 2009 Andrea Schumacher schumacher@cira.colostate.edu
Outline • SHIPS and LGEM • 2008 performance • 2009 plans • Monte Carlo wind probabilities • Text product problem during Fay • Evaluation of GPCE test • 2009 plans • AMSU Wind Retrievals • 2008 Updates
Outline cont… • Multiplatform Surface Wind Analysis • New product running experimentally in 2009 • HDOBS • Vortex Tracker • Knaff & Zehr Wind-Pressure Relationship • Modifications for operations • TC Formation Probability Product • Overview • Verification • Planned updates • New ideas for improvement
SHIPS and LGEM for 2008 • SHIPS/LGEM predictors same as 2007 • Satellite input same as 2007 • None in LGEM • Atlantic SHIPS: GOES predictors and OHC • East Pacific SHIPS: GOES predictors • 2007 cases added to developmental sample • Includes 1982-2007 for basic model • 1995-2007 for satellite data correction
SHIPS/LGEM Performance in 2008 • Atlantic: Arthur through Laura • East Pacific: Alma through Marie (partial) [intensity error plots: Atlantic, East Pacific]
LGEM/SHIPS Differences • Atlantic • LGEM can better recover when a storm moves from water over land and back over water • Many forecasts of this type in 2008 • East Pacific • LGEM has a large low bias when initial intensity is less than 30 kt • LGEM has a larger low bias than SHIPS in low-shear but marginal-SST environments
2009 Plans for SHIPS/LGEM • Add OHC input to East Pacific SHIPS • Investigate causes of low bias for east Pacific LGEM • Test 2-predictor version of LGEM • Growth rate a function of shear and vertical instability parameter • Instability from entraining parcel model • Persistence included through assimilation of previous storm history up to the forecast time • Test impact of OHC through SST cooling algorithm on vertical instability parameter
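As context for the 2-predictor test, LGEM integrates a logistic growth equation for intensity with a growth rate tied to environmental predictors. The sketch below shows one way a shear-plus-instability growth rate could enter such an equation; the functional form is the generic logistic-growth form and the coefficients, names, and example values are illustrative placeholders, not the fitted operational LGEM values.

```python
import numpy as np

def lgem_forecast(v0_kt, vmpi_kt, shear_kt, instability,
                  hours=120, dt_hr=1.0,
                  k0=0.05, k_shear=-0.001, k_inst=0.002, n=2.5):
    """Integrate dV/dt = kappa * V * [1 - (V/Vmpi)**n], where kappa is a
    linear function of shear and an entraining-parcel instability parameter.
    k0, k_shear, k_inst, and n are placeholder coefficients, not fitted values."""
    steps = int(hours / dt_hr)
    v = np.empty(steps + 1)
    v[0] = v0_kt
    for i in range(steps):
        kappa = k0 + k_shear * shear_kt[i] + k_inst * instability[i]  # per hour
        dvdt = kappa * v[i] * (1.0 - (v[i] / vmpi_kt[i]) ** n)
        v[i + 1] = max(v[i] + dvdt * dt_hr, 0.0)
    return v

# Example: constant environment along the forecast track.
hrs = 120
v = lgem_forecast(45.0, np.full(hrs, 140.0), np.full(hrs, 10.0),
                  np.full(hrs, 1.0), hours=hrs)
print(round(v[-1], 1), "kt at 120 h")
```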
2008 MC Wind Probabilities • Track/intensity error distributions created for the most recent 5 years • 2002-2006 replaced with 2003-2007 • Code modified to calculate input to wind speed probability table • “Capping” function added to improve inland probability estimates • Intensity cannot exceed the maximum value observed as a function of distance inland • Experimental version with track error distribution a function of model spread being tested • Stratify errors by GPCE values
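A sketch of what such an inland cap might look like, with a hypothetical lookup table of maximum observed intensity versus distance inland; the table values and names are placeholders for illustration, not the observed climatology used operationally.

```python
import numpy as np

# Hypothetical cap: maximum intensity (kt) observed at given distances
# inland (km).  Placeholder values for illustration only.
cap_dist_km = np.array([0.0, 50.0, 100.0, 200.0, 400.0])
cap_vmax_kt = np.array([165.0, 120.0, 95.0, 70.0, 45.0])

def cap_inland_intensity(vmax_kt, dist_inland_km):
    """Limit a Monte Carlo realization's intensity to the maximum value
    observed as a function of distance inland (linear interpolation)."""
    if dist_inland_km <= 0.0:          # over water: no cap applied
        return vmax_kt
    cap = np.interp(dist_inland_km, cap_dist_km, cap_vmax_kt)
    return min(vmax_kt, cap)

print(cap_inland_intensity(110.0, 150.0))  # capped realization
```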
GPCE MC Wind Probabilities [panels: Upper GPCE Tercile, Lower GPCE Tercile]
Test Plan for GPCE Version of MC Model • TPC visit by M. DeMaria in July 2008 • Too late to set up parallel MC model on IBM • Alternate test plan • Run GPCE MC model and operational version at CIRA • Run Atlantic cases where initial position is 1000 km or less from U.S. coastline • Sample size = 138 through Ike • Post plots on CIRA web page for qualitative evaluation • Perform quantitative verification at U.S. breakpoints • Cases should be available by end of 2008 season
MC Model for 2009 • Update error distributions for 2004-2008 • Implement GPCE or standard version depending on 2008 performance and NHC evaluation • Also test East/West Pacific cases • Fix rare problem identified by R. Berg/C. Lauer during TS Fay at Daytona Beach • P(50 kt) = 99%, P(34 kt) = 87% (inconsistent: the 50-kt probability should never exceed the 34-kt probability) • Problem arises from sampling the wind model in just 4 quadrants and then azimuthally interpolating, for weak storms with very large RMW and intensity near a probability threshold (50 or 65 kt)
CIRA/NCEP AMSU Update • NOAA-16 coefficients updated to account for the loss of channel 4 (15 August); fixes between March 2008 and that date are questionable because channel 4 was intermittent • Plans continue to have fixes provided from METOP and Aqua for the 2009 hurricane season (May 2009) • Have the retrievals • Working with NCO and NHC to get the METOP and Aqua AMSU data out of the operational BUFR
Multi-platform Tropical Cyclone Surface Wind Analysis • NESDIS product • Satellite-only input (QuikSCAT, ASCAT, AMSU 2-d, GOES high-density winds, IR proxy winds (proxy for flight-level; Mueller et al. 2006)) • Provides 6-hourly tropical cyclone structure estimates • Variational data fitting method • 900 km domain • 4 km (radial) x 10 degree (azimuthal) resolution • Crude quality control accomplished by a three-step process that compares input data to the analyses using increasingly stringent thresholds • Will be run experimentally at NESDIS in 2009 • Providing 2-d analyses • ATCF fixes (RMW, R34, R50, R64, MSLP)
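To make the "variational data fitting" idea concrete, here is a toy sketch (much coarser than the 4 km x 10 degree operational grid) that fits scattered satellite wind speeds on a storm-centered polar grid by minimizing a misfit-plus-smoothness cost function. The grid sizes, weights, synthetic data, and names are assumptions for illustration, not the NESDIS configuration.

```python
import numpy as np
from scipy.optimize import minimize

# Storm-centered polar analysis grid (coarser than the operational grid).
radii = np.arange(25.0, 925.0, 100.0)            # km
azims = np.deg2rad(np.arange(0.0, 360.0, 30.0))  # rad
nr, na = len(radii), len(azims)

def cost(field_flat, obs_r, obs_az, obs_spd, weight_smooth=0.1):
    """Misfit to observations plus a simple azimuthal smoothness penalty."""
    field = field_flat.reshape(nr, na)
    ir = np.clip(np.searchsorted(radii, obs_r), 0, nr - 1)
    ia = np.round(obs_az / (2 * np.pi / na)).astype(int) % na
    misfit = np.sum((field[ir, ia] - obs_spd) ** 2)
    smooth = np.sum((field - np.roll(field, 1, axis=1)) ** 2)
    return misfit + weight_smooth * smooth

# Synthetic "satellite" wind speeds (kt) at scattered polar positions.
rng = np.random.default_rng(1)
obs_r = rng.uniform(25.0, 900.0, 300)
obs_az = rng.uniform(0.0, 2 * np.pi, 300)
obs_spd = 90.0 * np.exp(-obs_r / 300.0) + rng.normal(0.0, 3.0, 300)

res = minimize(cost, np.full(nr * na, 30.0), args=(obs_r, obs_az, obs_spd),
               method="L-BFGS-B")
analysis = res.x.reshape(nr, na)
print("fit converged:", res.success)
```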
Example: Sep 12 2008 00Z Multi-platform TC Surf. Wind Analysis
Initiatives to Utilize HDOBS for Improved Surface Wind Analyses • Increase analysis resolution (1 km x 5 deg) • Bring in surface and dropwindsonde data (work with H*Wind developers) • Possible real-time vortex tracker
Real-time Vortex Tracker • Uses HDOBS, operational best track, and OFCI • Maximizes the tangential wind during the period of HDOBS • Minimization accomplished by a downhill simplex algorithm that uses a quadratic representation of dx and dy • Constrained by a user-supplied time increment • Produces a cubic spline for latitude and longitude as a function of time • Intervals of the spline are user-determined
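A minimal sketch of how the two pieces described above could be wired together with SciPy: a Nelder-Mead (downhill simplex) search for the center offset that maximizes the azimuthal-mean tangential wind, followed by cubic splines of latitude and longitude versus time. Function and variable names are illustrative assumptions, not the operational tracker's.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.interpolate import CubicSpline

def mean_tangential_wind(center, x, y, u, v):
    """Azimuthal-mean tangential wind about a trial center offset (km)."""
    dx, dy = x - center[0], y - center[1]
    r = np.hypot(dx, dy)
    # Tangential (counterclockwise) component about the trial center.
    vt = (-dy * u + dx * v) / np.maximum(r, 1e-6)
    return vt.mean()

def find_center(x, y, u, v, first_guess=(0.0, 0.0)):
    """Downhill-simplex search for the offset that maximizes the mean
    tangential wind (i.e., minimizes its negative)."""
    res = minimize(lambda c: -mean_tangential_wind(c, x, y, u, v),
                   first_guess, method="Nelder-Mead")
    return res.x  # (dx, dy) in km relative to the first-guess center

def track_spline(times_hr, lats, lons, out_interval_hr=1.0):
    """Cubic splines of latitude and longitude versus time, evaluated at a
    user-specified interval (times_hr must be strictly increasing)."""
    t_out = np.arange(times_hr[0], times_hr[-1] + 1e-9, out_interval_hr)
    return t_out, CubicSpline(times_hr, lats)(t_out), CubicSpline(times_hr, lons)(t_out)
```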
Example Dean 2007 HDOBS 17 Sept 22 UTC - 18 Sept 01 UTC
Modifying the Knaff & Zehr wind-pressure relationship for operations • Work with Joe Courtney (BOM) • Courtney and Knaff (2009), in preparation • Environmental pressure: Penv = POCI + 2 • Size (S) has been related to R34: • V500 = R34/9 - 3, where R34 is the average value of the nonzero 34-kt wind radii estimates (i.e., NE, SE, SW, NW) • V500c (i.e., climatology) is given in Knaff and Zehr (2007) • S = V500/V500c, with a minimum value of 0.4 • New ΔP equation for latitudes equatorward of 18 degrees
Knaff & Zehr wind-pressure relationship cont… • One ΔP equation for use equatorward of 18 degrees latitude, another for use poleward of 18 degrees latitude • Where S is size, φ is latitude, Vsrm1 = Vmax - 1.5c^0.63, and c is the translation speed in kt • Finally, MSLP = ΔP + Penv
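As a reading aid, the sketch below chains together the quantities defined on these two slides. The two latitude-dependent ΔP regressions themselves (shown as equations on the slide but not reproduced here) are left as an input, and the function names are illustrative, not from the operational code.

```python
import numpy as np

def v500_from_r34(r34_quadrants_nm):
    """V500 = R34/9 - 3, with R34 the average of the nonzero 34-kt wind radii
    estimates (NE, SE, SW, NW)."""
    r34 = np.mean([r for r in r34_quadrants_nm if r > 0])
    return r34 / 9.0 - 3.0

def size_parameter(v500, v500c):
    """S = V500/V500c, with a minimum value of 0.4.  V500c is the
    climatological value from Knaff and Zehr (2007), not reproduced here."""
    return max(v500 / v500c, 0.4)

def storm_relative_vmax(vmax_kt, c_kt):
    """Vsrm1 = Vmax - 1.5 * c**0.63, with c the translation speed in kt."""
    return vmax_kt - 1.5 * c_kt ** 0.63

def mslp(delta_p_hpa, poci_hpa):
    """MSLP = dP + Penv, with Penv = POCI + 2."""
    return delta_p_hpa + (poci_hpa + 2.0)

# The latitude-dependent dP regressions (one used equatorward of 18 degrees,
# one poleward) are given in Courtney and Knaff (2009) and are not reproduced
# on these slides, so dP is left as an input to mslp() above.
```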
Motivation and Approach • Motivation • The forecasting of TC formation and intensity change has been identified as a high-priority area of need by NOAA • Goal: To develop an objective, probabilistic forecast guidance product for TC formation • Approach • Generalize the linear discriminant analysis (LDA) approach of Hennon and Hobgood (2003) and Perrone and Lowe (1986) • Instead of using LDA to discriminate between developing and non-developing tropical cloud clusters, combine both environmental and convective parameters into one algorithm • Combine the deterministic LDA algorithm with occurrence frequencies from the dependent dataset to create 24-hour TC formation probabilities
Approach • Use what we already know about TC formation (i.e., environmental and convective parameters) • Use the statistical process of linear discriminant analysis (Perrone & Lowe 1986, Hennon & Hobgood 2003, Knaff et al. 2008) • Compute 24-hour probability of TC formation over all 5° x 5° lat/lon grid boxes in domain • “A Needle in a Haystack”: ratio of TC formation to non-formation points ~ 1:2000; maximum climatological formation probability ~1.8% (E. Pacific)
Data • NCEP Global Model Analyses • Reanalysis 1995-1999 (2.5° grid) • Operational Analyses 2000-2005 (1.0° grid) • Geostationary Satellite Water Vapor Imagery • GOES-E 1995-2005 • GOES-W 1998-2005 • GMS-5 / GOES-9 / MTSAT-1R 2000-2005 • NHC/DOD Best Tracks 1949-2005 • Atlantic, E. Pacific, Central Pacific & W. Pacific • Subtropical and extratropical cases excluded • Unnamed depressions included since 1989
Algorithm Overview • Input parameter values calculated over each 5° x 5° sub-region in domain • Sub-regions for which TC formation is highly unlikely are screened out of the dataset (e.g., 100% over land, large vertical shear) • Linear discriminant analysis is used to discriminate between TC formation and non-formation cases based on parameter values • TC formation occurrence frequencies are used to translate LDA function values to probabilities
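A condensed sketch of that four-step flow, assuming the gridded predictors are already in NumPy arrays. The screening thresholds, array names, and binned lookup are illustrative placeholders, not the operational settings.

```python
import numpy as np

def tc_formation_probability(predictors, land_frac, shear_kt,
                             lda_weights, prob_bins, prob_values):
    """predictors: (n_boxes, n_predictors) array, one row per 5x5 deg
    sub-region.  Returns a 24-h TC formation probability for each box."""
    prob = np.zeros(predictors.shape[0])

    # Step 1: screen out sub-regions where formation is highly unlikely
    # (illustrative thresholds: entirely over land, or very large shear).
    keep = (land_frac < 1.0) & (shear_kt < 50.0)

    # Step 2: project the retained predictors onto the LDA discriminant direction.
    lda_value = predictors[keep] @ lda_weights

    # Step 3: translate discriminant values to probabilities via occurrence
    # frequencies from the dependent dataset (a binned lookup table).
    idx = np.clip(np.digitize(lda_value, prob_bins) - 1, 0, len(prob_values) - 1)
    prob[keep] = prob_values[idx]
    return prob
```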
Linear Discriminant Analysis • Simple schematic: 2 groups, 2 attributes • LDA seeks to find the vector a (i.e., direction) that maximizes the separation of the two means, in standard deviation units, when the data are projected onto a • Mathematically: LDA maximizes λ = [aᵀ(μ1 - μ2)]² / (aᵀΣa) by solving for a ∝ Σ⁻¹(μ1 - μ2), where μ1 and μ2 are the group means and Σ is the pooled within-group covariance *Schematics from UCLA DOE, http://www.doe-mbi.ucla.edu/~parag/multivar/dawords.htm
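A compact sketch of that computation for two groups, using only NumPy; the toy "developing" and "non-developing" clusters are synthetic illustration data, not the TCFP training sample.

```python
import numpy as np

def fisher_lda_direction(x1, x2):
    """Discriminant direction a ~ inv(Sigma) (mu1 - mu2) for two groups,
    where Sigma is the pooled within-group covariance."""
    mu1, mu2 = x1.mean(axis=0), x2.mean(axis=0)
    n1, n2 = len(x1), len(x2)
    sigma = ((n1 - 1) * np.cov(x1, rowvar=False) +
             (n2 - 1) * np.cov(x2, rowvar=False)) / (n1 + n2 - 2)
    a = np.linalg.solve(sigma, mu1 - mu2)
    return a / np.linalg.norm(a)

# Toy example: two clusters of "developing" vs "non-developing" points.
rng = np.random.default_rng(0)
dev = rng.normal([2.0, 1.0], 0.5, size=(50, 2))
non = rng.normal([0.0, 0.0], 0.5, size=(1000, 2))
print("discriminant direction:", fisher_lda_direction(dev, non))
```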
Algorithm-Derived LDA Coefficients [coefficient table; largest contributors indicated]
Verification [panels: Reliability Diagrams, ROC Skill Scores]
2008 Season Performance At A Glance [panels: Tropical Atlantic, Caribbean, E. Pacific]
[Case examples: Hurricane Dolly, TS Edouard, Hurricane Kyle, Hurricane Ike & TS Josephine]
Future Work • Expand domain to include S. Hemisphere & Indian Ocean • Extend forecast period from 24 h to 48 h+ • Use GFS forecast fields • Analyze global water vapor strip to identify upstream predictors, particularly convective signatures associated with tropical waves (Frank and Roundy 2006) • Develop a disturbance-centric algorithm • TAFB invest locations and T-Numbers (D. Brown) • Include current TCFP predictors, SHIPS predictors, TPW [E.g., full-domain water vapor strip, 2 July 18Z - 3 July 15Z 2008]