A Hindcast Test on the Anomaly Analog Prediction System
Meteorology 485
Matt Steinbugl and Lisa Murphy
April 2, 2004
Project Goal: To test several aspects of an objective analog forecast using varying lengths of lead time for surface-based anomalies. Select difficult cases, mostly marked by a large change in the anomaly fields from the ‘month before’ to the predicted month.
Selected months to predict: October 2002, January 2000, March 1992, January 1990, September 1988, January 1987, January 1982, December 1983, January 1977, August 1977, March 1971, June 1966. Also shown are the month-before anomalies, to assess the role that persistence plays in analog forecasting.
Test One – A Single Month Lead • Gather the surface temperature and precipitation anomalies for the month prior to the one being predicted • Use the analog-mapper page to draw the precipitation and temperature anomalies of that month with a 50% downgrade of the anomalies • Capture the analog years • Use only the years that have a threshold value greater than 50% and occurred prior to the predicted year • Insert the analog years into the CDC climate divisional map interface to produce monthly temperature and precipitation anomalies for the following month (the predicted one) • Compare the CDC “forecast” maps to what actually occurred that month.
Test One – A Single Month Lead • In the analog mapper, a 0.5 standard deviation value was used for the anomalies • The Palmer Drought Index was not incorporated into these results
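As a rough illustration of the Test One selection step, the Python sketch below filters candidate analog years by the 50% match threshold and by whether the year occurred before the one being predicted. The data layout, scores, and function name are hypothetical; in practice this thresholding is done interactively through the analog-mapper page.

# Minimal sketch of the Test One analog-year filter (hypothetical data and names).
# Assumes each candidate year carries a pattern-match score between 0 and 1; keep
# only years above the 0.5 threshold that occurred before the year being predicted.
def select_analog_years(match_scores, predicted_year, threshold=0.5):
    """match_scores: dict mapping candidate year -> match score (assumed format)."""
    return sorted(
        year for year, score in match_scores.items()
        if score > threshold and year < predicted_year
    )

# Example: predicting January 1977 (scores are made up for illustration)
scores = {1951: 0.62, 1958: 0.48, 1963: 0.71, 1969: 0.55, 1979: 0.80}
print(select_analog_years(scores, predicted_year=1977))  # [1951, 1963, 1969]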
Data Source: Analog-mapper • http://analog1.met.psu.edu/ross/climdivmap_withlags/analogmapper.html • http://analog1.met.psu.edu/ross/plotclimdata_withlags/plotmeanclimdata.html
Data Source: • The Climate Diagnostics Center website was used to retrieve and display temperature and precipitation anomalies for the month that was being “predicted” • www.cdc.noaa.gov
Test One – Single Month Lead: month-before, composite, and verification maps for Jan 1977 and Jan 1983.
Test One – Single Month Lead: month-before, composite, and verification maps for Jan 1987 and Mar 1971.
Test One – Single Month Lead: month-before, composite, and verification maps for Mar 1992 and Dec 1983.
Test One – Single Month Lead: month-before, composite, and verification maps for Aug 1977 and Jan 2000.
Test One – Single Month Lead: month-before, composite, and verification maps for Jan 1990. The ‘flip’ year proved to be the worst case for predicting the following month.
Test Two Method • Only anomalies that were 1 standard deviation or larger were selected, but each was assigned a value of 0.5 • Acquired the years that were common to the three sets of analog years (a sketch of this intersection follows the lead descriptions below) and inserted them into CDC's climate divisional map to find the next month's temperature and precipitation anomalies
Test Two – Multiple Month Lead • Using Jeremy Ross's website to access anomalies for up to six months prior to the forecast month, we combined a variety of anomalies to make a new “mixed” prediction
One Month • Same as the single-month method; however, this time only two anomalies could be chosen from the temperature and precipitation fields to acquire analog years
Three Month • A three-month lead allowed the selection of only two anomalies from the temperature and precipitation fields to acquire analog years
Six Month • The six-month lead allowed only one anomaly, from either the temperature or precipitation field, to acquire the analog years
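A minimal sketch of the strict “mixed” selection described above: each lead (one-, three-, and six-month) yields its own set of analog years, and only the years common to all three sets are kept. The year lists below are made up for illustration.

# Hypothetical analog-year sets from the one-, three-, and six-month leads
one_month = {1951, 1957, 1963, 1969, 1975}
three_month = {1951, 1963, 1964, 1969}
six_month = {1951, 1960, 1963, 1972}

# Strict mixed technique: keep only the years common to all three sets
common_years = one_month & three_month & six_month
print(sorted(common_years))  # [1951, 1963]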
Test Two: Multiple Month Lead – month-before, composite, and verification maps for Jan 1987 and Jan 1977.
Test Two: Multiple Month Lead – month-before, composite, and verification maps for Dec 1983 and Mar 1992.
Test Two: Multiple Month Lead – month-before, composite, and verification maps for Jan 1983 and Jan 2000.
Test Two: Multiple Month Lead – month-before, composite, and verification maps for Aug 1977 and Mar 1971.
Test Two: Multiple Month Lead – month-before, composite, and verification maps for Jan 1990. The ‘flip month’ proved again to be the most difficult to predict.
Challenges of Test Two Method • There were too few years common to all 3 sets for the mixed technique • Instead, years that were common to at least 2 of the sets were used • Some cases that worked well for the one-month lead technique did not work for the mixed-lead technique, and vice versa
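A sketch of the relaxed criterion, under the same kind of assumed per-lead year sets as above: a year is kept if it appears in at least two of the three sets (year lists again illustrative).

from collections import Counter

# Hypothetical analog-year sets for the one-, three-, and six-month leads
lead_sets = [
    {1951, 1957, 1963, 1969},  # one-month lead
    {1951, 1963, 1964},        # three-month lead
    {1957, 1960, 1963, 1972},  # six-month lead
]

# Relaxed mixed technique: keep years found in at least 2 of the 3 sets
counts = Counter(year for s in lead_sets for year in s)
relaxed_years = sorted(year for year, n in counts.items() if n >= 2)
print(relaxed_years)  # [1951, 1957, 1963]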
Assessments: • Jan 1977 and Jan 1983 did well for both techniques • Mar 1971 did worse in the mixed technique (Mar 1971 showed too much warming in the NW in the first method and too much warming everywhere in the mixed technique) • Jan 1983 showed too much cooling in the West in both methods • Dec 1983 and Jan 2000 did worse in the first test (Jan 2000 showed too much cooling in the first method and too much warming on the East coast in the mixed technique; Dec 1983 showed too much warming in the SE) • The technique had serious problems forecasting a flip month: Jan 1990, Aug 1977, and also Sept 1988 and Jun 1966 did poorly
Future Work…
1. Automate and quantify
2. Create a moveable ensemble of analogs based on flexible anomaly fields
3. Design an interface that will combine 10-day GFS forecasts with creating the best analog
4. Downscale the technique to draw the most information out