Dynamical Climate Reconstruction. Greg Hakim, University of Washington. Sebastien Dirren, Helga Huntley, Angie Pendergrass, David Battisti, Gerard Roe
Plan • Motivation: fusing observations & models • State estimation theory • Results for a simple model • Results for a less simple model • Optimal networks • Plans for the future
Motivation • Range of approaches to climate reconstruction. • Observations: • time-series analysis; multivariate regression • no link to dynamics • Models • spatial and temporal consistency • no link to observations • State estimation (this talk) • few attempts thus far • stationary statistics
Goals • Test new method • Reconstruct last 1-2K years • Unique dataset for climate variability? • E.g. hurricane variability. • E.g. rational regional downscaling (hydro). • Test network design ideas • Where to take highest impact new obs?
Medieval warm period temperature anomalies IPCC Chapter 6
Climate variability: a qualitative approach. Proxy records include: GRIP δ18O (temperature); GISP2 K+ (Siberian High); north Swedish tree-line limit shift; sea surface temperature from planktonic foraminifera; hematite-stained grains in sediment cores (ice rafting); varve thickness (westerlies); cave speleothem isotopes (precipitation). Mayewski et al., 2004
Statistical reconstructions • “Multivariate statistical calibration of multiproxy network” (Mann et al. 1998) • Requires stationary spatial patterns of variability (figure from Mann et al. 1998)
Paleoclimate modeling IPCC Chapter 6
An attempt at fusion: multivariate regression. Data Assimilation through Upscaling and Nudging (DATUN), Jones and Widmann (2003)
Fusion Hierarchy • Nudging: no error estimates • Statistical interpolation (fixed statistics) • 3DVAR (fixed statistics; operational NWP) • 4DVAR (operational NWP) • Kalman filters (today’s talk) • Kalman smoothers. The curse of dimensionality looms large in geoscience.
Gaussian Update: analysis = background + weighted new observational information. $x_a = x_b + K(y - Hx_b)$, with Kalman gain matrix $K = BH^T(HBH^T + R)^{-1}$. The analysis error covariance $A = (I - KH)B$ is “smaller than” the background covariance $B$.
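As a concrete reference, here is a minimal numpy sketch of the Gaussian (Kalman) update above; the function and variable names are illustrative, not from the talk.

```python
import numpy as np

def kalman_update(xb, B, y, H, R):
    """Standard Gaussian (Kalman) update: analysis = background + weighted obs info.

    xb : background state, shape (n,)
    B  : background error covariance, shape (n, n)
    y  : observations, shape (p,)
    H  : observation operator, shape (p, n)
    R  : observation error covariance, shape (p, p)
    """
    # Kalman gain: how strongly the new observational information is weighted
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    xa = xb + K @ (y - H @ xb)            # analysis state
    A = (np.eye(len(xb)) - K @ H) @ B     # analysis error covariance ("<" background)
    return xa, A
```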
Ensemble Kalman Filter. Crux: use an ensemble of fully non-linear forecasts to model the statistics of the background (expected value and covariance matrix). Advantages • No a priori assumption about covariance; state-dependent corrections. • Ensemble forecasts proceed immediately without perturbations.
Summary of Ensemble Kalman Filter (EnKF) Algorithm (1) Ensemble forecast provides background estimate & statistics (B) for new analyses. (2) Ensemble analysis with new observations. (3) Ensemble forecast to arbitrary future time.
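A minimal sketch of the analysis step, using the perturbed-observation EnKF as one common variant (the talk does not specify which flavor is used); all names are illustrative.

```python
import numpy as np

def enkf_analysis(Xb, y, H, R, rng=None):
    """Perturbed-observation EnKF analysis step.

    Xb : background ensemble, shape (n_state, n_ens)
    y  : observations, shape (p,)
    H  : observation operator, shape (p, n_state)
    R  : observation error covariance, shape (p, p)
    """
    rng = np.random.default_rng(0) if rng is None else rng
    n, ne = Xb.shape
    Xp = Xb - Xb.mean(axis=1, keepdims=True)        # ensemble perturbations
    B = Xp @ Xp.T / (ne - 1)                        # sample background covariance
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)    # ensemble-based Kalman gain
    # Each member assimilates observations perturbed consistently with R
    Yp = rng.multivariate_normal(np.zeros(len(y)), R, size=ne).T
    return Xb + K @ (y[:, None] + Yp - H @ Xb)
```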
Paleo-assimilation: dynamical climate reconstruction • Observations are often time-averaged • e.g., gauge precipitation, wind, ice cores • Sparse networks • Issue: how to combine time-averaged observations with instantaneous model states?
Issue with Traditional Approach Problem: Conventional Kalman filtering requires covariance relationships between time-averaged observations and instantaneous states. High-frequency noise in the instantaneous states contaminates the update. Solution: Only update the time-averaged state.
Algorithm (see the code sketch following this list) 1. Time-average the background ensemble 2. Compute the model estimate of the time-averaged observations 3. Compute perturbations from the time mean 4. Update the time mean with the existing EnKF 5. Add the updated mean to the unmodified perturbations 6. Propagate the model states 7. Recycle with the new background states
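The seven steps might look roughly like the sketch below, assuming an `enkf_analysis`-style update (such as the one sketched earlier) that operates on ensemble columns and a user-supplied model propagator; all names are placeholders.

```python
import numpy as np

def time_average_assimilation(ensemble_traj, y_avg, H, R, propagate, enkf_analysis):
    """Sketch of the seven-step time-averaged update.

    ensemble_traj : background trajectories over one cycle, shape (n_time, n_state, n_ens)
    y_avg         : time-averaged observations, shape (p,)
    H, R          : obs operator and error covariance for the time-mean state
    propagate     : advances an ensemble of instantaneous states to the next cycle
    enkf_analysis : EnKF update routine (e.g. the sketch above)
    """
    # 1. Time-average of the background ensemble
    Xbar = ensemble_traj.mean(axis=0)                     # (n_state, n_ens)
    # 2. The model estimate of the time-averaged obs is H @ Xbar (formed inside the EnKF)
    # 3. Perturbations of each instantaneous state from its own time mean
    Xprime = ensemble_traj - Xbar[None, :, :]
    # 4. Update only the time mean with the existing EnKF
    Xbar_a = enkf_analysis(Xbar, y_avg, H, R)
    # 5. Add the updated mean back to the unmodified high-frequency perturbations
    analysis_traj = Xprime + Xbar_a[None, :, :]
    # 6. Propagate the final-time analysis states forward with the model
    next_background = propagate(analysis_traj[-1])
    # 7. Recycle: the propagated states become the next cycle's background
    return Xbar_a, next_background
```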
Illustrative Example, Dirren & Hakim (2005) • Model (adapted from Lorenz & Emanuel 1998): a linear combination of fast (“high-freq.”) & slow (“low-freq.”) processes • The LE model is a scalar discretized around a latitude circle • It has elements of atmospheric dynamics: chaotic behavior, linear waves, damping, forcing (sketched below)
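Lorenz & Emanuel (1998) is the familiar “Lorenz-96” system; a minimal sketch of its tendency and a time step follows. The fast/slow linear combination used by Dirren & Hakim is not reproduced here, since its weighting is not given on the slide.

```python
import numpy as np

def lorenz96_tendency(x, forcing=8.0):
    """Lorenz & Emanuel (1998): a scalar discretized around a latitude circle,
    with advection-like nonlinearity, linear damping, and constant forcing."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def rk4_step(x, dt=0.05, forcing=8.0):
    """Advance the state one step with fourth-order Runge-Kutta."""
    k1 = lorenz96_tendency(x, forcing)
    k2 = lorenz96_tendency(x + 0.5 * dt * k1, forcing)
    k3 = lorenz96_tendency(x + 0.5 * dt * k2, forcing)
    k4 = lorenz96_tendency(x + dt * k3, forcing)
    return x + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
```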
RMS error of instantaneous states (dashed: climatology). Instantaneous states have large errors (comparable to climatology), due to the lack of observational constraint.
Improvement: percentage of RMS errors vs. averaging time of the state variable (curves for the RMS over all means, the total state variable, observation uncertainty, and climatology uncertainty). The time-averaged assimilation constrains the signal at higher frequencies than the observations themselves!
A less simple model, Helga Huntley (U. Delaware) • QG “climate model” • Radiative relaxation to an assumed temperature field • Mountain in the center of the domain • Truth simulation • Rigorous error calculations • 100 observations (50 surface & 50 tropopause) • Gaussian errors • Range of time averages (a sketch of the synthetic-observation setup follows)
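A sketch of how time-averaged synthetic observations with Gaussian errors might be drawn from the truth simulation; the observation indices and error level are placeholders, not values from the experiment.

```python
import numpy as np

def synthetic_obs(truth_traj, obs_indices, sigma_o, rng=None):
    """Time-averaged synthetic observations drawn from a truth run.

    truth_traj  : truth states over one averaging window, shape (n_time, n_state)
    obs_indices : indices of observed grid points (placeholder for the 50+50 network)
    sigma_o     : observation error standard deviation (placeholder value)
    """
    rng = np.random.default_rng(1) if rng is None else rng
    time_mean = truth_traj.mean(axis=0)                   # time average over the window
    noise = sigma_o * rng.standard_normal(len(obs_indices))
    return time_mean[obs_indices] + noise                 # obs = truth mean + Gaussian error
```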
Average Spatial RMS Error Ensemble used for control
Implications • The state is well constrained by a few noisy observations. • Forecast error saturates at climatology for tau ~ 30. • For longer averaging times, the model adds little. • Equally good results can be obtained by assimilating the observations with an ensemble drawn from climatology (no model runs required)!
Changing σo (Observation Error) • Previously: σo = 0.27 for all averaging times τ. • Now: σo ≈ σc/3 (a third of the control error).
Optimal Observation Locations • Rather than using random networks, can we devise a strategy to optimally site new observations? • Yes: choose locations with the largest impact on a metric of interest. • New theory based on ensemble sensitivity (Hakim & Torn 2005; Ancell & Hakim 2007; Torn & Hakim 2007) • Here, the metric is the projection coefficient for the first EOF.
Ensemble Sensitivity • Given a metric J, find the observation that most reduces its uncertainty (ensemble variance). • Find a second observation conditional on the first. • Sketch of theory (let x denote the state): analysis covariance $A = (I - KH)B$; change in the metric given a change in the state, $\delta J = (\partial J/\partial x)^T \delta x + O(\delta x^2)$; metric variance $\sigma_J^2 = (\partial J/\partial x)^T A \,(\partial J/\partial x)$.
Sensitivity + State Estimation • Estimate the variance change for the i’th observation: $\delta\sigma_J^2 = (\partial J/\partial x)^T (A_i - B)\,(\partial J/\partial x)$. • Kalman filter theory gives $A_i = (I - K_i H_i)B$, where $K_i = BH_i^T(H_i B H_i^T + R_i)^{-1}$. • Given $\delta\sigma_J^2$ at each point, find the largest value.
Ensemble Sensitivity (cont’d) • If H selects a specific location $x_i$, this all simplifies very nicely. • For the first observation, the variance reduction (following standard ensemble-sensitivity theory) is $\delta\sigma_J^2 = -\,\mathrm{cov}(J, x_i)^2 / (\mathrm{var}(x_i) + \sigma_o^2)$. • For the second observation, the same expression is evaluated with ensemble statistics updated by assimilation of the first observation. • Etc.
Ensemble Sensitivity (cont’d) • In fact, with a few more calculations one can derive a recursive formula that requires evaluating just k+3 lines (one covariance vector plus (k+6) entry-wise multiplications/divisions/additions/subtractions) for the k’th point (see the sketch below).
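A greedy selection loop in the spirit of the slides: at each step, pick the point whose hypothetical observation most reduces the ensemble variance of J, condition the ensemble on that observation, and repeat. The variance-reduction and update formulas follow standard ensemble-sensitivity/serial-EnKF algebra rather than the exact recursion mentioned above; all names are illustrative.

```python
import numpy as np

def select_obs_locations(X, J, sigma_o, n_pick):
    """Greedy optimal-observation selection from ensemble sensitivity.

    X       : ensemble of states, shape (n_state, n_ens)
    J       : ensemble of metric values (e.g. first-EOF projection), shape (n_ens,)
    sigma_o : assumed observation error standard deviation
    n_pick  : number of locations to select
    """
    X = X - X.mean(axis=1, keepdims=True)      # work with perturbations
    J = J - J.mean()
    ne = X.shape[1]
    picks = []
    for _ in range(n_pick):
        cov_Jx = X @ J / (ne - 1)               # cov(J, x_i) at every grid point
        var_x = (X * X).sum(axis=1) / (ne - 1)  # ensemble variance at every grid point
        # Expected reduction of var(J) from observing point i with error sigma_o
        reduction = cov_Jx**2 / (var_x + sigma_o**2)
        i = int(np.argmax(reduction))
        picks.append(i)
        # Condition the ensemble on assimilating that observation
        # (serial square-root update of the perturbations and of the metric)
        denom = var_x[i] + sigma_o**2
        alpha = 1.0 / (1.0 + np.sqrt(sigma_o**2 / denom))
        gain = (X @ X[i] / (ne - 1)) / denom    # gain for every state variable
        J = J - alpha * (cov_Jx[i] / denom) * X[i]
        X = X - alpha * np.outer(gain, X[i])
    return picks
```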
Results for tau = 20 First EOF
Results for tau = 20 • The ten most sensitive locations (without accounting for prior assimilations) • σo = 0.10
Results for tau = 20 • The four most sensitive locations, accounting for previously found pts.
Results for tau = 20; σo = 0.10. Note the decreasing effect on the variance.
Control Case: No Assimilation. Avg Error = 5.4484
100 Random Observation Locations. Avg Error: Analysis = 1.0427, Forecast = 3.6403
4 Random Observation Locations. Avg Error: Analysis = 5.5644, Forecast = 5.6279
4 Optimal Observation Locations. Avg Error: Analysis = 2.0545, Forecast = 4.8808
Summary (errors as a percent of the control error): assimilating just the 4 chosen locations yields a significant portion of the error reduction in J achieved with 100 observations.