Deriving observational constraints on climate model predictions. Gabriele Hegerl, GeoSciences, University of Edinburgh
The Problem • Climate model predictions are uncertain, and quantifying these uncertainties is essential for useful predictions • Only observations can really constrain predictions – so attempts to arrive at probabilistic predictions make use of observations in some form • There are a number of ways to do that, depending on the problem, timescale, information available, climate variable….
Prediction uncertainty • 1. Internal variability uncertainty: weather/climate variability is not (or not entirely) predictable beyond days, so it contributes an uncertainty that can be estimated • 2. Forcing uncertainty: future emissions are unknown (scenarios); volcanoes? Sun? • 3. Model uncertainty: uncertainty due to unknown physics and unknown parameters in models, structural errors, missing processes… • (3) is mainly what we try to estimate, although some recent work also tries to predict (1)
Forcing uncertainty, model uncertainty and internal climate variability also vary with timescale • From Hawkins and Sutton, 2009, BAMS: fraction of total prediction uncertainty due to internal climate variability, model uncertainty and forcing uncertainty as a function of lead time
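As a concrete illustration of this partitioning, a minimal Python sketch that splits the spread of a multi-model, multi-scenario ensemble into the three components per lead year (the array layout, the function name and the use of the across-realization variance for internal variability are assumptions in the spirit of Hawkins and Sutton, 2009, not their actual code):

```python
import numpy as np

def uncertainty_fractions(proj):
    """Partition projection spread into model, scenario (forcing) and
    internal-variability components per lead year.

    proj: array of shape (n_models, n_scenarios, n_realizations, n_years),
          e.g. decadally smoothed temperature projections.
    Returns (model, scenario, internal) fractions, each an array over years.
    """
    # Internal variability: spread across realizations of the same
    # model/scenario combination, averaged over models and scenarios.
    internal = proj.var(axis=2, ddof=0).mean(axis=(0, 1))

    # Ensemble-mean projection per model and scenario (weather noise removed).
    mean_proj = proj.mean(axis=2)

    # Model uncertainty: spread across models, averaged over scenarios.
    model = mean_proj.var(axis=0, ddof=0).mean(axis=0)

    # Scenario (forcing) uncertainty: spread across scenarios of the
    # multi-model mean.
    scenario = mean_proj.mean(axis=0).var(axis=0, ddof=0)

    total = internal + model + scenario
    return model / total, scenario / total, internal / total
```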
How predictions are constrained depends on timescale • Near term: initial conditions MAY matter • Intermediate: transient warming • Long term: equilibrium climate sensitivity • From IPCC AR4, Ch. 10 (Meehl et al.)
Why? A very simple energy balance model (Held et al., 2010) • Total radiative forcing G • Surface temperature change T • Change in outgoing radiation bT • Exchange of heat into the deep ocean H • Equation for the surface ocean with heat capacity cF • Equation for the deep ocean with heat capacity cD
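Written out, the two equations listed above take the following form (a reconstruction consistent with the two-box model of Held et al., 2010; the exchange coefficient γ coupling the surface and deep boxes is an assumed symbol, not shown on the slide):

```latex
\[
c_F \frac{dT}{dt} = G - bT - H, \qquad H = \gamma\,(T - T_D), \qquad
c_D \frac{dT_D}{dt} = H .
\]
```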
There are two distinct timescales • Fast response: the deep ocean has not yet taken up significant heat (TD ≈ 0), and the response is dominated by the transient climate response • Slow response: the timescale for the deep ocean is much slower; equilibrium climate sensitivity is reached once the ocean takes up no more heat • This works well for the GFDL model, with transient near-term warming almost completely dominated by the first case
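The two limits follow directly from the equations above (again a sketch with the assumed exchange coefficient γ): for the fast response the deep ocean is still unperturbed (TD ≈ 0) and the surface box is in quasi-equilibrium, while at equilibrium the ocean takes up no more heat (H = 0):

```latex
\[
T_{\text{fast}} \approx \frac{G}{b+\gamma} \quad (T_D \approx 0),
\qquad\qquad
T_{\text{eq}} = \frac{G}{b} \quad (H = 0).
\]
```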
Let's start with the transient climate change in the 21st century • 'Naïve approach': models that do well over the 20th century will do well over the future • But: the external forcings driving this are uncertain… • Greenhouse gas forcing is quite well known, but sulfate aerosol forcing and other anthropogenic forcings (BC etc.) are poorly known, and solar forcing is not well known either • Models can agree with the data because they are correct, but they could also agree because of cancelling errors • We may also wind up rejecting models that are correct but whose forcing was wrong… (CMIP5)
What we need to do is: • Identify the response to the individual forcings that influenced 20th century climate • Project forward the components that are predictable or projectable (e.g. greenhouse gas increases) • Fingerprints can separate the contributions of different external drivers because of the different physics of the forcings • E.g. solar forcing warms the entire atmosphere • Aerosols have a different spatial pattern and temporal evolution than greenhouse gases • Volcanoes have a pronounced short-term impact (El Chichón, 1982; Pinatubo, 1991)
Ingredients for detection and attribution • Observations y • Climate change signals ("fingerprints") X = (xi), i = 1..n, typically from model simulations: one for each fingerprint, or for a small number of combinations • Noise: data for internal climate variability, usually from a long model control simulation • If X contains realizations of climate variability, a total least squares fit can be used (Allen and Stott)
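Combined, these ingredients give the usual detection and attribution regression (a sketch of the standard formulation; the βi are the scaling factors estimated next):

```latex
\[
y = \sum_{i=1}^{n} \beta_i\, x_i + u ,
\]
```

where u is internal variability with covariance estimated from the control simulation; if the fingerprints xi themselves contain realizations of climate variability, the total least squares variant of Allen and Stott applies.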
Estimating the signal amplitude • Scaling factor: ratio of the observed amplitude estimate to the signal amplitude, β̂ = ⟨x, y⟩ / ⟨x, x⟩; using the inverse noise covariance in the scalar product can reduce noise • Uncertainty in β̂ is determined by superimposing samples of climate variability onto the fingerprint (a bit more complicated for TLS) • Observational uncertainty: use model data only where observations exist – like with like; use samples of observations
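A minimal single-fingerprint Python sketch of this estimate and its uncertainty (weighted least squares with a diagonal noise covariance only; the full inverse covariance in a truncated EOF basis and the TLS case are more involved; all array and function names are illustrative):

```python
import numpy as np

def scaling_factor(y, x, noise_samples):
    """Estimate the scaling factor beta for a single fingerprint x given
    observations y, plus a 5-95% range from samples of internal variability.

    y, x:          1-D arrays (observed and modelled pattern on the same grid)
    noise_samples: 2-D array (n_samples, len(y)) of control-run segments
    """
    # Diagonal approximation to the inverse noise covariance, estimated
    # from the control samples.
    w = 1.0 / noise_samples.var(axis=0, ddof=1)

    beta_hat = np.sum(w * x * y) / np.sum(w * x * x)

    # Uncertainty: superimpose noise realisations on the best-fit signal and
    # re-estimate; in practice an independent set of control segments is used.
    betas = [np.sum(w * x * (beta_hat * x + n)) / np.sum(w * x * x)
             for n in noise_samples]
    return beta_hat, np.percentile(betas, [5, 95])
```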
Attribution results yield a range of scaling factors that are consistent with the observed change • Scaling factors show which range of up- or down-scaling of the model response is consistent with observations • Figure: scaling factors and attributed warming for greenhouse gases, other anthropogenic forcing and natural forcing (solar + volcanic); Fig. 9.9c, Hegerl et al., 2007
This translates directly to the transient climate response (TCR): • Estimated warming at the time of CO2 doubling in response to a 1% per year increase in CO2 • Constrained by observed 20th century warming via the estimated greenhouse gas signal (Fig. 9.21, after Stott et al.) • Observational constraints suggest TCR is very likely >1°C and very unlikely >3.5°C, supporting the overall assessment that TCR is very unlikely >3°C
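Schematically, the translation works by carrying the greenhouse gas scaling factor from the attribution analysis through to the model's transient climate response (a sketch of the logic, assuming the response scales linearly with the signal amplitude):

```latex
\[
\mathrm{TCR}_{\text{constrained}} \;\approx\; \beta_{\mathrm{GHG}} \times \mathrm{TCR}_{\text{model}},
\]
```

so that the 5-95% range of the greenhouse gas scaling factor maps onto an observationally constrained TCR range.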
Probabilistic prediction of global mean change: one of the pdfs is based on TCR, others on other obs. constraints • Scalability: the pattern does not change much with signal strength
Other approaches: select models based on observed feedbacks • Hall et al., 2006: spring albedo response to temperature change vs the seasonal cycle • Using information on climate model 'quality', i.e. the ability to simulate mean climate and short-term variations; used particularly on regional scales – it needs to be demonstrated that this is relevant for predictions • Use of observations: compare like with like; sample as the observations do, not as the models do; bring the models to the observations
Can we get closer with initial conditions for the near term? • Can climate change be predicted from initial conditions just like weather? => an initial value problem • Initialization is not easy • A large ensemble of such runs is being done right now for the near term • Problem: evidence for useful predictability beyond a year or a few years is weak, particularly for the things that matter (regional climate) • Figure: Smith et al., 2006, Science; top: 1 yr, middle: 9 yr, bottom: average of years 1-9, 5-95% ranges
EQUIP: End-to-End Quantification of Impacts Prediction • Heatwaves: summer maximum daily temperature – predictions capture not only the trend, but some of the structure (is this just plain luck?) (Hanlon and Hegerl, in prep.) • EQUIP is led by Andy Challinor, Leeds • Other work: crop predictions, water deficit, fisheries
How to • First: bias-correct the model • This needs to be done differently for hot extremes than for the mean • Is there any added value from initial conditions? – not clear based on correlation
Use a skill score based on Murphy, 1988 • Forecast system Y vs reference W (e.g. climatology, non-initialized) • MSSS can be decomposed into correlation, conditional bias and mean bias • Updated to compare against NoAssim (no ICs) rather than persistence (Goddard et al., in prep.)
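A Python sketch of the skill score and its decomposition, with the observed climatology as the reference forecast (replacing the reference MSE with that of the NoAssim ensemble gives the updated comparison; function and variable names are illustrative):

```python
import numpy as np

def msss(forecast, obs):
    """Mean squared skill score against the observed climatology, plus its
    decomposition into phase (correlation), conditional-bias and mean-bias
    terms following Murphy (1988)."""
    f = np.asarray(forecast, dtype=float)
    o = np.asarray(obs, dtype=float)

    mse_f = np.mean((f - o) ** 2)
    mse_ref = np.mean((o - o.mean()) ** 2)     # reference: climatology
    skill = 1.0 - mse_f / mse_ref              # for NoAssim, swap in its MSE

    r = np.corrcoef(f, o)[0, 1]
    ratio = f.std() / o.std()                  # standard deviation ratio
    mean_bias = (f.mean() - o.mean()) / o.std()

    # With the climatology reference these terms satisfy exactly:
    #   skill == r**2 - (r - ratio)**2 - mean_bias**2
    return skill, r ** 2, (r - ratio) ** 2, mean_bias ** 2
```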
Decadal predictions raise many questions • Is there skill that is statistically significant? • How can the prediction be quantified – how do we arrive at uncertainty ranges? • How long does skill last? • So far: predictability largely for fish…
Approach for the long term: estimating climate system properties • Equilibrium climate sensitivity • First: an example of a single line of evidence
1. What can we learn about climate sensitivity from the last millennium? Decadal NH 30-90N land temperature; Hegerl et al., J Climate, 2007
CH-blend reconstruction • Weighted average of decadal records, many tree-ring data (RSS) • Calibration: total least squares scales the communality between instrumental and proxy data to the same size • The method was tested with climate model data to assess whether uncertainties are estimated properly
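A minimal total least squares (orthogonal regression) sketch of such a calibration over the instrumental overlap period, assuming comparable noise levels in both series (illustrative only, not the exact CH-blend procedure; names are placeholders):

```python
import numpy as np

def tls_calibration(proxy, instrumental):
    """Fit instrumental ~ a + b * proxy by total least squares, allowing
    for noise in both series (unlike ordinary regression, which attributes
    all of the noise to the instrumental record)."""
    x = proxy - proxy.mean()
    y = instrumental - instrumental.mean()

    # The TLS slope comes from the right singular vector belonging to the
    # smallest singular value of the centred data matrix [x, y].
    _, _, vt = np.linalg.svd(np.column_stack([x, y]), full_matrices=False)
    v = vt[-1]
    slope = -v[0] / v[1]
    intercept = instrumental.mean() - slope * proxy.mean()
    return slope, intercept

# Usage sketch: fit on the overlap period, then rescale the full proxy record.
# slope, intercept = tls_calibration(proxy[overlap], instrumental[overlap])
# calibrated = intercept + slope * proxy
```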
Climate forcing over the last millennium • Northern Hemispheric (30-90N) mean radiative forcing (decadally smoothed) from Crowley • Uncertainties: ~40% in the amplitude of volcanic forcing; large in the amplitude and shape of solar forcing; and in aerosol forcing
Estimating equilibrium climate sensitivity • Simulate observed climate change not with a single best fit, but with a large ensemble of model simulations with different sensitivities • Determine the probability of models being in agreement with the data, given internal variability, uncertainty in the data and uncertainty in the model • Missing uncertainties make the pdf too narrow; using the information incompletely makes it too wide • Figure: schematic pdf, p vs ECS [K]
Estimating ECS • Run an EBM with >1000 model simulations, varying ECS, effective ocean diffusivity and aerosol forcing • Compute residuals between the reconstruction and the range of EBM simulations with different climate sensitivities; the residual variance relative to the minimum, Var(res) − Var(res_min), is compared against an F(k, l) distribution (after Forest et al., 2001) • Uncertainties included: calibration uncertainty of the reconstruction; noise and internal variability; uncertainty in the magnitude of past solar and volcanic forcing • Caveats: simple representation of efficacy; systematic biases in reconstructions
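A heavily simplified Python sketch of the ensemble-weighting idea: run a simple energy balance model across a grid of ECS values, measure the misfit to the reconstruction, and turn the misfit into a probability. A Gaussian likelihood stands in here for the F-statistic of Forest et al. (2001), only ECS is varied (not ocean diffusivity or aerosol forcing), and the forcing and reconstruction series, heat capacity and function name are placeholders:

```python
import numpy as np

def ecs_pdf(forcing, recon, noise_var, ecs_grid, heat_cap=8.0, f2x=3.7):
    """Weight one-box energy-balance runs with different equilibrium climate
    sensitivities by their fit to a temperature reconstruction.

    forcing:   annual radiative forcing series (W m-2)
    recon:     temperature reconstruction on the same time axis (K)
    noise_var: variance of reconstruction error plus internal variability (K^2)
    ecs_grid:  candidate ECS values (K); f2x is the CO2-doubling forcing
    heat_cap:  assumed effective surface heat capacity (W yr m-2 K-1)
    """
    log_w = np.empty(len(ecs_grid))
    for i, ecs in enumerate(ecs_grid):
        b = f2x / ecs                     # feedback parameter (W m-2 K-1)
        temp, series = 0.0, []
        for g in forcing:                 # explicit Euler steps of one year
            temp += (g - b * temp) / heat_cap
            series.append(temp)
        resid = recon - np.asarray(series)
        log_w[i] = -0.5 * np.sum(resid ** 2) / noise_var   # Gaussian log-likelihood
    w = np.exp(log_w - log_w.max())
    return w / w.sum()                    # discrete pdf over the ECS grid
```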
Estimated PDF for climate sensitivity • Nonlinear relationship between sensitivity and volcanic cooling • Larger reconstruction amplitude / smaller forcing: the response is small, comparable to climate variability • Results for different reconstructions, 13th century to 1850
2. Multiple lines of evidence • Bayesian update, using a prior pdf based on the late 20th century (Frame et al.); Hegerl et al., 2006, Nature • Multiple lines of evidence reduce the probability of high sensitivity, and of very small sensitivity • Open questions: independence? Proper error estimates?
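In symbols, combining lines of evidence that are assumed independent is a sequential Bayesian update (the questions about independence and proper error estimates concern exactly this assumption); S denotes climate sensitivity, y1 the late 20th century evidence used as prior, and y2 e.g. the palaeo evidence:

```latex
\[
p(S \mid y_1, y_2) \;\propto\; p(y_2 \mid S)\, p(S \mid y_1)
\;\propto\; p(y_2 \mid S)\, p(y_1 \mid S)\, p(S).
\]
```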
Estimates from many different sources • Difficult question: can these estimates strengthen each other? How? Knutti and Hegerl, 2008
Towards improving the use of observations for constraints on predictions • Make sure that the model is brought as close to the observational product as possible: synthetic satellite data; synthetic palaeo data • Use uncertainty estimates; models can be used to test processing uncertainty • Be wary of spatial and temporal autocorrelation
If using observations to provide constraints on predictions • The relevance of the observed evidence for the prediction needs to be established • Uncertainty in observations needs to be included in the estimate • The approach will vary with timescale • Many questions remain: • Decadal predictions with reasonable uncertainty ranges • How to predict regional changes? • How to combine information from different sources into an overall estimate of the uncertainty of ECS?
Future change uncertainty ranges reflect uncertainty in the transient response • They are based, among other things, on observational constraints • This is a significant advance • Figure: likely (>66%) ranges, SPM Fig. 5