Arctic snow in a changing cryosphere: What have we learned from observations and CMIP5 simulations? Chris Derksen and Ross Brown, Climate Research Division, Environment Canada. Thanks to our data providers: Rutgers Global Snow Lab ● National Snow and Ice Data Center ● World Climate Research Programme Working Group on Coupled Modelling ● University of East Anglia – Climatic Research Unit ● NASA Global Modeling and Assimilation Office ● European Centre for Medium-Range Weather Forecasts
Outline • Snow in the context of a changing cryosphere • Overview of ‘observational’ snow analyses • Validation approaches • Inter-dataset agreement • Observations versus CMIP5 simulations
Climate Change and the Cryosphere: Observational time series of spring snow cover, summer sea ice, and sea level (IPCC AR5 Summary for Policymakers, Figure 3); trends in surface temperature, 1901–2012 (IPCC AR5 WG1 Chapter 2, Figure 2.21)
Arctic Sea Ice Volume • Arctic sea ice volume anomalies from the Pan-Arctic Ice Ocean Modeling and Assimilation System (PIOMAS) • University of Washington Polar Science Center
Canadian Arctic Sea Ice: Trends from the Canadian Ice Service Digital Archive. Howell et al., 2013 (updated)
Greenland Ice Sheet Mass Balance • Monthly changes in the total mass (Gt) of the Greenland ice sheet estimated from GRACE measurements. • Tedesco et al., 2013, NOAA Arctic Report Card
Arctic Ice Caps and Glaciers: Mean annual (red) and cumulative (blue) mass balance for 1989–2011 from Arctic glaciers reported to the World Glacier Monitoring Service by January 2013. Sharp et al., 2013, NOAA Arctic Report Card
Cryosphere Contribution to Sea Level Rise: Rate of ice sheet loss in sea level equivalent averaged over 5-year periods. IPCC AR5 WG1 Figure 4.17
Arctic Terrestrial Snow: Snow cover extent (SCE) anomaly time series, 1967–2013 (with respect to 1988–2007), from the NOAA snow chart CDR. Solid line denotes the 5-yr running mean. • Over the 1979–2013 period, NH June snow extent decreased at a rate of 19.9% per decade (relative to the 1981–2010 mean). • September sea ice extent decreased at 13.0% per decade. Derksen and Brown (2012), Geophys. Res. Lett.
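For readers who want to reproduce this kind of diagnostic, the sketch below shows one plausible way to compute an SCE anomaly series and a trend expressed in % per decade. The array names and random placeholder values are hypothetical, not the NOAA CDR itself.

```python
import numpy as np

# Hypothetical June SCE series (km^2), 1967-2013; real values would come
# from the NOAA snow chart CDR.
years = np.arange(1967, 2014)
rng = np.random.default_rng(0)
sce = rng.uniform(8e6, 12e6, years.size)

# Anomalies with respect to the 1988-2007 reference period
ref = (years >= 1988) & (years <= 2007)
anom = sce - sce[ref].mean()

# 5-yr running mean of the anomaly series (the solid line in the figure)
running5 = np.convolve(anom, np.ones(5) / 5.0, mode="valid")

# Linear trend over 1979-2013, expressed as % per decade of the 1981-2010 mean
sel = years >= 1979
slope_per_year = np.polyfit(years[sel], sce[sel], 1)[0]       # km^2 per year
baseline = sce[(years >= 1981) & (years <= 2010)].mean()      # km^2
trend_pct_per_decade = 100.0 * slope_per_year * 10.0 / baseline
print(f"June SCE trend: {trend_pct_per_decade:+.1f}% per decade")
```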
Active Layer Thickness • Active layer thickness from Siberian stations, 1950 to 2008 • IPCC AR5 WG1 Figure 4.23d
Snow – An Important Hydrological Resource NASA Earth Observatory
Snow – Highly Variable in Space and Time. Focus on Arctic land areas during the spring season (April–June): ~100% snow cover at the beginning of April; nearly all snow gone by the end of June.
Part 2: Overview of ‘observational’ snow analyses • Validation approaches • Inter-dataset agreement
Hemispheric Snow Datasets: The challenge is not a lack of data…
Validating Snow Products with Ground Measurements • Lack of in situ observations • ‘Snapshot’ datasets • Spatial representativeness? • Measurement deficiencies • Poor reporting practices (non-zero snow depth)
Challenges to Validating Gridded Snow Products with Ground Measurements. This is what product users want to see versus the reality: time series for the former BERMS sites; spatial sampling across one grid cell (n ≈ 5000).
‘Validating’ Gridded Snow Products via Multi-Dataset Comparisons. NH June SCE time series, 1981–2012: NOAA snow chart CDR (red); average of NOAA, MERRA, and ERA-int (blue). EUR October SCE: difference between the NOAA snow chart CDR and 4 independent datasets, 1982–2005. • Tendency for NOAA to consistently map less spring snow (~0.5 to 1 million km²) than the multi-dataset average since 2007. • Accounting for this difference reduces the June NH SCE trend from −1.27 × 10⁶ km² to −1.12 × 10⁶ km². • Evidence of an artificial trend (~+1.0 million km² per decade) in October snow cover. Brown and Derksen (2013), Environ. Res. Lett.
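A minimal sketch of the kind of inter-dataset difference computed above, assuming each column of a hypothetical `sce` array is one dataset (NOAA CDR, MERRA, ERA-int) and each row one June; the placeholder values are random, not the real products.

```python
import numpy as np

years = np.arange(1981, 2013)
rng = np.random.default_rng(1)
# Columns: NOAA CDR, MERRA, ERA-int (placeholder June SCE values, km^2)
sce = rng.uniform(8e6, 12e6, size=(years.size, 3))

multi_mean = sce.mean(axis=1)          # multi-dataset average for each year
noaa_diff = sce[:, 0] - multi_mean     # NOAA minus the multi-dataset average

# Years where NOAA maps at least 0.5 million km^2 less snow than the average
low_years = years[noaa_diff < -0.5e6]
print("NOAA low-bias years:", low_years)
```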
A New Multi-Dataset Arctic SCE Anomaly Time Series (April, May, and June panels)
Simulated vs. Observed Arctic SCE: Historical + projected (16 CMIP5 models; RCP8.5 scenario) and observed (NOAA snow chart CDR) snow cover extent for April, May, and June over North America and Eurasia. SCE normalized by the maximum area simulated by each model. Updated from Derksen and Brown (2012), Geophys. Res. Lett.
Simulated vs. Observed Arctic SCE: Historical + projected (16 CMIP5 models; RCP8.5 scenario) and multi-observational (NOAA CDR, Liston & Hiemstra, MERRA, GLDAS-Noah, ERA-int reconstruction) snow cover extent for April, May, and June over North America and Eurasia. SCE normalized by the maximum area simulated by each model.
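The normalization used in these figures can be expressed very simply; the sketch below assumes a hypothetical (models × years) array of June SCE and divides each model's series by its own maximum simulated area so the curves share a common 0–1 scale.

```python
import numpy as np

rng = np.random.default_rng(2)
n_models, n_years = 16, 130                              # e.g. 16 CMIP5 models
sce = rng.uniform(5e6, 15e6, size=(n_models, n_years))   # placeholder SCE (km^2)

# Normalize each model by its own maximum simulated area
sce_norm = sce / sce.max(axis=1, keepdims=True)

# Multi-model mean of the normalized series, for plotting against observations
ens_mean = sce_norm.mean(axis=0)
```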
Arctic SCE and Surface Temperature Trends, 1980–2009 (SCE and Tsurf over North America and Eurasia; temperature datasets: 1. CRU, 2. GISS, 3. MERRA, 4. ERA-int) • Simulations slightly underestimate observed spring SCE reductions • Similar range in observed versus simulated SCE trends • Observed Arctic temperature trends are captured by the CMIP5 ensemble range
Why do CMIP5 models underestimate observed spring SCE reductions? Model vs. observed temperature sensitivity (dSCE/dTs) over North America and Eurasia, 1981–2010 • Models exhibit lower temperature sensitivity (change in SCE per °C of warming) than observations • The magnitude of observational dSCE/dTs depends on the choice of observations (both snow and temperature); a minimal sketch of how this sensitivity can be estimated follows below
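A sketch of the sensitivity calculation mentioned above, assuming hypothetical 1981–2010 spring SCE and surface temperature anomaly series; dSCE/dTs is taken as the least-squares slope of SCE anomalies on temperature anomalies.

```python
import numpy as np

rng = np.random.default_rng(3)
ts_anom = rng.normal(size=30)                               # Ts anomalies (deg C), placeholder
sce_anom = -1.0e6 * ts_anom + 2.0e5 * rng.normal(size=30)   # SCE anomalies (km^2), placeholder

# dSCE/dTs: least-squares slope of SCE anomalies against temperature anomalies
slope, intercept = np.polyfit(ts_anom, sce_anom, 1)
print(f"dSCE/dTs = {slope / 1e6:.2f} x 10^6 km^2 per deg C")
```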
Understanding CMIP5 SCE Projections • Projected changes in snow cover for individual models are predictable from the characteristics of their historical simulations. • Consistent with a priori expectations, models project greater ΔSCE with: greater standard deviation (σ), greater dSCE/dTs, and stronger historical trends.
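To make the idea concrete, here is a sketch of a cross-model regression of projected ΔSCE on the three historical predictors named above (σ, dSCE/dTs, historical trend); every array is a hypothetical placeholder, not CMIP5 output.

```python
import numpy as np

rng = np.random.default_rng(5)
n_models = 16

# Hypothetical per-model historical diagnostics
sigma = rng.uniform(0.5e6, 1.5e6, n_models)       # interannual std dev (km^2)
sens = rng.uniform(-1.5e6, -0.5e6, n_models)      # dSCE/dTs (km^2 per deg C)
trend = rng.uniform(-1.0e6, -0.2e6, n_models)     # historical trend (km^2 per decade)

# Hypothetical projected change to 2050 (km^2), loosely tied to the predictors
delta_sce = 0.5 * trend + 0.3 * sens - 0.2 * sigma + 1e5 * rng.normal(size=n_models)

# Least-squares fit: delta_sce ~ intercept + sigma + dSCE/dTs + trend
X = np.column_stack([np.ones(n_models), sigma, sens, trend])
coef, *_ = np.linalg.lstsq(X, delta_sce, rcond=None)
print("regression coefficients (intercept, sigma, dSCE/dTs, trend):", coef)
```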
Future Work • CMIP5 models do a fairly good job of replicating the mean seasonal cycle of SWE over the Arctic, but the simulated maximum is higher than observed, and the models underestimate the rate of spring depletion. • Shallow snow albedo and excess precipitation frequency may together act to keep albedo higher – simulated snowmelt is not ‘patchy’.
Conclusions • The rate of June snow cover extent loss (19.9% per decade since 1979) is greater than the rate of summer sea ice loss (13.0% per decade). • Spring Arctic surface temperatures are well simulated by CMIP5 models, but the models exhibit reduced snow cover extent sensitivity to temperature compared to observations. • Interannual variability (σ), temperature sensitivity (dSCE/dTs), and historical trends are good predictors of ΔSCE projections to 2050. • The spread among 5 observational datasets (mean; variability) is approximately the same as across 16 CMIP5 models. A climate modeling group would never run one model once and claim this is the best result. Why do we gravitate towards this approach with observational analyses?
Snow Cover Extent: Inter-Dataset Variability. June snow cover extent (2002). Brown et al., 2010, J. Geophys. Res.
Observed vs. Simulated SCE Variability (panels: CMIP5 versus NOAA; versus Liston & Hiemstra; versus all observations: NOAA CDR, Liston & Hiemstra, MERRA, GLDAS-Noah, ERA-int reconstruction) • The NOAA CDR is an outlier with respect to interannual variability
Understanding CMIP5 SCE Projections: Ratio of interannual variability relative to observations (NOAA) for fall and spring • The 3 models with <5% difference in variability between spring and fall are the 3 top-ranked models at reproducing the spatial pattern of mean snow cover duration over Arctic land areas (vs. NOAA); a minimal sketch of such a variability ratio follows below
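A minimal sketch of the variability ratio referred to above, assuming hypothetical detrended seasonal SCE series for one model and the NOAA observations; the function name and placeholder data are illustrative only.

```python
import numpy as np

def detrended_std(x):
    """Interannual standard deviation after removing a linear trend."""
    t = np.arange(x.size)
    trend = np.polyval(np.polyfit(t, x, 1), t)
    return np.std(x - trend)

rng = np.random.default_rng(4)
obs_spring = rng.uniform(8e6, 12e6, 30)   # placeholder observed (NOAA) spring SCE
mod_spring = rng.uniform(8e6, 12e6, 30)   # placeholder model spring SCE

ratio = detrended_std(mod_spring) / detrended_std(obs_spring)
print(f"Spring variability ratio (model / NOAA): {ratio:.2f}")
```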