Arctic snow in a changing cryosphere: What have we learned from observations and CMIP5 simulations? Chris Derksen and Ross Brown, Climate Research Division, Environment Canada


  • Slide 1
  • Arctic snow in a changing cryosphere: What have we learned from observations and CMIP5 simulations? Chris Derksen and Ross Brown, Climate Research Division, Environment Canada. Thanks to our data providers: Rutgers Global Snow Lab; National Snow and Ice Data Center; World Climate Research Programme Working Group on Coupled Modelling; University of East Anglia Climatic Research Unit; NASA Global Modeling and Assimilation Office; European Centre for Medium-Range Weather Forecasts.
  • Slide 2
  • Outline: 1. Snow in the context of a changing cryosphere; 2. Overview of observational snow analyses (validation approaches; inter-dataset agreement); 3. Observations versus CMIP5 simulations.
  • Slide 3
  • Observational Time Series: Climate Change and the Cryosphere. Trends in surface temperature, 1901-2012 (IPCC AR5 WG1 Chapter 2, Figure 2.21); spring snow cover, summer sea ice, and sea level (IPCC AR5 Summary for Policymakers, Figure 3).
  • Slide 4
  • Arctic Sea Ice Volume: Arctic sea ice volume anomalies from the Pan-Arctic Ice Ocean Modeling and Assimilation System (PIOMAS), University of Washington Polar Science Center.
  • Slide 5
  • Canadian Arctic Sea Ice: trends from the Canadian Ice Service Digital Archive. Howell et al., 2013 (updated).
  • Slide 6
  • Greenland Ice Sheet Mass Balance Monthly changes in the total mass (Gt) of the Greenland ice sheet estimated from GRACE measurements. Tedesco et al., 2013 NOAA Arctic Report Card
  • Slide 7
  • Arctic Ice Caps and Glaciers: mean annual (red) and cumulative (blue) mass balance, 1989-2011, for Arctic glaciers reported to the World Glacier Monitoring Service by January 2013. Sharp et al., 2013, NOAA Arctic Report Card.
  • Slide 8
  • Cryosphere Contribution to Sea Level Rise Rate of ice sheet loss in sea level equivalent averaged over 5-year periods. IPCC AR5 WG1 Figure 4.17
  • Slide 9
  • Arctic Terrestrial Snow. Over the 1979-2013 period, NH June snow cover extent decreased at a rate of -19.9% per decade (relative to the 1981-2010 mean); September sea ice extent decreased at -13.0% per decade (Derksen and Brown, 2012, Geophys. Res. Lett.). Snow cover extent (SCE) anomaly time series, 1967-2013 (with respect to 1988-2007), from the NOAA snow chart CDR; the solid line denotes the 5-yr running mean.
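A minimal sketch of how an anomaly series and a percent-per-decade trend of the kind quoted on this slide can be computed; the variable names and stand-in data are assumptions for illustration, not the authors' code or data.

```python
import numpy as np

# Sketch: June SCE anomalies and a %-per-decade trend, following the
# conventions on the slide. 'years' and 'sce' (June snow cover extent,
# 10^6 km^2) are stand-ins for a real observational series.
years = np.arange(1967, 2014)
sce = np.random.default_rng(0).normal(10.0, 1.0, years.size)  # placeholder values

# Anomalies relative to the 1988-2007 reference period
ref = (years >= 1988) & (years <= 2007)
anom = sce - sce[ref].mean()

# Linear trend over 1979-2013, expressed as % per decade relative to
# the 1981-2010 mean (the convention used for the -19.9% figure)
sel = (years >= 1979) & (years <= 2013)
slope_per_yr = np.polyfit(years[sel], sce[sel], 1)[0]
base = sce[(years >= 1981) & (years <= 2010)].mean()
print(f"trend: {100.0 * 10.0 * slope_per_yr / base:.1f} % per decade")

# 5-yr running mean of the anomalies (the solid line on the slide)
running5 = np.convolve(anom, np.ones(5) / 5, mode="valid")
```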
  • Slide 10
  • Active Layer Thickness from Siberian stations, 1950 to 2008. IPCC AR5 WG1 Figure 4.23d.
  • Slide 11
  • Snow: An Important Hydrological Resource (NASA Earth Observatory).
  • Slide 12
  • Snow: Highly Variable in Space and Time. Focus on Arctic land areas during the spring season (AMJ): 100% snow cover at the beginning of April; nearly all snow gone by the end of June.
  • Slide 13
  • Part 2: Overview of observational snow analyses Validation approaches Inter-dataset agreement
  • Slide 14
  • Hemispheric Snow Datasets: the challenge is not a lack of data.

    | Description | Period | Resolution | Data Source |
    |---|---|---|---|
    | NOAA weekly snow/no-snow | 1966-2013 | 190.5 km | Rutgers University, Robinson et al. [1993] |
    | NOAA IMS daily snow/no-snow | 1997-2004 | 24 km | National Snow and Ice Data Center (NSIDC), Ramsay [1998] |
    | NOAA IMS daily snow/no-snow | 2004-2013 | 4 km | NSIDC, Helfrich et al. [2007] |
    | AVHRR Pathfinder daily snow/no-snow | 1982-2004 | 5 km | Canada Centre for Remote Sensing, Zhao and Fernandes [2009] |
    | MODIS 0.05 deg snow cover fraction | 2000-2013 | ~5 km | NSIDC, Hall et al. [2006] |
    | ERA-40 reconstructed snow cover duration (temperature-index snow model) | 1957-2002 | ~275 km (5 km elev. adjustment) | Environment Canada, Brown et al. [2010] |
    | QuikSCAT-derived snow-off date | 2000-2010 | ~5 km | Environment Canada, Wang et al. [2008] |
    | Daily snow depth analysis (in situ obs + snow model forced by GEM forecast temp/precip fields) | 1998-2013 | ~35 km | Canadian Meteorological Centre, Brasnett [1999] |
    | Daily snow depth analysis (in situ obs + snow model forced by reanalysis temp/precip fields) | 1979-1998 | ~35 km | Environment Canada, Brown et al. [2003] |
    | MERRA reanalysis snow water equivalent (CATCHMENT LSM) | 1979-2013 | 0.5 x 0.67 deg | NASA, Rienecker et al. [2011] |
    | ERA-Interim reanalysis snow water equivalent (HTESSEL LSM) | 1979-2010 | ~80 km | ECMWF, Balsamo et al. [2013] |
    | GLDAS reanalysis snow water equivalent (Noah LSM) | 1948-2000; 1948-2010 | 1.0 x 1.0 deg; 0.25 x 0.25 deg | NASA, Rodell et al. [2004] |
    | SnowModel snow water equivalent, driven by MERRA atmospheric reanalysis | 1979-2009 | 10 km | Colorado State, Liston and Hiemstra [2011] |
    | GlobSnow snow water equivalent (satellite passive microwave + climate station obs) | 1978-2013 | 25 km | Finnish Meteorological Institute, Takala et al. [2011] |
  • Slide 15
  • Validating Snow Products with Ground Measurements. Challenges: lack of in situ observations; snapshot datasets; spatial representativeness; measurement deficiencies; poor reporting practices (non-zero snow depth).
  • Slide 16
  • Challenges to Validating Gridded Snow Products with Ground Measurements. Figure panels: time series for the former BERMS sites; spatial sampling across one grid cell (n ≈ 5000); "this is what product users want to see" versus "this is the reality".
  • Slide 17
  • Validating Gridded Snow Products via Multi-Dataset Comparisons. Evidence of an artificial trend (~+1.0 million km² per decade) in October snow cover: difference in EUR October SCE between the NOAA snow chart CDR and 4 independent datasets, 1982-2005 (Brown and Derksen, 2013, Env. Res. Lett.). There is a tendency for NOAA to consistently map less spring snow (~0.5 to 1 million km²) than the multi-dataset average since 2007; accounting for this difference reduces the June NH SCE trend from -1.27 × 10⁶ km² to -1.12 × 10⁶ km². NH June SCE time series, 1981-2012: NOAA snow chart CDR (red); average of NOAA, MERRA, and ERA-Interim (blue).
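One way to reproduce this kind of multi-dataset check is sketched below: difference one product against the mean of the independent datasets and fit a trend to the difference. The function name, variable names, and stand-in data are assumptions for illustration, not the published method's code.

```python
import numpy as np

# Sketch: flag a possible artificial trend in one SCE product by differencing
# it against the mean of independent datasets (as in the NOAA-CDR-minus-
# multi-dataset comparison on this slide) and fitting a linear trend.
def difference_trend(years, target_sce, other_sce_list):
    """Trend (per decade) in target minus the mean of the other datasets."""
    others_mean = np.mean(np.vstack(other_sce_list), axis=0)
    diff = target_sce - others_mean
    slope_per_yr = np.polyfit(years, diff, 1)[0]
    return 10.0 * slope_per_yr  # same units as SCE, per decade

# Stand-in data for 1982-2005: a target series with a small built-in drift
yrs = np.arange(1982, 2006)
rng = np.random.default_rng(1)
target = 8.0 + 0.01 * (yrs - yrs[0]) + rng.normal(0, 0.3, yrs.size)
independent = [8.0 + rng.normal(0, 0.3, yrs.size) for _ in range(4)]
print(difference_trend(yrs, target, independent))  # roughly +0.1 per decade
```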
  • Slide 18
  • A New Multi-Dataset Arctic SCE Anomaly Time Series (panels for April, May, and June).
  • Slide 19
  • Part 3: Observations versus CMIP5 simulations
  • Slide 20
  • Simulated vs. Observed Arctic SCE: historical + projected (16 CMIP5 models; RCP8.5 scenario) and observed (NOAA snow chart CDR) snow cover extent for April, May, and June, for North America (NA) and Eurasia (EUR). SCE is normalized by the maximum area simulated by each model. Updated from Derksen and Brown (2012), Geophys. Res. Lett. Observations: 1. NOAA CDR.
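The normalization mentioned in the caption can be sketched as below; the dictionary structure and names are assumptions used only to illustrate dividing each model's SCE series by its own simulated maximum.

```python
import numpy as np

# Sketch: put models with different land masks and snow climatologies on a
# common 0-1 scale by dividing each SCE series by that model's maximum area.
def normalize_by_model_max(model_sce):
    """model_sce: {model_name: 1-D array of SCE values}."""
    return {name: np.asarray(vals) / np.max(vals) for name, vals in model_sce.items()}

# Usage with stand-in series for two hypothetical models
demo = {"model_a": [12.0, 10.5, 9.8], "model_b": [15.2, 14.0, 12.1]}
print(normalize_by_model_max(demo))
```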
  • Slide 21
  • Simulated vs. Observed Arctic SCE: historical + projected (16 CMIP5 models; RCP8.5 scenario) and multi-observational snow cover extent for April, May, and June, for North America (NA) and Eurasia (EUR). SCE is normalized by the maximum area simulated by each model. Observational datasets: 1. NOAA CDR; 2. Liston & Hiemstra; 3. MERRA; 4. GLDAS-Noah; 5. ERA-int reconstruction.
  • Slide 22
  • Arctic SCE and Surface Temperature Trends, 1980-2009 (panels: SCE and Tsurf, for NA and EUR). Simulations slightly underestimate observed spring SCE reductions; the range of observed and simulated SCE trends is similar; observed Arctic temperature trends are captured by the CMIP5 ensemble range. Temperature datasets: 1. CRU; 2. GISS; 3. MERRA; 4. ERA-int.
  • Slide 23
  • Why do CMIP5 models underestimate observed spring SCE reductions? Model vs. observed temperature sensitivity (dSCE/dTs), 1981-2010, for North America and Eurasia. Models exhibit lower temperature sensitivity (change in SCE per deg C of warming) than observations; the magnitude of observational dSCE/dTs depends on the choice of observations (both snow and temperature).
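A common way to estimate the temperature sensitivity referred to here is an ordinary least-squares regression of SCE on surface temperature over the analysis period; the sketch below assumes anomaly inputs and stand-in data, and is not the authors' exact procedure.

```python
import numpy as np

# Sketch: dSCE/dTs as the least-squares slope of SCE anomalies regressed on
# surface temperature anomalies over 1981-2010 (units: SCE change per deg C).
def temperature_sensitivity(sce_anom, tsurf_anom):
    slope, _intercept = np.polyfit(tsurf_anom, sce_anom, 1)
    return slope

# Stand-in anomalies: warmer springs have less snow cover
rng = np.random.default_rng(2)
ts = rng.normal(0.0, 1.0, 30)
sce = -1.5 * ts + rng.normal(0.0, 0.5, 30)
print(temperature_sensitivity(sce, ts))  # recovers roughly -1.5
```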
  • Slide 24
  • Understanding CMIP5 SCE Projections. Projected changes in snow cover for individual models are predictable from the characteristics of their historical simulations. Consistent with a priori expectations, models project greater SCE loss with: greater interannual standard deviation (σ); greater dSCE/dTs; stronger historical trends.
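The statement that projections are predictable from historical characteristics can be illustrated with a simple multiple regression across models; the descriptor names, array shapes, and stand-in values below are assumptions, not the analysis actually performed for the slide.

```python
import numpy as np

# Sketch: regress each model's projected SCE change on three historical
# descriptors (interannual std dev, dSCE/dTs, historical trend), one row
# per CMIP5 model, using ordinary least squares.
def fit_projection_predictors(sigma, dsce_dts, hist_trend, projected_change):
    X = np.column_stack([np.ones_like(sigma), sigma, dsce_dts, hist_trend])
    coef, *_ = np.linalg.lstsq(X, projected_change, rcond=None)
    return coef  # intercept followed by one weight per descriptor

# Stand-in values for 16 hypothetical models
rng = np.random.default_rng(3)
sigma, sens, trend = rng.random(16), -rng.random(16), -rng.random(16)
proj = 0.8 * sens + 1.2 * trend + rng.normal(0, 0.1, 16)
print(fit_projection_predictors(sigma, sens, trend, proj))
```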
  • Slide 25
  • Future Work. CMIP5 models do a fairly good job of replicating the mean seasonal cycle of SWE over the Arctic, but the simulated maximum is higher than observed, and the models underestimate the rate of spring depletion. Shallow-snow albedo and excess precipitation frequency may together act to keep simulated albedo higher; simulated snowmelt is not patchy.
  • Slide 26
  • Conclusions. The rate of June snow cover extent loss (-19.9% per decade since 1979) is greater than the rate of summer sea ice loss (-13.0% per decade). Arctic spring surface temperatures are well simulated by CMIP5 models, but the models exhibit reduced snow cover extent sensitivity to temperature compared to observations. Interannual variability (σ), temperature sensitivity (dSCE/dTs), and historical trends are good predictors of SCE projections to 2050. The spread between 5 observational datasets (mean; variability) is approximately the same as across 16 CMIP5 models. A climate modeling group would never run one model once and claim this is the best result; why do we gravitate towards this approach with observational analyses?
  • Slide 27
  • Questions?
  • Slide 28
  • Snow Cover Extent: Inter-Dataset Variability (Brown et al., 2010, J. Geophys. Res.). June snow cover extent (2002); average SCE (10⁶ km²) by dataset, 2004-2008:

    | 2004-2008 | CMC | IMS-24 | IMS-4 | MODIS | NCEP | NOAA | PMW | QSCAT | Avg ± 1 std |
    |---|---|---|---|---|---|---|---|---|---|
    | May Avg SCE | 9.0 | 11.0 | 10.6 | 9.6 | 10.2 | 11.6 | 10.2 | — | 10.3 ± 0.80 |
    | June Avg SCE | 3.0 | 5.1 | 4.7 | 2.3 | 2.8 | 4.8 | 3.1 | 3.4 | 3.7 ± 1.06 |
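As a worked check of the "Avg ± 1 std" column in the June row of the table above, the mean and sample standard deviation across the eight products reproduce the tabulated 3.7 ± 1.06.

```python
import numpy as np

# June Avg SCE across the eight datasets in the table (CMC ... QSCAT)
june_sce = [3.0, 5.1, 4.7, 2.3, 2.8, 4.8, 3.1, 3.4]
print(np.mean(june_sce), np.std(june_sce, ddof=1))  # ~3.65 and ~1.06, i.e. 3.7 +/- 1.06
```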
  • Slide 29
  • Observed vs. Simulated SCE Variability (panels: CMIP5 versus NOAA; Liston and Hiemstra; all observations). The NOAA CDR is an outlier with respect to interannual variability. Observational datasets: 1. NOAA CDR; 2. Liston & Hiemstra; 3. MERRA; 4. GLDAS-Noah; 5. ERA-int reconstruction.
  • Slide 30
  • The 3 models with