
Global Hydrology Modelling and Uncertainty: Running Multiple Ensembles with the University of Reading Campus Grid


Presentation Transcript


  1. Global Hydrology Modelling and Uncertainty: Running Multiple Ensembles with the University of Reading Campus Grid Simon Gosling¹, Dan Bretherton², Nigel Arnell¹ & Keith Haines². ¹Walker Institute for Climate System Research, University of Reading; ²Environmental Systems Science Centre (ESSC), University of Reading

  2. Outline • Uncertainty in climate change impact assessment • The NERC QUEST-GSI project & requirement for HTC • Modification to the CC impact model & Campus Grid • Results: impact on global river runoff & water resources • Conclusions & future developments

  3. Uncertainty in Climate Change Impact Assessment

  4. Uncertainty in climate change impact assessment • Global climate models (GCMs) use different but plausible parameterisations to represent the climate system. • These parameterisations differ partly because of sub-grid-scale processes (<250 km) and partly because of limited understanding.

  5. Uncertainty in climate change impact assessment • Therefore climate projections differ by institution. [Figure: projections for 2 °C of warming]

  6. The NERC QUEST-GSI Project and the Requirement for HTC

  7. The NERC QUEST-GSI project • Overall aim: To examine and assess the implications of different rates and degrees of climate change for a wide range of ecosystem services across the globe • Our specific aims for global hydrology & water resources: A) To assess the global-scale consequences of different degrees of climate change on river runoff and water resources B) To characterise the uncertainty in the impacts associated with a given degree of climate change

  8. The NERC QUEST-GSI project • A) achieved by investigating impacts associated with the following 9 degrees of global warming relative to present: 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 4.0, 5.0 and 6.0 °C • B) achieved by exploring impacts with the climate change patterns associated with 21 different GCMs (climate model structural uncertainty) • Assessed impacts by applying the above climate change scenarios to the global hydrological model (GHM) Mac-PDM.09 • A global water balance model operating on a 0.5°x0.5° grid • Reads climate data on precipitation, temperature, humidity, wind speed & cloud cover as input
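Together these give 9 x 21 = 189 model runs. A minimal shell sketch of enumerating the scenario combinations (GCM names abbreviated and the list truncated here for brevity; the full experiment uses all 21 GCMs):

```
# Enumerate the 189 (9 x 21) scenario combinations into a run list.
for dt in 0.5 1.0 1.5 2.0 2.5 3.0 4.0 5.0 6.0; do
  for gcm in HadCM3 CGCM31 IPSL_CM4 ECHAM5 CCSM30; do  # ... plus the remaining GCMs
    echo "${gcm} ${dt}" >> run_list.txt
  done
done
```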

  9. The challenge
  • Running on a Linux desktop: 1 run = 4 hours
  • 1st-priority runs: 9 runs = 36 hours
  • 1st- to 3rd-priority runs (cumulative): 63 runs = 252 hours (~11 days)
  • All 189 runs (priorities 1-4): 756 hours (~32 days)
  • Running on the Campus Grid: 189 runs = 9 hours
  Run priorities, by GCM used to provide climate data and prescribed temperature change (0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 4.0, 5.0 and 6.0 °C):
  • Priority 1: UKMO HadCM3, at all nine temperatures
  • Priority 2: CCCMA CGCM31, IPSL CM4, MPI ECHAM5, NCAR CCSM30, UKMO HadGEM1 and CSIRO MK30, at 2.0 °C
  • Priority 3: the same six GCMs, at the other eight temperatures
  • Priority 4: CCSR MIROC32HI, CCSR MIROC32MED, CNRM CM3, GFDL CM21, GISS MODELEH, GISS MODELER, INM CM30, MRI CGCM232A, GFDL CM20, NCAR PCM1, BCCR BCM20, CCCMA CGCM31T63, GISS AOM and CSIRO MK5, at all nine temperatures
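The slides do not show the Condor job description, so the following is only a hypothetical sketch of a vanilla-universe submit file queuing one job per scenario (run_macpdm.sh is an assumed wrapper that maps the job number to a GCM/temperature pair):

```
# macpdm.submit -- hypothetical Condor submit description.
# run_macpdm.sh maps $(Process), 0-188, to one scenario from run_list.txt.
universe   = vanilla
executable = run_macpdm.sh
arguments  = $(Process)
output     = logs/run_$(Process).out
error      = logs/run_$(Process).err
log        = logs/macpdm.log
queue 189
```

Submitted with `condor_submit macpdm.submit`, this queues all 189 runs at once and lets the Grid schedule them across idle nodes.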

  10. Modifications to Mac-PDM.09 and the Campus Grid

  11. Modifications to Mac-PDM.09 • Climate change scenarios were previously downloaded from the Climatic Research Unit (CRU) at UEA and re-formatted to be compatible with Mac-PDM.09 • Around 800 MB of climate forcing data is needed for 1 Mac-PDM.09 simulation • Therefore ~160 GB would be needed for 189 simulations • The ClimGen code was integrated within Mac-PDM.09 as a subroutine to avoid this downloading • All Fortran code was made compatible with the GNU Fortran compiler • But the large data requirements meant the Campus Grid storage was not adequate…
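The build itself is not shown in the slides; assuming source files named macpdm09.f90 and climgen.f90, a GNU Fortran build might look like this:

```
# Hypothetical build of Mac-PDM.09 with the integrated ClimGen
# subroutine, using the GNU Fortran compiler:
gfortran -O2 -o macpdm macpdm09.f90 climgen.f90
```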

  12. Campus Grid data management • Total Grid storage is only 600 GB, shared by all users; 160 GB was not always available. • The solution chosen was the SSH Filesystem (SSHFS, http://fuse.sourceforge.net/sshfs.html) • The scientist's own file system was mounted on the Grid server via SSH. • Data was transferred on demand to/from the compute nodes via Condor's remote I/O mechanism.

  13. Campus Grid data management (2) [Diagram: using SSHFS to run models on the Grid with I/O to a remote file system. The scientist's data server in Reading hosts the large file system; the remote file system is mounted on the Grid server using SSHFS, with data transfer via SSH, and data then moves between the Grid server and the compute nodes via Condor. The Grid's own storage is not needed.]

  14. Campus Grid data management (3) SSHFS advantages: • The model remained unmodified, accessing data via the file system interface. • It is easy to mount remote data with SSHFS, using a single Linux command (illustrated below).
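As an illustration only (the host name and paths here are hypothetical), the mount and unmount commands take this form:

```
# Mount the scientist's file system on the Grid server over SSH:
sshfs user@dataserver.reading.ac.uk:/export/macpdm /mnt/macpdm -o reconnect

# ... jobs read forcing data from, and write results to, /mnt/macpdm ...

# Unmount when the runs are finished:
fusermount -u /mnt/macpdm
```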

  15. Campus Grid data management (4) Limitations of the SSHFS approach • The maximum number of simultaneous model runs was 60 for our models, implemented using a Condor Group Quota (sketched below) • All jobs can be submitted, but only 60 are allowed to run simultaneously • The limit is set by CPU load on the Grid server and the data server (Condor load and SSH load) • The software requires a system administrator to install • Linux is the only supported platform
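The quota configuration is not shown in the slides; the following is a sketch of how a 60-job Condor Group Quota is typically declared, with a hypothetical group name:

```
# On the Grid's central manager (condor_config), declare the group
# and cap it at 60 simultaneously running jobs:
GROUP_NAMES              = group_macpdm
GROUP_QUOTA_group_macpdm = 60

# Each job then opts into the group from its submit file:
+AccountingGroup = "group_macpdm.sgosling"
```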

  16. Campus Grid data management (5) Other approaches that were tried and failed • Lighter SSH encryption (Blowfish): no noticeable difference in performance • Models working on local copies of files, transferred to the compute nodes before runs: resulted in even more I/O for Condor, and jobs actually failed • Mounting the data on each compute node separately: jobs failed because the data server load was too high

  17. Results: Global Average Annual Runoff

  18. Multiple ensembles for various prescribed temperature changes [Figure: ensembles of 9, 18 and 81 model runs]

  19. The ensemble mean [Figure: ensemble-mean global average annual runoff, change from present (%)] But what degree of uncertainty is there?

  20. Uncertainty in simulations [Figure: number of models in agreement on an increase in runoff]

  21. Results: Catchment-scale Seasonal Runoff • The Yangtze • The Liard • The Okavango

  22. Seasonal Runoff [Figure annotations: agreement on increased snowmelt-induced runoff; large uncertainty throughout the year; less certainty regarding wet-season changes; agreement on the dry season becoming drier]

  23. Results: Global Water Resources Stresses

  24. Calculating stresses • A region is stressed if water availability is less than 1,000 m³/capita/year • Stress therefore varies with population growth and water extraction • Stress was calculated for 3 population scenarios in the 2080s: SRES A1B, SRES A2 and SRES B2 • And for the different prescribed warmings (0.5 to 6.0 °C), as formalised below
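Stated as a formula (the notation is ours, not from the slides), with R a region's annual available runoff in m³/yr and P its projected population:

\[
\text{region stressed} \iff \frac{R}{P} < 1000\ \mathrm{m^{3}\,capita^{-1}\,yr^{-1}}
\]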

  25. Global water resources stresses [Figure: global increase in water stress with the 2080s A1B population]

  26. The range of uncertainty [Figure: global increase in water stress with the 2080s A1B population]

  27. Conclusions • HTC on the Campus Grid reduced the total simulation time from 32 days to 9 hours • This allowed a comprehensive investigation of uncertainty in climate change impacts • Previous assessments have only partly addressed climate modelling uncertainty • e.g. 7 GCMs for global runoff • e.g. 21 GCMs for a single catchment (we looked at 65,000) • Results demonstrate: • GCM structure is a major source of uncertainty • The sign and magnitude of runoff changes vary across GCMs • For water resources stresses, uncertainty from population change is relatively minor

  28. Further developments • Several other simulations have just been completed on the Campus Grid and are now being analysed: • NERC QUEST-GSI project: a 204-member simulation • 3 future time periods, 4 emissions scenarios, 17 GCMs (3x4x17=204) • 816 hours on a Linux desktop vs. 10 hours on the Campus Grid • AVOID research programme (www.avoid.uk.net): uses the climate change scenarios included in the Committee on Climate Change report • a 420-member simulation • 4 future time periods, 5 emissions scenarios, 21 GCMs (4x5x21=420) • 70 days on a Linux desktop vs. 24 hours on the Campus Grid • A 1,000-member simulation is planned to explore GHM uncertainty

  29. Further developments Forcing repositories at other institutes • Forcing = hydrological model input • Avoid making local copies in Reading • Additional technical challenges: • Larger volumes of data (GCMs not run locally) • Slower network connections (for some repositories) • Sharing storage infrastructure with more users • No direct SSH access to the data

  30. Further developments • Possible solutions • Mount repositories on compute nodes with Parrot (http://www.cse.nd.edu/~ccl/software/parrot); this technique is used by CamGrid • Parrot talks to FTP, GridFTP, HTTP, Chirp and others • No SSH encryption overheads • May need to stage in a subset of forcing data before runs • Options include Stork (http://www.storkproject.org/) • A sketch of the Parrot approach follows
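A minimal sketch of the Parrot approach (the host and path below are hypothetical): parrot_run interposes on the model's system calls, so remote repositories appear as ordinary file paths without modifying the model.

```
# Run the unmodified model under Parrot (from the CCTools suite);
# forcing files can then be opened through virtual paths such as
#   /http/forcing.example.ac.uk/quest-gsi/precip.dat
parrot_run ./macpdm
```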

  31. Acknowledgements The authors would like to thank David Spence and the Reading Campus Grid development team at the University of Reading for their support of this project. Thank you for your time. Visit www.walker-institute.ac.uk
