
The impacts of climate change on global hydrology and water resources
Simon Gosling and Nigel Arnell, Walker Institute for Climate System Research, University of Reading
Dan Bretherton and Keith Haines, Reading e-Science Centre, University of Reading
Summary of current research
•Explores how uncertainties associated with climate change propagate through to estimated changes in the global hydrological cycle and in water resources stresses under climate change.
•The application of both pattern-scaling and High Throughput Computing (HTC) is central to the research.
Changes in the global hydrological cycle
Different modelling institutes use different plausible representations of the climate system within their global climate models (GCMs), giving a range of climate projections for a single emissions scenario. One method of accounting for this "climate model structural uncertainty" in climate change impacts assessment is to use this range of projections from the ensemble of plausible GCMs to produce an ensemble of impacts projections. First, the patterns of climate change associated with globally averaged warmings of 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 4.0, 5.0 and 6.0°C (relative to 1961-1990) are identified for each of 21 GCMs, giving 189 patterns in all. These patterns are pre-generated from the GCM output by the ClimGen model developed at UEA. This represents a large reduction in input data for the future impacts modelling, although the approach assumes that the pattern of climate change simulated by a given GCM is relatively constant under a range of rates and amounts of global warming. Using GCM output directly in future work would avoid this assumption. The patterns are then applied to Mac-PDM.09, a global hydrological model (GHM). To run the ensemble of GHM simulations with the different patterns of climate change, a Grid computing solution is used. Figure 1 shows some of the results.
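The pattern-scaling step can be illustrated with a short sketch. This is only an illustration under simplifying assumptions: the variable names and data structures below are hypothetical, the real ClimGen patterns and Mac-PDM.09 inputs are gridded climate files, and precipitation patterns are often expressed as percentage changes and applied multiplicatively rather than additively.

    # Sketch of pattern scaling: a per-degree change pattern from one GCM is
    # scaled by a prescribed global-mean warming and added to the 1961-1990
    # baseline climatology (additive form shown for simplicity).
    WARMING_LEVELS = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 4.0, 5.0, 6.0]  # degrees C

    def apply_pattern(baseline, pattern_per_degree, delta_t_global):
        """Return a scenario climate field for one GCM pattern and one
        prescribed global-mean warming level."""
        return baseline + pattern_per_degree * delta_t_global

    def build_scenarios(baseline, patterns_by_gcm):
        """21 GCM patterns x 9 warming levels = 189 forcing scenarios in all."""
        scenarios = {}
        for gcm, pattern in patterns_by_gcm.items():
            for dt in WARMING_LEVELS:
                scenarios[(gcm, dt)] = apply_pattern(baseline, pattern, dt)
        return scenarios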
Figure 1. (A) Ensemble-mean change from present in average annual runoff. (B) Number of GCMs showing an increase in average annual runoff. Both are for a global-average temperature rise of 2°C.
Effects on water resources
A water-resources model, which assumes that watersheds with <1000 m³/capita/year are water-stressed, is used to assess global water resources stresses under different assumptions about future population change and global warming. Figure 2 shows some of the results.
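A minimal sketch of that stress indicator is shown below. The per-watershed field names are hypothetical, and the definition of an "increase in water stress" is simplified here to watersheds that newly cross the 1000 m³/capita/year threshold.

    # Water-stress indicator: a watershed is classed as water-stressed when
    # annual water availability falls below 1000 m3/capita/year.
    STRESS_THRESHOLD = 1000.0  # m3 per capita per year

    def is_stressed(annual_runoff_m3, population):
        return population > 0 and annual_runoff_m3 / population < STRESS_THRESHOLD

    def pct_population_with_increased_stress(watersheds):
        """Percentage of global population in watersheds that are not stressed
        in the baseline but become stressed under the scenario. Each watershed
        is a dict with 'population', 'baseline_runoff_m3' and
        'scenario_runoff_m3' (hypothetical field names)."""
        total = sum(w["population"] for w in watersheds)
        newly_stressed = sum(
            w["population"] for w in watersheds
            if is_stressed(w["scenario_runoff_m3"], w["population"])
            and not is_stressed(w["baseline_runoff_m3"], w["population"])
        )
        return 100.0 * newly_stressed / total if total else 0.0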
Figure 2. Percentage of the global population that experiences increases in water stress for different degrees of global warming.

Running models on the Reading Campus Grid
The Grid is a Condor Pool (http://www.cs.wisc.edu/condor) containing 200-500 library and lab computers.
•The Campus Grid enables many model runs to take place simultaneously. This is an example of High Throughput Computing.
•The time taken for 189 runs was reduced from 32 days (on a single computer) to 9 hours (see the example job description after this list).
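For illustration only, a Condor (HTCondor) submit description of the following form could queue the 189 independent runs on the pool. The executable, argument and file names are placeholders, not the actual job scripts used in this work.

    # Hypothetical HTCondor submit description: one Mac-PDM.09 run per
    # GCM-pattern/warming-level combination.
    universe                = vanilla
    executable              = run_macpdm.sh
    arguments               = scenario_$(Process)
    output                  = logs/run_$(Process).out
    error                   = logs/run_$(Process).err
    log                     = logs/condor.log
    should_transfer_files   = YES
    when_to_transfer_output = ON_EXIT
    queue 189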
Two main challenges arose in running the GCM + Mac-PDM models on the Grid:
•1. Minimal changes had to be made to the models for Grid execution.
•2. There is a large amount of input and output: 160GB of storage is required for 189 runs.
This would increase greatly if GCM forcing were used directly for the GHM simulations.
Total Grid storage is only 600GB, shared by all users, so 160GB is not always available. The solution chosen was the SSH File System (SSHFS - http://fuse.sourceforge.net/sshfs.html), as shown in Figure 3.
•The scientist's own file system was mounted on the Grid server via SSH. Data is transferred on demand to/from compute nodes via Condor's remote I/O mechanism.
•The model remained unmodified, accessing data via the file system interface.
•It is easy to mount remote data with SSHFS, using a single Linux command (shown below).
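For reference, mounting and unmounting with SSHFS looks like the following; the host and path names are placeholders rather than the actual Reading servers.

    # Mount the remote file system over SSH (placeholder host and paths)
    sshfs user@dataserver.example.ac.uk:/data/forcing /mnt/forcing
    # ... Grid jobs then read and write under /mnt/forcing ...
    # Unmount when finished
    fusermount -u /mnt/forcing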
Figure 3. Using SSHFS to run models on the Grid with I/O to a remote file system.
[Diagram: the scientist's large file system on a data server in Reading is mounted on the Grid server via SSHFS; data is transferred to and from Campus Grid compute nodes over SSH, so Grid storage is not needed.]
SSHFS limitations and alternatives
•The maximum number of simultaneous model runs was 60 for our models, implemented using a Condor Group Quota. This allowed us to submit all jobs, but only 60 were allowed to run at the same time.
•Throughput is limited by Grid and data server load.
•SSH access to the data server is required, which is not always possible for other institutes' data.
•The software requires a system administrator to install.
We are now experimenting with Parrot (http://www.cse.nd.edu/~ccl/software/parrot), following earlier work by CamGrid at the University of Cambridge. Parrot is another way to mount remote data. It talks to HTTP, FTP, GridFTP and other remote I/O services, so SSH access to the data is not required.
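As a rough sketch of how this could look (the host, path and executable names below are placeholders), Parrot intercepts a program's I/O calls so that remote services appear as ordinary file paths:

    # Run the unmodified model under Parrot; files under /http/<host>/<path>
    # are fetched from the remote HTTP server on demand.
    parrot_run ./mac_pdm /http/data.example.org/forcing/scenario_001.dat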
Further work
We would like to run the hydrological model with climate data forcing stored in repositories at various institutes. Running climate simulations locally would then not be necessary, but the amount of data transfer involved would be much larger. We would like the running models to access the forcing repositories directly, to avoid storing copies of all the forcing data sets locally. Data transfer would then be over much larger distances, with slower network connections in some cases. Current e-research effort is focussed on these challenges, and we also plan to apply the techniques to other models and other Grids.