The Global Coastal Ocean Modelling System
Jason Holt, James Harle, Sarah Wakelin, Sylvain Michel, Roger Proctor (POL)
Mike Ashworth, Stephen Pickles, Crispian Batstone (STFC)
Keith Haines, Dan Bretherton, Jon Blower (ESSC)
Icarus Allen, Rob Holmes, Jerry Blackford, Katy Lewis (PML)
The GCOMS mission: Jan '06 - Oct '09
• Develop a system to automatically configure and run regional shelf-sea models for any/all coastal regions around the world
• Motivation:
  • the role of shelf seas in global biogeochemical cycles
  • down-scaling climate change impacts, e.g. fish catch (Watson, R. and D. Pauly, 2001, Nature 414, 534-536)
[Figure: satellite estimate of primary production]
The Global Coastal Ocean Modelling System
• A practical solution available now:
  • split the coastal seas into several domains (42 domains in total)
  • deploy an automatically configured regional model in each: ~7 km resolution, 40 levels
  • run each independently, one-way nested in a global GCM (see the sketch below)
• Splits the problem into computationally tractable parts
• Needs 7-70 times the computational resource of the global ocean
See Holt et al. 2009, Modelling the Global Coastal Ocean, Phil. Trans. Roy. Soc. A
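To illustrate the one-way nesting step, here is a minimal sketch of interpolating a coarse global-model field onto the open-boundary points of a regional domain. The grids, the field, and the function name are illustrative assumptions, not the actual POLCOMS nesting code.

```python
# Illustrative sketch of one-way nesting: interpolate a coarse global-model
# field onto the open-boundary points of a regional shelf-sea domain.
# Grids, field and names are placeholders, not the POLCOMS interfaces.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def boundary_conditions(global_lon, global_lat, global_field, bdy_lon, bdy_lat):
    """Interpolate a 2-D global field (e.g. SST) onto regional
    open-boundary points (bdy_lon, bdy_lat)."""
    interp = RegularGridInterpolator(
        (global_lat, global_lon), global_field,
        bounds_error=False, fill_value=None)   # linear, extrapolate at grid edges
    return interp(np.column_stack([bdy_lat, bdy_lon]))

# Stand-in 1-degree global grid and a roughly 7 km regional open boundary
glon = np.arange(-180.0, 180.0, 1.0)
glat = np.arange(-90.0, 90.5, 1.0)
field = np.random.rand(glat.size, glon.size)     # placeholder for GCM output
bdy_lon = np.linspace(-20.0, -10.0, 112)         # ~7 km spacing at 45N
bdy_lat = np.full_like(bdy_lon, 45.0)
bc = boundary_conditions(glon, glat, field, bdy_lon, bdy_lat)
```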
POLCOMS-ERSEM: a model for the Global Coastal Ocean
• Technically well developed for the task:
  • dynamic memory
  • run-time domain decomposition
  • arbitrary open boundary shape
  • runs on all computer platforms
• Tested in operational oceanography: www.metoffice.gov.uk/reasearch/ncof/mrcs/browser.html
• ERSEM: European Regional Seas Ecosystem Model
Holt and Proctor, JGR, 2008
The GCOMS System
[Schematic of the system architecture]
• Compute resources: platforms ranging from ~300 to 2,000 to 22,000 cores
• Job control: G-Rex, SSHFS, shell scripts (an orchestration sketch follows below)
• Global Forcing Archive and validation data sources
• Domain Generator: Matlab
• Visualisation, validation and post-processing: Matlab, Perl, SSHFS
• Users and collaborators
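As an assumption-laden illustration of how the pieces above fit together, the sketch below shows a simple orchestration loop: generate a domain, stage forcing data, and submit the run. The script names, the Matlab function, and the paths are hypothetical; the real system uses Matlab, Perl, shell scripts, SSHFS and G-Rex rather than these calls.

```python
# Hypothetical orchestration loop for the workflow sketched above.
# Script names, the Matlab function and paths are placeholders only.
import subprocess
from pathlib import Path

DOMAINS = ["NORD03", "BERS15", "INDO23"]        # a subset of the 42 domains
FORCING = Path("/mnt/sshfs/global_forcing")     # SSHFS-mounted forcing archive

def run_domain(name: str) -> None:
    workdir = Path("runs") / name
    workdir.mkdir(parents=True, exist_ok=True)
    # 1. Generate the regional grid and bathymetry (Matlab domain generator).
    subprocess.run(["matlab", "-batch", f"generate_domain('{name}')"], check=True)
    # 2. Stage boundary forcing for this domain from the global archive.
    subprocess.run(["rsync", "-a", f"{FORCING / name}/", str(workdir)], check=True)
    # 3. Submit the POLCOMS-ERSEM job to the compute resource.
    subprocess.run(["./submit_polcoms.sh", name, str(workdir)], check=True)

for domain in DOMAINS:
    run_domain(domain)
```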
Modes of execution
• Isolated domains: independent MPI jobs on cluster/Grid resources
• Sequential domains: MPI jobs on cluster/Grid resources, passing boundary conditions (b.c.'s) between domains
• Ensembles: MPI jobs on HPC, either isolated or communicating (see the sketch below)
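A minimal sketch, assuming mpi4py (POLCOMS itself is a Fortran/MPI code), of how several domains can share a single MPI job: the world communicator is split so each domain runs on its own sub-communicator, matching the "isolated" and "communicating" modes above.

```python
# Minimal sketch, assuming mpi4py, of packaging several domains into one
# MPI job by splitting the world communicator into per-domain groups.
from mpi4py import MPI

world = MPI.COMM_WORLD
n_domains = 4                                  # e.g. a package of 4 domains
color = world.Get_rank() % n_domains           # assign each rank to a domain
domain_comm = world.Split(color=color, key=world.Get_rank())

# Each domain runs its own decomposition on domain_comm ("isolated" mode)...
print(f"domain {color}: rank {domain_comm.Get_rank()}/{domain_comm.Get_size()} "
      f"(world rank {world.Get_rank()})")

# ...while the world communicator remains available for inter-domain
# exchange, e.g. passing boundary conditions ("communicating" mode).
world.Barrier()
```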
Domain coupling and cross-site runs using MPIg
• Experiments using MPIg on the National Grid Service (Manchester and Leeds), with 2 domains of similar size
• On a single host, the overhead of coupling domains is small
• The overhead of cross-site runs (domains placed on different hosts, communicating via MPIg) is significant, but not prohibitive
• In practice, enough cores to run both domains are readily available on the same machine
Monolithic ensemble execution
[Figure: 1-month runs on 208 processors of HPCx, shown against domain size]
Load balancing by number of sea points
[Figure: load balance on 128 and 256 processors]
• At 516 processors and beyond, i/o reduces the load balance (computation is ~82% balanced)
• Some components scale with the total number of points
• In practice, package domains into groups of 4-6 (see the sketch below)
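A sketch of the load-balancing idea, assuming a simple proportional scheme: processors are shared among the domains packaged into one job in proportion to their sea-point counts. The counts and the domain grouping below are made up for illustration.

```python
# Sketch of balancing a packaged job by sea-point count: give each domain
# a processor share proportional to its number of wet points, so the
# domains in one job finish in roughly the same wall time.

def allocate_processors(sea_points: dict, total_procs: int) -> dict:
    total_points = sum(sea_points.values())
    alloc = {name: max(1, round(total_procs * pts / total_points))
             for name, pts in sea_points.items()}
    # Adjust the largest allocation so the total matches exactly.
    alloc[max(alloc, key=alloc.get)] += total_procs - sum(alloc.values())
    return alloc

# Illustrative sea-point counts for a package of 4 domains on 256 processors
group = {"NWAM13": 310_000, "CANA30": 260_000, "GGUI28": 180_000, "BENG27": 150_000}
print(allocate_processors(group, 256))
```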
A POLCOMS G-Rex service
• At POL: the G-Rex client and the POLCOMS model setup, including source code, work-flow scripts, input data and output from all runs
• On the remote cluster: the G-Rex server on port 9092 (the firewall port is open only to POL), plus the POLCOMS launch scripts and forcing data (the same for every run)
• Input and output pass between the two via G-Rex (HTTP); a shape-only client sketch follows below
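Purely for illustration, this is the general shape of a client talking to a service like G-Rex over HTTP on port 9092: inputs uploaded from POL, output streamed back. The hostname, endpoint paths and payloads are invented for this sketch and are not the actual G-Rex API.

```python
# Shape-only sketch of an HTTP client for a G-Rex-style service.
# Endpoints and hostname are hypothetical, not the real G-Rex interface.
import requests

SERVER = "http://remote-cluster.example.org:9092"   # placeholder host

def submit_run(setup_tarball: str) -> str:
    """Upload the model inputs and return a job identifier."""
    with open(setup_tarball, "rb") as f:
        resp = requests.post(f"{SERVER}/jobs", files={"input": f})
    resp.raise_for_status()
    return resp.json()["job_id"]

def fetch_output(job_id: str, dest: str) -> None:
    """Stream the run's output archive back to the client side."""
    resp = requests.get(f"{SERVER}/jobs/{job_id}/output", stream=True)
    resp.raise_for_status()
    with open(dest, "wb") as out:
        for chunk in resp.iter_content(chunk_size=1 << 20):
            out.write(chunk)
```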
QUEST-FISH: a real example of a GCOMS application
• "How would climate change affect the potential production of global fisheries resources in the future, compared with past and present scenarios, in the absence of exploitation?"
• 12 model domains (NORD03, BERS15, NEBS43, NEFL07, KURO24, NWAM13, CANA30, BBEN22, INDO23, GGUI28, BENG27, HUMB10) cover 20 Large Marine Ecosystems and >60% of global fish catch
• About 400M people get >50% of their animal protein from fish (FAO 2008)
Impacts of climate change on global fisheries
• One climate model and one emissions scenario: IPSL-CM4 with SRES A1B
• Time slices (3-year spin-up + 10-year production each):
  • Pre-industrial: 1864-1873
  • Present day: 1992-2001 (plus a reanalysis-forced run)
  • Near future: 2036-2044
  • Far future: 2086-2094
• That's 780 domain-years of integration (see the arithmetic below)! But there is a high degree of parallelism, e.g. it could run on ~13k processors/cores in about 10 days
• Runs are ongoing...
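The 780 domain-years follow directly from the design above; this is plain arithmetic using only the numbers on this slide.

```python
# Worked arithmetic for the total integration length quoted above.
domains = 12
runs_per_domain = 5        # pre-industrial, present day, reanalysis-forced, near future, far future
years_per_run = 3 + 10     # 3-year spin-up + 10-year production
print(domains * runs_per_domain * years_per_run)   # -> 780 domain-years
```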
Status of simulations
• Runs: PI - pre-industrial 1861-1873; PD - present day; 47 - SRES A1B 2047-2059; 82 - SRES A1B 2082-2094; N - reanalysis 1989-2001
• These runs were completed:
  • in ~6 months
  • by 1.5 people, including basic validation (SST, net PP)
  • using ~1.3M CPU hours on a Cray XT4
Reanalysis-forced runs 1992-2001: mean PP
[Figure: mean PP from the reanalysis-forced runs in each domain]
• These may not be the best models of each region, but they are all the same model
North West Africa: validation of sea surface temperature
[Figure: mean RMS error and mean bias error of SST]
Differences in net PP: far future versus pre-industrial
[Figure: change in net PP, in g C m⁻² yr⁻¹]
Key advantages of the GCOMS system
• Consistent regional inter-comparisons
• Ideal for forcing/parameter ensembles
• Flexibly adapts to the changing computational landscape: massively parallel, multi-core systems
• Operable by a small number of researchers
  • 'hand-made' regional models usually take 0.5-1 person each (or whole teams)
  • but the ability to 'tune' for each domain is limited
Limitations of the system
• Limited sea-ice capability
• Only polar coordinates
• Limited ecosystem b.c.'s
• Only 1-way nesting
• Trade-off between:
  • resolution/process representation improving simulations
  • boundary conditions degrading them
• Not as automated or flexible as we would like:
  • domain definition algorithm needs refining
  • I/O performance bottleneck
  • data handling needs work
NEMO-shelf
Next steps for GCOMS
• The CAPRI proposal: linking climate and socio-economic change to coastal flooding and fisheries (NERC consortium bid)