The PRISM infrastructure for Earth system models Eric Guilyardi, CGAM/IPSL and the PRISM Team • Background and drivers • PRISM project achievements • The future
Why a common software infrastructure ? • Earth system modelling expertise widely distributed, both geographically and thematically
Why a common software infrastructure ? • Earth system modelling expertise widely distributed • Scientific motivation = facilitate sharing of scientific expertise and of models • Technical motivation = the technical challenges are large compared with available effort • Need to keep scientific diversity while increasing efficiency, both scientific and technical • Need for concerted effort in view of initiatives elsewhere: • The Frontier Project, Japan • The Earth System Modelling Framework, US
PRISM concept: "Share Earth System Modelling software infrastructure across the community" To: • share development, maintenance and support • aid performance on a variety of platforms • standardize the model software environment • ease use of different climate model components
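The coupling idea behind this concept can be sketched in miniature. The toy below is purely illustrative (the actual PRISM coupler is OASIS, with a Fortran interface; these function names and numbers are invented for the sketch): a driver passes fields between two components each coupling step, so neither model needs to know the other's internals, only the exchanged fields.

```python
# Toy coupled system (illustrative only, not the OASIS API):
# the driver mediates every exchange, which is what lets the
# two components be developed and swapped independently.

def atmosphere_step(sst):
    """Toy atmosphere: returns a heat flux from the SST it receives."""
    return 0.1 * (290.0 - sst)

def ocean_step(heat_flux):
    """Toy ocean: returns an updated SST driven by the received flux."""
    return 288.0 + heat_flux

def run_coupled(n_steps):
    """Driver loop: exchange fields between the components each step."""
    sst = 288.0                        # initial sea-surface temperature (K)
    for _ in range(n_steps):
        flux = atmosphere_step(sst)    # driver hands SST to the atmosphere
        sst = ocean_step(flux)         # driver hands the flux to the ocean
    return sst
```

Because the exchange is mediated by the driver, either toy component could be replaced by a different implementation with the same field interface, which is the point the slide makes about easing the use of different climate model components.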
Expected benefits • high-performance ESM software, developed by dedicated IT experts, available to institutes/teams at low cost: • - helps scientists to focus on science • - helps keep scientific diversity (survival of smaller groups) • Easier to assemble ESMs based on community models • shared infrastructure = increased scientific exchanges • computer manufacturers inclined to contribute: • - efficiency (porting, optimisation) on a variety of platforms • - next generation platforms optimized for ESM needs • - easier procurements and benchmarking • - reduced computing costs
Software structure of an Earth System Model [Diagram: nested layers – scientific core (i.e. f90), coupling infrastructure, running environment, supporting software – showing what can be shared]
The long term view: towards standard ESM support library(ies) [Diagram: Today, the modeller and IT expert maintain the Earth System model (science + support + environment) on top of the Fortran compiler and hardware. Tomorrow, climate science work targets the Earth System model (science only), built on a standard support library (incl. environment), the Fortran compiler and hardware.]
The PRISM project • Programme for Integrated Earth System Modelling • 22 partners • 3 years, Dec 2001 to Nov 2004 • €5 million funding under the EC's FP5 (~80 person-years) • Coordinators: G. Brasseur and G. Komen
System specifications [Diagram: inputs feeding the PRISM infrastructure] • The science: general principles; constraints from physical interfaces, … • The modellers/users: requirements, beta testing, feedback • The technical developments: coupler and I/O; compile/run environment; GUI; visualisation and diagnostics • The community models: atmosphere, atmospheric chemistry, ocean, ocean biogeochemistry, sea-ice, land surface, … Let's NOT re-invent the wheel !
System specifications - the people
Reinhard Budich - MPI, Hamburg
Andrea Carril - INGV, Bologna
Mick Carter - Hadley Centre, Exeter
Patrice Constanza - MPI/M&D, Hamburg
Jérome Cuny - UCL, Louvain-la-Neuve
Damien Declat - CERFACS, Toulouse
Ralf Döscher - SMHI, Stockholm
Thierry Fichefet - UCL, Louvain-la-Neuve
Marie-Alice Foujols - IPSL, Paris
Veronika Gayler - MPI/M&D, Hamburg
Eric Guilyardi* - CGAM, Reading and LSCE
Rosalyn Hatcher - Hadley Centre, Exeter
Miles Kastowsky - MPI/BGC, Jena
Luis Kornblueh - MPI, Hamburg
Claes Larsson - ECMWF, Reading
Stefanie Legutke - MPI/M&D, Hamburg
Corinne Le Quéré - MPI/BGC, Jena
Angelo Mangili - CSCS, Zurich
Anne de Montety - UCL, Louvain-la-Neuve
Serge Planton - Météo-France, Toulouse
Jan Polcher - LMD/IPSL, Paris
René Redler - NEC CCRLE, Sankt Augustin
Martin Stendel - DMI, Copenhagen
Sophie Valcke - CERFACS, Toulouse
Peter van Velthoven - KNMI, De Bilt
Reiner Vogelsang - SGI, Grasbrunn
Nils Wedi - ECMWF, Reading
* Chair
PRISM achievements (so far): • Software environment (the tool box): • a standard coupler and I/O software, OASIS3 (CERFACS) and OASIS4 • a standard compiling environment (SCE) at the scripting level • a standard running environment (SRE) at the scripting level • a Graphical User Interface (GUI) to the SCE (PrepIFS, ECMWF) • a GUI to the SRE for monitoring the coupled model run (SMS, ECMWF) • standard diagnostic and visualisation tools • Adaptation of community Earth System component models (GCMs) and demonstration coupled configurations • A well co-ordinated network of expertise • Community buy-in and trust-building
The PRISM shells [Diagram: from outer to inner shells – Standard Running Environment, Standard Compiling Environment, PSMILe (coupling and I/O), and the scientific core with its historic I/O]
Adapting Earth System Components to PRISM [Diagram: levels of adaptation – + PSMILe (PRISM Model Interface Library), + PMIOD (Potential Model IO Description), SCE, SRE, User Interface]
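The first level of adaptation, instrumenting a model's existing time loop with PSMILe coupling calls, can be sketched as follows. This is a minimal Python illustration with hypothetical `get_field`/`put_field` hooks and an assumed coupling period; the real PSMILe is a Fortran library and its calls differ.

```python
# Sketch of model-side coupling hooks (hypothetical names, not the
# actual PSMILe interface): the component keeps its own time loop and
# only exchanges fields when the coupling period is reached.

COUPLING_PERIOD = 4  # exchange every 4 model timesteps (assumed value)

def run_component(n_steps, get_field, put_field):
    """Run a toy component for n_steps, coupling via the two hooks."""
    state = 0.0
    forcing = get_field(0)              # initial exchange at start-up
    for step in range(1, n_steps + 1):
        state += forcing                # placeholder for model physics
        if step % COUPLING_PERIOD == 0:
            put_field(step, state)      # send our field to the coupler
            forcing = get_field(step)   # receive the partner's field
    return state
```

The design point the slide makes is visible here: the scientific core is untouched, and coupling is confined to a thin layer of get/put calls at the model's physical interface.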
Configuration management and deployment [Diagram: User Interface and Driver controlling the SCE and SRE; binary executables and Transf. disks]
PRISM GUI remote functionality [Diagram: the User reaches PrepIFS/SMS web services over the Internet; configurations are deployed and transferred from the PRISM Repositories (CSCS, MPI) to drivers at instrumented sites running Architectures A, B and C]
Standard scripting environments • Standard Compiling Environment (SCE) • Standard Running Environment (SRE)
Demonstration experiments • Assembled coupled models • Platforms • CGAM contribution (Jeff Cole)
Development coordinators • The coupler and I/O - Sophie Valcke (CERFACS) • The standard environments - Stefanie Legutke (MPI) • The user interface and web services - Claes Larsson (ECMWF) • Analysis and visualisation - Mick Carter (Hadley Centre) • The assembled models - Stefanie Legutke (MPI) • The demonstration experiments - Andrea Carril (INGV)
Community buy-in • Growing ! • Workshops and seminars • 15 pioneer models adapted (institutes' involvement) • 9 test supercomputers instrumented • Models distributed under the PRISM env. (ECHAM5, OPA 9.0) • Community programmes relying on the PRISM framework (ENSEMBLES, COSMOS, MERSEA, GMES, NERC, …) • To go further: • PRISM perspective: maintain and develop the tool box • Institute perspective: get the timing and involvement in the next steps right
Collaborations • Active collaborations: • ESMF (supporting software, PMIOD, MOM4) • FLUME (PRISM software) • PCMDI (visualisation, PMIOD) • CF group (CF names) • NERC (BADC & CGAM) (meta-data, PMIOD) • M&D, MPI (data) • Earth Simulator (install PRISM system V.0) PRISM has put Europe in the loop for community-wide convergence on basic standards in ES modelling
The future • PRISM has delivered a tool box, a network of expertise and demonstrations • Community buy-in is growing • Key need for sustainability of: • tool box maintenance/development (new features) • the network of expertise • PRISM sustained initiative (set-up meeting held in Paris, Oct 27 2004)