
PRISM

ESMF 3rd Community Meeting, Boulder, July 15, 2004. PRISM: An Infrastructure Project for Climate Research in Europe. By Nils Wedi @ ECMWF.


Presentation Transcript


  1. ESMF 3rd Community Meeting, Boulder, July 15, 2004. PRISM: An Infrastructure Project for Climate Research in Europe. By Nils Wedi @ ECMWF. Contributions by A. Caubel, P. Constanza, D. Declat, J. Latour, V. Gayler, E. Guilyardi, C. Larsson, S. Legutke, R. Redler, H. Ritzdorf, T. Schoenemeyer, S. Valcke, R. Vogelsang and many others in the PRISM community ...

  2. Overview • What is PRISM? • European Interest, partners • PRISM Objectives • PRISM Model Components • PRISM Approach • Architecture and User Interface • Current Status and Perspective

  3. What is PRISM? • PRogram for Integrated Earth System Modelling • A European project for climate modelling involving 22 partners, 12/2001 – 12/2004 • Funded by the European Commission (4.8 M€) • Involves state-of-the-art atmosphere, ocean, sea-ice, atmospheric chemistry, land-surface and ocean-biogeochemistry models

  4. PRISM partners • CSCS/ETH, Switzerland • INGV, Italy • MPI-BGC, Germany • PIK, Germany • ECMWF • UCL-ASTR, Belgium • NEC-ESS, Germany • FECIT/Fujitsu, France • SGI, Germany • SUN, Germany • NEC-CCRLE, Germany • MPI-M, Germany (Guy Brasseur, coordinator) • KNMI, The Netherlands (Gerbrand Komen, co-coordinator) • MPI-M&D, Germany • MetOffice, United Kingdom • UREADMY, United Kingdom • IPSL, France • Météo-France, France • CERFACS, France • DMI, Denmark • SMHI, Sweden • NERSC, Norway

  5. Help scientists spend more time on science. Provide software infrastructure to: • easily assemble earth system coupled models based on existing state-of-the-art component models • launch and monitor complex or ensemble earth system simulations • access, analyse and share results across a wide community. Define and promote technical and scientific standards for Earth System modelling.

  6. Technical and scientific standards • Scientific: global parameters; physical interfaces • Technical: coupler and I/O; data formats and grids; architecture and user interface; diagnostics and visualization; coding and quality

  7. PRISM model components • Atmosphere: Météo-France (ARPEGE), MPG-IMET (ECHAM), IPSL (LMDZ), MetOffice (Unified Model), UREADMY, INGV • Atmospheric Chemistry: MPG-IMET, UREADMY, IPSL, MetOffice, Météo-France, KNMI • Land Surface: IPSL (Orchidée), MetOffice, MPG-IMET, UREADMY, Météo-France (ISBA) • Coupler: CERFACS, NEC-CCRLE, FECIT, SGI, MPI-M&D • Regional Climate: SMHI, DMI, MetOffice • Sea Ice: NERSC, UCL-ASTR, MetOffice, IPSL, MPG-IMET • Ocean Biogeochemistry: MPI-BGC, IPSL, MPG-IMET, MetOffice • Ocean: UREADMY, MetOffice (FOAM), MPI-M (HOPE), IPSL (OPA/ORCA)

  8. ESMF - PRISM [Diagram: layered architectures compared, with the PRISM running environment and superstructure above the user code, which sits on the ESMF infrastructure.]

  9. Coupling software and its evolution in PRISM • Coupler: OASIS 3.0/4.0 (~10 years of experience) • PRISM System Model Interface Library: PSMILe • MPI1 or MPI2 direct communication between models on the same grid; otherwise repartitioning through a Transformer • modularity: prism_put() and prism_get() calls to implement in existing models (see the sketch below)
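To make the modularity point concrete, here is a minimal Fortran sketch of a model component calling the PSMILe. It follows the OASIS3-style "proto" API (prism_init_comp_proto, prism_put_proto, ...); the field name, dimensions and the commented-out definition calls are placeholders, and module names and routine signatures vary between OASIS versions, so treat this as a schematic rather than a verbatim interface.

    ! Schematic PSMILe usage (OASIS3-style "proto" API); module names and
    ! signatures vary by OASIS version -- consult its documentation.
    program toy_ocean
      use mod_prism_proto          ! PSMILe module (name is version-dependent)
      implicit none
      integer :: comp_id, var_id, ierr, date
      real    :: sst(100, 100)     ! coupling field on this process's partition

      call prism_init_comp_proto(comp_id, 'toyoce', ierr)  ! join the coupled system
      ! ... prism_def_partition_proto / prism_def_var_proto / prism_enddef_proto
      !     would declare the local grid partition and the field here, and
      !     prism_def_var_proto would return var_id ...

      sst = 290.0                  ! placeholder field values (Kelvin)
      do date = 0, 86400, 3600     ! one model day, hourly steps (seconds)
         ! prism_put hands the field to the coupler, to another model, or to
         ! a file, depending on the user's coupling configuration:
         call prism_put_proto(var_id, date, sst, ierr)
      end do

      call prism_terminate_proto(ierr)  ! leave the coupled system
    end program toy_ocean

The point of the design is visible here: the model only says "this field is available now"; whether the data is sent directly to another model, repartitioned through the Transformer, or written to a file is decided by the user configuration, not by the model code.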

  10. OASIS coupler: Ocean Atmosphere Sea Ice Soil • Historical review: developed since 1991 at CERFACS to couple existing GCMs. At the time: • models at relatively low resolution (~10,000-20,000 points) • small number of 2D coupling fields (~10) • low coupling frequency (~once/day) • flexibility was very important, efficiency not so much! Timeline (1991 → 2001 → PRISM): OASIS 1 → OASIS 2 → OASIS 3 → OASIS 4

  11. OASIS community • CGAM-Reading (UK) HadAM3 - ORCA2 • Southampton University (UK) Inter. Atm - OCCAM lite • UCL (Belgium) LMDz - CLIO • SMHI (Sweden) ECHAM - RCA, RCA (regional) - RCO (regional) • U. of Bergen (Norway) MM5 - ROMS • KNMI (Netherlands) ECHAM5 - MPI-OM • DMI (Denmark) ECHAM - HIRLAM • INGV (Italy) ECHAM5 - MPI-OM • IRI (USA) ECHAM4 - MOM3 • JAMSTEC (Japan) ECHAM4 - OPA 8.2 • BMRC (Australia) BAM - MOM4, BAM3 - ACOM2 • U. of Tasmania (Australia) Data Atm. - MOM4 • CAS, IIT Delhi (India) MM5 - POM

  12. OASIS community • CERFACS (France) ARPEGE3 - ORCA2LIM, ARPEGE3 - OPA 8.1, ARPEGE3 - OPAICE • METEO-FRANCE (France) ARPEGE4 - ORCA2, ARPEGE medias - OPAmed, ARPEGE3 - OPA 8.1, ARPEGE2 - OPA TDH • IPSL - LODYC, LMD, LSCE (France) LMDz - ORCA2LIM, LMDz - ORCA4, LMDz - OPA ATL3/ATL1, IFS - OPA 8.1, ECHAM4 - ORCA2 • MERCATOR (France) PAM (OPA) • MPI-M&D (Germany) ECHAM5 - MPI-OM, ECHAM5 - C-HOPE, PUMA - C-HOPE, EMAD - E-HOPE, ECHAM5 - E-HOPE, ECHAM4 - E-HOPE • ECMWF (UK) IFS Cy23r4 - E-HOPE, IFS Cy15r8 - E-HOPE

  13. Oasis2 and Oasis3: mono-process coupler • 2D scalar coupling field interpolation (SCRIP 1.4) • PRISM System Model Interface Library (PSMILe): coupling field exchange (MPI1 & MPI2) and I/O actions (GFDL mpp_io) [Diagram: ocean (O) and atmosphere (A) processes exchanging fields through the Oasis3 coupler or via files] • Flexibility, modularity: the coupler and PSMILe act according to a user-defined coupling configuration (text file) specifying the number of models and coupling fields, the coupling frequencies and transformations for each field, and the I/O or coupling mode (transparent to the model); a schematic configuration follows below
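In OASIS3 this user-defined text file is the "namcouple". The sketch below is schematic only: it shows the kind of information the file carries (models, fields, coupling periods, transformations) using namcouple-like keywords, but it is not the literal syntax, which differs between OASIS versions; the model and field names are placeholders.

    # Schematic coupling configuration (namcouple-like, not literal syntax)
    $NBMODEL
      2  toyatm  toyoce          # two models in this coupled system
    $RUNTIME
      2592000                    # total run length in seconds (30 days)
    $STRINGS
      SOSSTSST  SISUTESU  86400  sst.nc  EXPORTED
      # SST under its ocean-side name, delivered to the atmosphere once
      # per day; per-field transformations (interpolation, checks, ...)
      # would follow here
    $END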

  14. Oasis4 - new demands: the coupler needs to be optimised and parallelised • Higher resolution, parallel and scalable models • Higher coupling frequencies desirable • Higher number of models and (3D) coupling fields • Massively parallel platforms

  15. OASIS4 is composed of: • a Driver • a Transformer • a new PRISM System Model Interface Library

  16. Interface and data flow [diagram]

  17. OASIS4 [Diagram: parallel component processes (C), ocean processes (O1, O2), the Transformer (T) and I/O exchanging data with repartitioning] • MPI parallel communication including repartitioning • PSMILe: parallel multigrid 3D neighbourhood search and calculation of communication patterns in each source process; extraction of only the useful part of the source field • parallel I/O: single file or distributed files (GFDL mpp_io), parallel file (parNetCDF) • parallel Transformer: loops over PSMILe requests • flexibility and modularity same as Oasis3

  18. [Diagram: the three phases of assembling a coupled run from components ATM, OCE and LAND.] Definition phase: for each component, its author writes a PMIOD (Potential Model Input and Output Description) listing the fields it can exchange and their metadata (e.g. ATM: V1 in, V2 out, V3 out; OCE: V1 out, V2 in; LAND: V3 in, V4 in). Composition phase: the user writes an SMIOC (Specific Model Input and Output Configuration) per component, connecting each field to another component or to a file, with transformations (e.g. OCE V1 goes to ATM with transformation T1 and to fileV1; LAND V4 is read from fileV4). Deployment phase: the Driver reads the SCC (Specific Coupling Configuration) listing all components (ATM, OCE, LAND) and launches the run. A hypothetical XML fragment illustrating the composition phase follows below.
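As a concrete illustration of the composition phase, here is a hypothetical SMIOC-like XML fragment for the OCE component of the diagram. All element and attribute names are invented for illustration; the actual OASIS4/PRISM XML schemas define their own vocabulary.

    <!-- Hypothetical SMIOC-style fragment; element names are illustrative only -->
    <component name="OCE">
      <field name="V1" intent="out">
        <destination component="ATM" transformation="T1"/>  <!-- couple to ATM -->
        <destination file="fileV1.nc"/>                     <!-- also archive -->
      </field>
      <field name="V2" intent="in">
        <source component="ATM" transformation="T2"/>
      </field>
    </component>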

  19. Oasis – Current status • OASIS3_prism_2-2 available • OASIS4 prototype available • OASIS4 final PRISM version due 12/2004

  20. File formats and grids • NetCDF for grid and restart auxiliary files • CF convention under development, extending the COARDS conventions (a minimal example follows below) • XML for model and script meta-data input (replacing Fortran namelists and shell scripts)
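To illustrate what the CF convention adds on top of NetCDF, here is a minimal CDL-style header sketch. The variable and dimension names are illustrative; the attribute names (standard_name, units, Conventions) are the CF-defined ones.

    // Minimal CF-style NetCDF header (CDL); variable names are illustrative
    netcdf sst_example {
    dimensions:
      lat = 180 ; lon = 360 ; time = UNLIMITED ;
    variables:
      double time(time) ;
        time:units = "days since 2004-01-01" ;   // CF time encoding
      float sst(time, lat, lon) ;
        sst:standard_name = "sea_surface_temperature" ;  // from the CF standard-name table
        sst:units = "K" ;
    // global attributes:
      :Conventions = "CF-1.0" ;
    }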

  21. System Architecture and User Interface PRISM architecture to provide an efficient climate modelling infrastructure to users and developers through: • Standardised interfaces • Remote functionality • Centralised administration • Distributed resources

  22. Standard compile (SCE) and run environment (SRE) • Finalizing the SCE and SRE for the PRISM models • The system comprises 15 models (arpege_climat4, echam5, hamocc, lim, lmdz, mozart, mpi-om, oasis3, opa, orchidee, pisces, toy4opa, toyatm, toyche, toyoce), adapted to varying degrees to the PRISM standards, which can run in several combinations. • Most of the models have been tested on a variety of platforms (NEC SX, SGI MIPS or IA64, Fujitsu VPP, IBM power4). A tedious task without flashy graphics, but very useful!

  23. PrepIFS

  24. PrepIFS – earth system modeling via the Internet …

  25. OASIS4 - GUI support. XML is designed to be read by machines, not humans!

  26. SMS/WebCdp job scheduling and monitoring • Complex automated scheduling • Macro-parallelism • Flexible inter-dependencies • Interactive control • Visual structure of large systems • Used for all operational and research activities at ECMWF (~10 years)

  27. Further tools … • Diagnostics - PRISM processing and visualization software (COCO, CDAT/VCS and VTK); at ECMWF: MARS/Vis5D/Metview • Web GUI - database and diagnostics web interface using web-access server technology (DODS and LAS); at ECMWF: Web-MARS

  28. A PRISM sustained team • A document by the PRISM Steering Group proposing the establishment of a sustained PRISM team of 7 people was sent to the European climate modelling community (June 2004) • First preparation meeting: August 17th, 2004 • Target: signature of the Consortium Agreement: 01/2005 • MPI (Germany), CERFACS (France), ECMWF (EU), CNRS (France), MetOffice (UK), NCAS (UK), CCRLE (Germany) have already expressed strong interest. • Additional FP6 funding (March 2005)? Enthusiasm is still high!

  29. Further information http://prism.enes.org

  30. Oasis4: in the prototype (05/2004) • Access and use of XML information • Coupling and I/O of n parallel applications with m components • Coupling exchange with repartitioning, direct or through the Transformer • Interpolations: PSMILe (non-exact) parallel neighbourhood search; 2D/3D nearest-neighbour, 2D linear, 3D linear • I/O: single and parallel mode • Coupling and I/O exchange from one source to many targets • Local transformations (scatter, gather, add or multiply by a scalar, statistics) • Basic time transformations (average, accumulation, min, max)

  31. Oasis4: still to be done • PSMILe API for model access to SCC and SMIOC information • Interpolation: • More schemes (conservative, 3D, etc.) • Exact parallel neighbour search • Transformer parallelisation (almost completed) • Field reduction, combination • Full support of vector and bundle fields (I/O OK) • I/O: distributed mode (parNetCDF) • Adaptive grids • Unstructured grids
