Earth System Modeling Framework
Workshop on "Coupling Technologies for Earth System Modelling: Today and Tomorrow"
CERFACS, Toulouse (France) – December 15–17, 2010
Ryan O'Kuinghttons, Robert Oehmke, Cecelia DeLuca
Motivation
• In climate research and numerical weather prediction: increased emphasis on detailed representation of individual physical processes, which requires many teams of specialists to contribute components to an overall modeling system
• In computing technology: increasing hardware and software complexity in high-performance computing as we shift toward scalable computing architectures
• In software: emergence of frameworks to promote code reuse and interoperability
The ESMF is a focused community effort to tame the complexity of models and the computing environment. It leverages, unifies, and extends existing software frameworks, creating new opportunities for scientific contribution and collaboration.
Evolution
Phase 1 (2002-2005): NASA's Earth Science Technology Office ran a solicitation to develop an Earth System Modeling Framework (ESMF). A multi-agency collaboration (NASA/NSF/DOE/NOAA) won the award, and the core development team was located at NCAR. A prototype ESMF software package (version 2r) demonstrated feasibility.
Phase 2 (2005-2010): New sponsors included the Department of Defense and NOAA. Many new applications and requirements were brought into the project, motivating a complete redesign of framework data structures (version 3r).
Phase 3 (2010-2015): The core development team moved to NOAA/CIRES for closer alignment with federal models. Basic framework development will be complete with version 5r (ports, bugs, feature requests, interoperability templates, user support, etc. still require resources). The focus is on increasing adoption and creating a community of interoperable codes.
Components
• ESMF is based on the idea of components: functionally distinct sections of code that are wrapped in standard interfaces
• Components may represent either a physical domain or a function
• Components can be arranged hierarchically, helping to organize the structure of complex models
[Figure: ESMF components in the GEOS-5 atmospheric GCM]
Architecture
• ESMF provides a superstructure for assembling geophysical components into applications
• ESMF provides an infrastructure that modelers use to
  • Generate and apply interpolation weights
  • Handle metadata, time management, data I/O and communications, and other functions
  • Access third-party libraries
[Diagram: layered ESMF architecture – the ESMF superstructure (components layer: gridded components and coupler components) sits above the user-code model and fields-and-grids layers, which rest on the ESMF infrastructure (low-level utilities) and external libraries such as MPI and NetCDF]
A minimal driver sketch showing how the superstructure and infrastructure pieces fit together follows below.
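As a rough illustration of the layering above, here is a minimal, hedged driver sketch: it creates two hypothetical gridded components (ATM, OCN) and a coupler, registers their entry points, builds the States and Clock they exchange, and steps them through a run loop. The component modules (atm_mod, ocn_mod, cpl_mod), component names, and state names are placeholders, and keyword argument names as well as the Fortran module name (ESMF versus the older ESMF_Mod) have shifted between releases, so treat this as a schematic of recent ESMF conventions rather than version-exact code.

! Schematic ESMF driver: assemble two gridded components and one coupler.
! Component, module, and state names are hypothetical placeholders.
program esmf_driver_sketch
  use ESMF                              ! older releases: use ESMF_Mod
  use atm_mod, only: ATM_SetServices    ! hypothetical user components
  use ocn_mod, only: OCN_SetServices
  use cpl_mod, only: CPL_SetServices
  implicit none

  type(ESMF_GridComp)     :: atmComp, ocnComp
  type(ESMF_CplComp)      :: cplComp
  type(ESMF_State)        :: atmExp, ocnImp
  type(ESMF_Clock)        :: clock
  type(ESMF_Time)         :: startTime, stopTime
  type(ESMF_TimeInterval) :: timeStep
  integer :: rc

  call ESMF_Initialize(defaultCalKind=ESMF_CALKIND_GREGORIAN, rc=rc)

  ! Superstructure: create components and register their entry points
  atmComp = ESMF_GridCompCreate(name="ATM", rc=rc)
  ocnComp = ESMF_GridCompCreate(name="OCN", rc=rc)
  cplComp = ESMF_CplCompCreate(name="ATM to OCN coupler", rc=rc)
  call ESMF_GridCompSetServices(atmComp, userRoutine=ATM_SetServices, rc=rc)
  call ESMF_GridCompSetServices(ocnComp, userRoutine=OCN_SetServices, rc=rc)
  call ESMF_CplCompSetServices(cplComp, userRoutine=CPL_SetServices, rc=rc)

  ! States carry the fields exchanged between components
  atmExp = ESMF_StateCreate(name="ATM export", rc=rc)
  ocnImp = ESMF_StateCreate(name="OCN import", rc=rc)

  ! Infrastructure: a clock drives the run sequence
  call ESMF_TimeSet(startTime, yy=2010, mm=12, dd=15, rc=rc)
  call ESMF_TimeSet(stopTime,  yy=2010, mm=12, dd=17, rc=rc)
  call ESMF_TimeIntervalSet(timeStep, h=6, rc=rc)
  clock = ESMF_ClockCreate(timeStep=timeStep, startTime=startTime, &
                           stopTime=stopTime, rc=rc)

  call ESMF_GridCompInitialize(atmComp, exportState=atmExp, clock=clock, rc=rc)
  call ESMF_GridCompInitialize(ocnComp, importState=ocnImp, clock=clock, rc=rc)

  ! Run sequence: atm produces exports, coupler remaps them, ocean consumes
  do while (.not. ESMF_ClockIsStopTime(clock, rc=rc))
     call ESMF_GridCompRun(atmComp, exportState=atmExp, clock=clock, rc=rc)
     call ESMF_CplCompRun(cplComp, importState=atmExp, exportState=ocnImp, &
                          clock=clock, rc=rc)
     call ESMF_GridCompRun(ocnComp, importState=ocnImp, clock=clock, rc=rc)
     call ESMF_ClockAdvance(clock, rc=rc)
  end do

  call ESMF_GridCompFinalize(atmComp, clock=clock, rc=rc)
  call ESMF_GridCompFinalize(ocnComp, clock=clock, rc=rc)
  call ESMF_Finalize(rc=rc)
end program esmf_driver_sketch

In a real application each SetServices routine is supplied by the corresponding model team, which is what keeps the driver independent of the science code.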
Standard Interfaces
All ESMF components have the same three standard methods: Initialize, Run, and Finalize. Each standard method has the same simple interface:
call ESMF_GridCompRun(myComp, importState, exportState, clock, …)
where:
• myComp points to the component
• importState is a structure containing input fields
• exportState is a structure containing output fields
• clock contains timestepping information
Steps to adopting ESMF:
1. Divide the application into components (without ESMF)
2. Copy or reference component input and output data into ESMF data structures
3. Register components with ESMF
4. Set up ESMF couplers for data exchange
Interfaces are wrappers and can often be set up in a non-intrusive way, as in the component sketch below.
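To make the three standard methods concrete, the sketch below shows the user side of a gridded component: a SetServices routine that registers Initialize, Run, and Finalize entry points, each with the standard argument list shown above. The atm_mod module, the ATM_* routine names, and the my_model_step call are hypothetical stand-ins for native model code; the ESMF_METHOD_* phase flags follow recent ESMF releases (older releases used ESMF_SETINIT, ESMF_SETRUN, and ESMF_SETFINAL).

! Sketch of the user side of a gridded component: the hypothetical
! atm_mod module wraps native model code behind the standard methods.
module atm_mod
  use ESMF                              ! older releases: use ESMF_Mod
  implicit none
  private
  public :: ATM_SetServices
contains

  ! Registration routine passed to ESMF_GridCompSetServices by the driver
  subroutine ATM_SetServices(gcomp, rc)
    type(ESMF_GridComp)  :: gcomp
    integer, intent(out) :: rc
    call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_INITIALIZE, &
                                    userRoutine=ATM_Init, rc=rc)
    call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_RUN, &
                                    userRoutine=ATM_Run, rc=rc)
    call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_FINALIZE, &
                                    userRoutine=ATM_Final, rc=rc)
  end subroutine ATM_SetServices

  ! All three standard methods share this argument list
  subroutine ATM_Init(gcomp, importState, exportState, clock, rc)
    type(ESMF_GridComp)  :: gcomp
    type(ESMF_State)     :: importState, exportState
    type(ESMF_Clock)     :: clock
    integer, intent(out) :: rc
    ! Create ESMF Fields that copy or reference native model arrays and
    ! add them to the export State here (omitted in this sketch).
    rc = ESMF_SUCCESS
  end subroutine ATM_Init

  subroutine ATM_Run(gcomp, importState, exportState, clock, rc)
    type(ESMF_GridComp)  :: gcomp
    type(ESMF_State)     :: importState, exportState
    type(ESMF_Clock)     :: clock
    integer, intent(out) :: rc
    ! Read imports, advance the native model one coupling interval,
    ! and refresh exports; my_model_step is a hypothetical native call.
    ! call my_model_step()
    rc = ESMF_SUCCESS
  end subroutine ATM_Run

  subroutine ATM_Final(gcomp, importState, exportState, clock, rc)
    type(ESMF_GridComp)  :: gcomp
    type(ESMF_State)     :: importState, exportState
    type(ESMF_Clock)     :: clock
    integer, intent(out) :: rc
    rc = ESMF_SUCCESS        ! release native model resources here
  end subroutine ATM_Final

end module atm_mod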
Data Representation Options
1. Representation in index space (Arrays)
• One or more tiles store indices and topology
• Sparse matrix multiply for remapping with user-supplied interpolation weights
• Highly scalable: no global information held locally; uses a distributed directory approach (Devine 2002) to access randomly distributed objects in an efficient, scalable way
2. Representation in physical space (Fields)
• Built on Arrays plus some form of Grid
• Grids may be logically rectangular, unstructured meshes, or observational data
• Remapping using parallel interpolation weight generation: bilinear, higher order, or first-order conservative (see the sketch after this list)
[Figure: supported Array distributions]
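As a hedged illustration of the Field path described above, the sketch below builds two Fields on simple logically rectangular Grids, computes bilinear interpolation weights once into a RouteHandle, and then applies them as a sparse matrix multiply. The grid sizes and field names are arbitrary, the coordinate values a real regrid needs are omitted, and the create calls and keyword names follow recent ESMF releases.

! Sketch: remap a field between two grids via precomputed weights.
subroutine regrid_sketch(rc)
  use ESMF
  implicit none
  integer, intent(out) :: rc

  type(ESMF_Grid)        :: srcGrid, dstGrid
  type(ESMF_Field)       :: srcField, dstField
  type(ESMF_RouteHandle) :: rh

  ! Two single-tile, logically rectangular grids (sizes are arbitrary)
  srcGrid = ESMF_GridCreateNoPeriDim(maxIndex=(/180, 90/), rc=rc)
  dstGrid = ESMF_GridCreateNoPeriDim(maxIndex=(/360, 180/), rc=rc)
  call ESMF_GridAddCoord(srcGrid, rc=rc)
  call ESMF_GridAddCoord(dstGrid, rc=rc)
  ! Coordinate values would be filled in here (omitted in this sketch)

  ! Fields = Array data plus a Grid description
  srcField = ESMF_FieldCreate(srcGrid, typekind=ESMF_TYPEKIND_R8, &
                              name="sst_src", rc=rc)
  dstField = ESMF_FieldCreate(dstGrid, typekind=ESMF_TYPEKIND_R8, &
                              name="sst_dst", rc=rc)

  ! Compute interpolation weights once (bilinear here); store as RouteHandle
  call ESMF_FieldRegridStore(srcField=srcField, dstField=dstField, &
                             regridmethod=ESMF_REGRIDMETHOD_BILINEAR, &
                             routehandle=rh, rc=rc)

  ! Apply the weights (parallel sparse matrix multiply)
  call ESMF_FieldRegrid(srcField, dstField, routehandle=rh, rc=rc)
end subroutine regrid_sketch

The same RouteHandle can be reused every timestep, which is what makes the store/apply split worthwhile.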
Coupling Options
Lots of flexibility in coupling approaches:
• Single executable
• Multiple executable options
  • Array send/recv with the InterComm package (PVM)
  • Web service option
  • Other options
• Coupling communications can be called either from within a coupler or directly from a gridded component – useful when it is inconvenient to return from a component in order to perform a coupling operation (see the coupler sketch below)
• Recursive components for nesting higher-resolution regions
• Ensemble management with either concurrent or sequential execution of ensemble members
[Diagram: single-executable coupling, with a Coupler between Comp A and Comp B, versus multiple-executable Array send/recv between Comp A and Comp B; contributed by U Maryland]
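For the bullet about where coupling communications can be called, here is a hedged sketch of a coupler Run method: it pulls the exchanged fields out of the import and export States by name and applies regrid weights assumed to have been stored during the coupler's Initialize phase. The module name, the "sst" item name, and the placement of the RouteHandle are illustrative; the same ESMF_FieldRegrid call could instead be made directly from a gridded component.

! Sketch of a coupler Run method that applies previously stored
! regrid weights; names are hypothetical.
module cpl_run_sketch
  use ESMF
  implicit none
  type(ESMF_RouteHandle) :: rh   ! filled by ESMF_FieldRegridStore in Init
contains
  subroutine CPL_Run(ccomp, importState, exportState, clock, rc)
    type(ESMF_CplComp)   :: ccomp
    type(ESMF_State)     :: importState, exportState
    type(ESMF_Clock)     :: clock
    integer, intent(out) :: rc
    type(ESMF_Field)     :: srcField, dstField

    ! Pull the exchanged fields out of the import/export States by name
    call ESMF_StateGet(importState, itemName="sst", field=srcField, rc=rc)
    call ESMF_StateGet(exportState, itemName="sst", field=dstField, rc=rc)

    ! Apply the stored weights; the same call could equally be made
    ! from inside a gridded component's Run method
    call ESMF_FieldRegrid(srcField, dstField, routehandle=rh, rc=rc)
  end subroutine CPL_Run
end module cpl_run_sketch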
Metadata Handling and Usage
• Metadata is broken down into name/value pairs by the Attribute class (see the sketch below)
  • Can be attached at any level of the ESMF object hierarchy
  • Documents data provenance to encourage self-describing models
  • Can automate some aspects of model execution and coupling – we are actively exploring this direction with workflows and web services
• Standard metadata is organized into Attribute packages
  • Used to aggregate, store, and output model metadata
  • Can be nested, distributed, and expanded to suit specific needs
  • Designed around accepted metadata standards and emerging conventions:
    • Climate and Forecast (CF) conventions
    • ISO standards
    • METAFOR Common Information Model (CIM)
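For illustration, the sketch below attaches a couple of CF-style name/value pairs to a Field with the Attribute class and reads one back. The field and attribute names are arbitrary examples; the standard Attribute packages (CF, ISO, CIM) are populated through additional calls not shown here.

! Sketch: attach simple name/value metadata to a Field and read it back.
subroutine attribute_sketch(field, rc)
  use ESMF
  implicit none
  type(ESMF_Field)     :: field      ! e.g. a surface pressure field
  integer, intent(out) :: rc
  character(len=64)    :: units

  ! CF-style name/value pairs (names here are illustrative)
  call ESMF_AttributeSet(field, name="standard_name", &
                         value="surface_air_pressure", rc=rc)
  call ESMF_AttributeSet(field, name="units", value="Pa", rc=rc)

  ! Attributes travel with the object and can be retrieved anywhere
  call ESMF_AttributeGet(field, name="units", value=units, rc=rc)
end subroutine attribute_sketch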
Summary of Features
• Fast parallel remapping: unstructured or logically rectangular grids, 2D and 3D, using bilinear, higher order, or conservative methods; integrated (at runtime) or offline (from files)
• Multiple strategies for support of nested grids
• Core methods are scalable to tens of thousands of processors
• Supports hybrid (threaded/distributed) programming for optimal performance on many computer architectures
• Multiple coupling and execution modes for flexibility
• Time management utility with many calendars (Gregorian, 360-day, no-leap, Julian day, etc.), forward/reverse time operations, alarms, and other features (see the sketch below)
• Metadata utility supports emerging standards in a flexible and useful way
• Runs on 25+ platform/compiler combinations, with an exhaustive test suite and documentation
• Couples Fortran and/or C-based model components
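As one concrete example from the list above, the time management utility in a few lines: a Gregorian clock is built from a start time, stop time, and step, and a daily alarm is attached to it, for instance to trigger history output. Keyword names and the ESMF_CALKIND_GREGORIAN flag follow recent ESMF releases (older releases used ESMF_CAL_GREGORIAN); the alarm's purpose here is purely illustrative.

! Sketch: a Gregorian clock with a daily alarm.
subroutine timemgr_sketch(rc)
  use ESMF
  implicit none
  integer, intent(out) :: rc

  type(ESMF_Clock)        :: clock
  type(ESMF_Alarm)        :: dailyAlarm
  type(ESMF_Time)         :: startTime, stopTime
  type(ESMF_TimeInterval) :: timeStep, oneDay
  type(ESMF_Calendar)     :: gregorian

  gregorian = ESMF_CalendarCreate(ESMF_CALKIND_GREGORIAN, rc=rc)
  call ESMF_TimeSet(startTime, yy=2010, mm=12, dd=15, calendar=gregorian, rc=rc)
  call ESMF_TimeSet(stopTime,  yy=2010, mm=12, dd=17, calendar=gregorian, rc=rc)
  call ESMF_TimeIntervalSet(timeStep, h=6, rc=rc)
  call ESMF_TimeIntervalSet(oneDay, d=1, rc=rc)

  clock = ESMF_ClockCreate(timeStep=timeStep, startTime=startTime, &
                           stopTime=stopTime, rc=rc)
  ! Alarm rings once per simulated day, e.g. to trigger history output
  dailyAlarm = ESMF_AlarmCreate(clock=clock, ringTime=startTime+oneDay, &
                                ringInterval=oneDay, rc=rc)

  do while (.not. ESMF_ClockIsStopTime(clock, rc=rc))
     call ESMF_ClockAdvance(clock, rc=rc)
     if (ESMF_AlarmIsRinging(dailyAlarm, rc=rc)) then
        ! ... write history files, etc. ...
        call ESMF_AlarmRingerOff(dailyAlarm, rc=rc)
     end if
  end do
end subroutine timemgr_sketch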
Class Structure
Superstructure:
• GridComp – land, ocean, atmosphere, … models
• CplComp – transfers between GridComps
• State – data imported or exported
Infrastructure, data classes:
• FieldBundle – collection of Fields
• Field – physical field, e.g. pressure
• Grid, LocStream, Mesh – logically rectangular grids, unstructured meshes, location streams, etc.
• XGrid – exchange grid
• Array – hybrid F90/C++ arrays
• DistGrid – grid decomposition
Infrastructure, communication classes:
• Regrid – computes interpolation weights
• Route – stores communication paths
• DELayout – layout of decomposition elements
Utilities: Machine, TimeMgr, LogErr, I/O, Config, Attributes, etc.
(The class hierarchy spans an F90 user interface and a C++ implementation layer.)
Component Overhead
[Chart: overhead of the ESMF-wrapped versus native CCSM4 component]
• For this example, ESMF wrapping required no code changes to scientific modules
• No significant performance overhead (< 3% is typical)
• Few code changes are needed for codes that are already modular
• Platform: IBM Power 575 (bluefire) at NCAR
• Model: Community Climate System Model (CCSM)
• Versions: CCSM_4_0_0_beta42 and ESMF_5_0_0_beta_snapshot_01
• Resolution: 1.25 degree x 0.9 degree global grid with 17 vertical levels for both the atmosphere and land models (288x192x17); the data resolution for the ocean model is 320x384x60
Remapping Performance
• All ESMF interpolation weights are generated with the unstructured mesh
  • Increases flexibility with 2D and 3D grids
  • Adds overhead to bilinear interpolation
  • Greatly improves performance over existing conservative methods
• ESMF parallel conservative remapping is scalable and accurate; bilinear could use additional optimization
• Platform: Cray XT4 (jaguar) at ORNL
• Version: ESMF_5_2_0_beta_snapshot_07
• Resolutions:
  • fv0.47x0.63: CAM Finite Volume grid, 576x384
  • ne60np4: 0.5 degree cubed-sphere grid, 180x180x6
A Common Model Architecture
• The US Earth system modeling community is converging on a common modeling architecture
• Atmosphere, ocean, sea ice, land, wave, and other models are ESMF or ESMF-like components called by a top-level driver or coupler
• Many models are componentizing further
Features and benefits:
• Interoperability promotes code reuse and cross-agency collaboration
• Portable, fast, fully featured toolkits enhance capability
• Automatic compliance checking for ease of adoption
ESMF-enabled systems include:
• Navy Coupled Ocean/Atmosphere Mesoscale Prediction System / Wavewatch III
• Navy Operational Global Atmospheric Prediction System
• Hybrid Coordinate Ocean Model – CICE sea ice
• NASA GEOS-5 Atmospheric General Circulation Model
• NOAA National Environmental Modeling System
• NOAA GFDL MOM4 ocean
• NCAR Community Earth System Model
• Weather Research and Forecast Model
• HAF kinematic solar wind – GAIM ionosphere
• pWASH123 watershed – ADCIRC storm surge model
ESMF Model Map 2010
[Diagram: map of ESMF components (thin lines) and models (thick lines) at the working prototype level or beyond, color-coded by sponsor (NOAA, Department of Defense, NASA, Department of Energy, National Science Foundation, university) and marking where ESMF coupling is complete. Systems shown include NEMS with GFS and NMM-B (dynamics, physics, I/O, history), CCSM4/CESM (CAM atmosphere, POP ocean, CICE ice, CLM land), GEOS-5 and its many sub-components (FV and cubed-sphere dycores, physics, radiation, chemistry including GOCART, land, catchment, and ocean), NOGAPS with stratospheric and parameterized chemistry, COAMPS, HYCOM, NCOM, MOM4, WWIII, SWAN, FIM, WRF, the Land Information System, GSI, HAF, GAIM, pWASH123, and ADCIRC.]
• Increasingly, models in the U.S. follow a common architecture
• Atmosphere, ocean, sea ice, land, and/or wave models are components called by a top-level driver/coupler
• Components use ESMF or ESMF-like interfaces
• Many major U.S. weather and climate models either follow this architecture (CCSM/CESM, COAMPS, NEMS), want to follow it for future coupled systems (NOGAPS), or have a different style of driver but could provide components to this architecture (GEOS-5, FMS)
Even non-ESMF codes now look like ESMF:
• ESMF: ESMF_GridCompRun(gridcomp, importState, exportState, clock, phase, blockingFlag, rc)
• CESM (non-ESMF version): atm_run_mct(clock, gridcomp, importState, exportState) – argument names changed to show the equivalence
NUOPC Layer: Goals
• The National Unified Operational Prediction Capability (NUOPC) is a consortium of operational weather prediction centers
• Standardize the implementation of ESMF across NASA, NOAA, Navy, and other model applications
• Demonstrate an improved level of interoperability
• Specific goals are described in the NUOPC Common Model Architecture (CMA) Committee report
  • CMA report: http://www.weather.gov/nuopc/CMA_Final_Report_1%20Oct%2009_baseline.pdf?
  • NUOPC website: http://www.weather.gov/nuopc
NUOPC Layer: Products
• Software templates to guide development of a common architecture for components and couplers
• A software layer to narrow the scope of ESMF interfaces
• NUOPC Compliance Checker software (initial implementation available with ESMF_5_1_0)
• Comprehensive tutorial materials
• Websites, repositories, trackers, and other collaborative tools
• NUOPC Layer guidance documents (posted on the ESMF website)
  • ESMF: www.earthsystemmodeling.org
Other ESMF-Related Projects
• Earth System Curator (sponsors: NSF/NASA/NOAA)
  • Implementation of the METAFOR Common Information Model in the Earth System Grid (ESG) portal for the 5th Coupled Model Intercomparison Project
  • Using ESMF Attributes to generate the METAFOR CIM schema directly from models
  • Atmosphere/hydrological model coupling using OpenMI and ESMF web services
• Earth Science Gateway on the TeraGrid (sponsor: NSF)
  • End-to-end self-documented workflows from web-based model configuration to data archival, with Purdue and NCAR
• Global Interoperability Program (sponsor: NOAA)
  • Support for projects involved with interoperability, infrastructure development, and global modeling education