
Status Review of NOAA Implementation of ESMF Architecture (i.e., NEMS)
Mark Iredell, May 2008






Presentation Transcript


  1. Status Review of NOAA Implementation of ESMF Architecture (i.e., NEMS)
     Mark Iredell, May 2008
     • NEMS People
     • NEMS Projects
     • NEMS Issues

  2. NEMS People: biweekly UMIG (Unified Modeling Infrastructure Group) meeting
     • Tom Black: Atm, NAM
     • Huiya Chuang: Post
     • Ed Colon: Infrastructure
     • Mike Ek: Land
     • Jim Geiger: Land (NASA)
     • Bob Grumbine: Sea ice
     • Henry Juang: Atm, Dyn, GFS
     • Young Kwon: Hurricane
     • Sujay Kumar: Land (NASA)
     • Sarah Lu: AQ
     • Avichal Mehra: Ocean
     • S. Moorthi: Atm, Phy, GFS
     • Ken Mitchell: Land
     • Christa Peters: Land (NASA)
     • Youhua Tang: AQ
     • Hendrik Tolman: Wave
     • Ratko Vasic: Infrastructure
     • Jun Wang: Infrastructure
     • Xingren Wu: Ice, Coupling
     • Weiyu Yang: Infrastructure
     • Mike Young: Infrastructure
     • Shujia Zhou: Land (NASA)
     • Yanqiu Zhu: GSI

  3. NEMS People: monthly UMIT (Unified Modeling Infrastructure Telecon) meeting
     Includes the local UMIG group plus:
     • V. Balaji: GFDL
     • Brian Gross: GFDL
     • Niki Zadeh: GFDL
     • Stan Benjamin: GSD
     • Tom Henderson: GSD
     • Jin Lee: GSD
     • Jacques Middlecoff: GSD
     Occasionally the telecon invites a much wider group, including DTC and the Navy.

  4. NEMS Projects
     • ESMF
     • NEMS atmosphere
     • Write history and Post processor
     • Nesting
     • Aerosols and Chemistry
     • Land
     • Ocean, waves and sea ice
     • Ionosphere
     • Ensemble
     • Data assimilation

  5. ESMF
     • Earth System Modeling Framework
     • Community effort, partially supported by NOAA
     • Proper ESMF Superstructure required for all NEMS components
     • ESMF Infrastructure optional
     • NEMS will require ESMF 3.1.0r at this time

  6. An ESMF "Component"
     [Diagram: data flow in an ESMF Component's Run method. A "parent" driver (user code) passes an Import State in and receives an Export State back; the ESMF library code exposes SetServices(), Initialize(), Run(), and Finalize(); the component maintains its own Internal State.]

  7. ESMF Component rules
     • Only SetServices is public; Initialize, Run, and Finalize are private but are exposed through SetServices (a sketch follows this slide).
     • The Import State is input; the Export State is output. Both should be fully described with ESMF metadata (names, grid, decomposition, etc.).
     • A Coupler Component likely has Import and Export States on different grids. A Coupler Component is user code.
     • The Internal State is private to the component but persists across calls to Initialize, Run, and Finalize. Even other instances of the same component will have their own private internal state. An access exception may be made for other "friendly" components.
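As a concrete illustration of these rules, here is a minimal sketch of a component module whose only public entry point is SetServices, which registers the private phase routines. It assumes a recent ESMF Fortran API (module name ESMF and constants such as ESMF_METHOD_INITIALIZE; ESMF 3.1 used ESMF_Mod and different constant names), and the component and routine names are hypothetical:

    module nems_example_comp               ! hypothetical component module name
      use ESMF                             ! recent releases; ESMF 3.1 used "use ESMF_Mod"
      implicit none
      private
      public :: SetServices                ! the only public entry point
    contains

      subroutine SetServices(gcomp, rc)
        type(ESMF_GridComp)  :: gcomp
        integer, intent(out) :: rc
        ! Expose the private phase routines to the framework.
        call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_INITIALIZE, my_init,  rc=rc)
        call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_RUN,        my_run,   rc=rc)
        call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_FINALIZE,   my_final, rc=rc)
      end subroutine SetServices

      subroutine my_init(gcomp, importState, exportState, clock, rc)
        type(ESMF_GridComp)  :: gcomp
        type(ESMF_State)     :: importState, exportState
        type(ESMF_Clock)     :: clock
        integer, intent(out) :: rc
        rc = ESMF_SUCCESS                  ! create and describe the Internal State here
      end subroutine my_init

      subroutine my_run(gcomp, importState, exportState, clock, rc)
        type(ESMF_GridComp)  :: gcomp
        type(ESMF_State)     :: importState, exportState
        type(ESMF_Clock)     :: clock
        integer, intent(out) :: rc
        rc = ESMF_SUCCESS                  ! read the Import State, advance, fill the Export State
      end subroutine my_run

      subroutine my_final(gcomp, importState, exportState, clock, rc)
        type(ESMF_GridComp)  :: gcomp
        type(ESMF_State)     :: importState, exportState
        type(ESMF_Clock)     :: clock
        integer, intent(out) :: rc
        rc = ESMF_SUCCESS                  ! release the Internal State
      end subroutine my_final

    end module nems_example_comp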

  8. ESMF Infrastructure capabilities
     ESMF offers, or will offer, capabilities for decomposed halo updates and transposes as well as extensive parallel regridding tools (among other capabilities). The user is free to make use of these or not. Thus far NEMS has taken advantage of existing in-house capability and has made minimal use of these ESMF capabilities. This is expected in such transition projects, and ESMF capability will be used as needed (a regridding sketch follows).
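For example, a hedged sketch of the parallel regridding capability as it appears in recent ESMF releases (the ESMF 3.1-era interface differed, and the two fields are assumed to be existing ESMF_Fields on an ocean grid and an atmosphere grid; the names are hypothetical):

    ! Precompute interpolation weights once, then apply them each coupling step.
    ! sstOnOceanGrid and sstOnAtmGrid are assumed to be already-created ESMF_Fields.
    type(ESMF_RouteHandle) :: rh
    integer :: rc

    call ESMF_FieldRegridStore(srcField=sstOnOceanGrid, dstField=sstOnAtmGrid, &
                               regridmethod=ESMF_REGRIDMETHOD_BILINEAR,        &
                               routehandle=rh, rc=rc)
    call ESMF_FieldRegrid(srcField=sstOnOceanGrid, dstField=sstOnAtmGrid, &
                          routehandle=rh, rc=rc)
    call ESMF_FieldRegridRelease(routehandle=rh, rc=rc)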

  9. ESMF Portability, Conventions
     • ESMF is designed to run on most systems that Earth modeling runs on. In addition, the ESMF infrastructure will support some ordinary functions that otherwise often hinder portability.
     • ESMF supports but does not require the Climate-Forecast (CF) metadata conventions. Should NEMS adopt them?
       - Pro: a wide community uses them, including NOAA labs.
       - Con: very long names, but we can make local aliases (see the sketch below).
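A minimal sketch of the "local alias" idea: keep the long CF standard names in file metadata but refer to them in code through short in-house (NCEP GRIB style) names. The module and this particular alias table are hypothetical, though the CF standard names themselves are real:

    module cf_alias_sketch
      ! Hypothetical alias table pairing short in-house (NCEP GRIB style) names
      ! with the corresponding CF standard names.
      implicit none
      character(len=8),  parameter :: short_name(3) = &
           [ character(len=8)  :: 'tmp', 'spfh', 'ugrd' ]
      character(len=64), parameter :: cf_name(3) = &
           [ character(len=64) :: 'air_temperature', 'specific_humidity', 'eastward_wind' ]
    contains
      function cf_standard_name(alias) result(name)
        ! Return the CF standard name for a short alias (blank if unknown).
        character(len=*), intent(in) :: alias
        character(len=64)            :: name
        integer :: i
        name = ' '
        do i = 1, size(short_name)
          if (trim(short_name(i)) == trim(alias)) name = cf_name(i)
        end do
      end function cf_standard_name
    end module cf_alias_sketch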

  10. CF convention examples from http://cf-pcmdi.llnl.gov/documents/cf-standard-names/ncep-grib-code-cf-standard-name-mapping

  11. NEMS Component Library
     • Create a NEMS component library (in SVN).
     • Each component will have some documentation. Moreover, each component should have a sample MAIN to run it in stand-alone mode for testing; for instance, a Dynamics component might use a Held-Suarez test and a Physics component a single-column model (a sketch of such a driver follows).
     • Each type of component will have an assigned NCEP librarian to coordinate the sub-library.
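A hedged sketch of what such a stand-alone test MAIN might look like, reusing the hypothetical component module sketched earlier and assuming a recent ESMF API (the ESMF 3.1 create and clock calls differ in detail); none of this is an existing NEMS driver:

    program test_dyn_standalone
      ! Hypothetical stand-alone driver for one NEMS component (e.g. a Dynamics
      ! component configured for a Held-Suarez test).
      use ESMF
      use nems_example_comp, only: SetServices   ! hypothetical component module
      implicit none
      type(ESMF_GridComp)     :: comp
      type(ESMF_State)        :: importState, exportState
      type(ESMF_Clock)        :: clock
      type(ESMF_TimeInterval) :: dt
      type(ESMF_Time)         :: startTime, stopTime
      integer                 :: rc

      call ESMF_Initialize(rc=rc)

      comp        = ESMF_GridCompCreate(name="dyn_standalone", rc=rc)
      importState = ESMF_StateCreate(name="dyn import", rc=rc)
      exportState = ESMF_StateCreate(name="dyn export", rc=rc)

      call ESMF_TimeIntervalSet(dt, s=600, rc=rc)                 ! 10-minute step
      call ESMF_TimeSet(startTime, yy=2008, mm=5, dd=1, rc=rc)
      call ESMF_TimeSet(stopTime,  yy=2008, mm=5, dd=2, rc=rc)
      clock = ESMF_ClockCreate(timeStep=dt, startTime=startTime, stopTime=stopTime, rc=rc)

      call ESMF_GridCompSetServices(comp, SetServices, rc=rc)
      call ESMF_GridCompInitialize(comp, importState=importState, exportState=exportState, &
                                   clock=clock, rc=rc)
      do while (.not. ESMF_ClockIsStopTime(clock, rc=rc))
        call ESMF_GridCompRun(comp, importState=importState, exportState=exportState, &
                              clock=clock, rc=rc)
        call ESMF_ClockAdvance(clock, rc=rc)
      end do
      call ESMF_GridCompFinalize(comp, importState=importState, exportState=exportState, &
                                 clock=clock, rc=rc)

      call ESMF_Finalize(rc=rc)
    end program test_dyn_standalone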

  12. NEMS Atmosphere
     [Diagram: a unified Atmosphere component (including a digital filter) contains Dynamics and Physics component classes joined by a Dyn-Phy Coupler (regrid, redistribute, change variables, average, etc.). Dynamics instances shown include NMM-B, Spectral GFS, FIM, ARW, FVCS, FISL, NOGAPS, and COAMPS; Physics instances include NAM Phy, GFS Phy, GFDL Phy, Simple, WRF Phy, and Navy Phy. The color key distinguishes component classes, coupler classes, completed instances, instances under development, and future development.]
     • The goal is one unified atmospheric component that can invoke multiple dynamics and physics packages.
     • At this time, dynamics and physics run on the same grid in the same decomposition, so the coupler literally is very simple (a sketch follows).
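A hedged illustration of why that coupler can be so simple when both sides share a grid and decomposition: its Run phase just copies each field from the Dynamics export state into the Physics import state. The routine, state contents, and field name below are hypothetical, and the ESMF calls assume a recent API:

    subroutine dyn_phy_cpl_run(cplComp, importState, exportState, clock, rc)
      ! Hypothetical Dyn-Phy coupler Run phase: both sides use the same grid and
      ! decomposition, so each field is copied pointwise with no regridding.
      use ESMF
      implicit none
      type(ESMF_CplComp)   :: cplComp
      type(ESMF_State)     :: importState, exportState   ! dyn export / phy import
      type(ESMF_Clock)     :: clock
      integer, intent(out) :: rc
      type(ESMF_Field)            :: srcField, dstField
      real(ESMF_KIND_R8), pointer :: src(:,:), dst(:,:)

      ! "temperature" is a placeholder item name; a real coupler would loop over
      ! every field the Physics import state requires.
      call ESMF_StateGet(importState, itemName="temperature", field=srcField, rc=rc)
      call ESMF_StateGet(exportState, itemName="temperature", field=dstField, rc=rc)
      call ESMF_FieldGet(srcField, farrayPtr=src, rc=rc)
      call ESMF_FieldGet(dstField, farrayPtr=dst, rc=rc)
      dst = src                          ! local copy; no interpolation or redistribution
      rc  = ESMF_SUCCESS
    end subroutine dyn_phy_cpl_run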

  13. Original FIM run() Call Order
     FIM run:
       do timestep = first, last
         call dyn_1()      ! 1st half of dyn calls
         call physics()
         call dyn_2()      ! 2nd half of dyn calls
       end do
     • Must re-order operations without changing model results.
     Slide courtesy Stan Benjamin

  14. FIM run() Call Re-Ordering
     Re-order the time stepping loop:
       do timestep = first, last+1
         if (timestep > first) &
           call dyn_2(timestep-1)
         if (timestep <= last) then
           call dyn_1(timestep)
           call physics(timestep)
         endif
       end do
     Slide courtesy Stan Benjamin

  15. New FIM run() Call Order
     • Combine dyn_1 + dyn_2 into dyn_run and push the "if" statements inside.
     • Push physics() and its "if" statement into phy_run.
       do timestep = first, last+1
         call dyn_run()
         call phy_run()
       end do
     MI: Note that NMM and GFS have gone through the exact same process.
     Slide courtesy Stan Benjamin

  16. Write history and Post processor
     [Diagram: within the unified Atmosphere component, Dynamics and Physics feed one or more Write components, which produce a binary file or NEMSIO output read by the Post component. The color key is as before.]
     • The Write component transfers state from the model tasks to "quilt" tasks and then writes out from there.
     • NEMSIO is intended to be a unified, optimized, parallelized I/O package that can write several formats. Ideally, NEMSIO would be used for all NEMS output data and metadata (not necessarily restart data).
     • The unified post-processor NCEP_POST will run on the quilt tasks.
     • Perhaps in the future, the Dynamics and Physics will write their own history files.

  17. Nesting
     [Diagram: Atmosphere components nested within Atmosphere components. Instances shown include 1-way concurrent NAM-NAM, 1-way concurrent GFS-NAM, 2-way sequential, 2-way concurrent, and moving nests. The color key is as before.]
     • The parent creates its children and creates proper boundary conditions in their import states.
     • Recursively, children create grandchildren.
     • Children run on different tasks from the parent in concurrent nesting, and on the same tasks in sequential nesting.
     • Two-way concurrent nesting would require a different time integration scheme that needs to be tested.

  18. NEMS Nesting
     • Eventually we will have all forms:
       ► One-way / Two-way
       ► Static / Moving
       ► Grid-associated / Not grid-associated
     • Begin with 1-way, grid-associated, static nests:
       ► The parent domain can have any number of children.
       ► Telescoping: children can have any number of children.
       ► Domains can run concurrently on unique sets of processors.
     These three criteria require general and repeated splitting of the MPI communicators. (Done; a sketch follows.)
     Slide courtesy Tom Black
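A minimal sketch of that communicator splitting, assuming each task knows which nest domain it computes; the domain assignment and program name are hypothetical, not the actual NEMS code:

    program split_for_nests
      ! Minimal sketch: give every nest domain its own MPI communicator by
      ! splitting the world communicator; repeatable for telescoping nests.
      use mpi
      implicit none
      integer :: world_rank, world_size, my_domain_id, domain_comm, ierr

      call MPI_Init(ierr)
      call MPI_Comm_rank(MPI_COMM_WORLD, world_rank, ierr)
      call MPI_Comm_size(MPI_COMM_WORLD, world_size, ierr)

      ! Hypothetical assignment: the first half of the tasks compute the parent
      ! domain (id 0) and the second half compute one child domain (id 1).
      my_domain_id = merge(0, 1, world_rank < world_size/2)

      ! Tasks that share a "color" (the domain id) land in the same new
      ! communicator; the key (world rank) sets their rank order within it.
      ! Repeating the call on domain_comm splits a child's tasks again for
      ! telescoping (grandchild) domains.
      call MPI_Comm_split(MPI_COMM_WORLD, my_domain_id, world_rank, domain_comm, ierr)

      call MPI_Comm_free(domain_comm, ierr)
      call MPI_Finalize(ierr)
    end program split_for_nests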

  19. Nesting – ATM_RUN
     ATM_RUN(Fcst_time)
       DO timeloop over Fcst_time at interval Δt_internal        ! "mini" timeloop
         IF (My_Compute_Tasks) PREDICT(Δt_internal)
         IF (N_CHILDREN > 0) THEN
           DO N = 1, N_CHILDREN
             IF (My_Compute_Tasks)        ISend BCs to child N
             IF (Child_Compute_Tasks(N))  Recv BCs from parent
             CALL ATM_RUN(Child(N), Fcst_time=Δt_internal)       ! recursive call
           ENDDO
           IF (Two-way) Send children's state back to me and blend into mine
         ENDIF
       ENDDO
     END ATM_RUN
     Slide courtesy Tom Black

  20. Aerosols and Chemistry
     [Diagram: the Atmosphere component contains Dynamics, Physics, and an Aerosols component class with instances GOCART, CBM-5 reduced chemistry, and WRF-chem. The color key is as before.]
     • GOCART is the Goddard aerosol model. CBM is a VOC-NOx-O3 mechanism (the chemistry in the CMAQ model).
     • Aerosol and chemistry components compute sources, sinks, and chemical transformation.
     • Aerosol and chemistry components may need to expose their internal state to avoid memory copies.
     • The Atmosphere must be ready to do convective fluxes and removal as well as advection and diffusion.

  21. Land (alternative 0)
     [Diagram: the Atmosphere component contains Dynamics, Physics, and a Land component class with instances LIS-Noah and others. The color key is as before.]
     • Land is responsible for computing surface fluxes.
     • Land may need to be invoked inside Physics, since Land needs radiation and the surface layer needs Land.
     • Implicit solving would require iteration between Land and PBL/Moist.
     • How deep should ESMF go?

  22. Land (alternative 0b)
     [Diagram: a Main driver contains the Atmosphere component (Dynamics, Physics) and a separate Land component (LIS-Noah, others). The color key is as before.]
     • Land is responsible for computing surface fluxes.
     • Land may need to be invoked inside Physics, since Land needs radiation and the surface layer needs Land.
     • Implicit solving would require iteration between Land and PBL/Moist.
     • How deep should ESMF go?

  23. Land (alternative 1)
     [Diagram: the Atmosphere component contains Dynamics, Radiation, Land (LIS-Noah, others), and PBL/Moist. The color key is as before.]
     • Land is responsible for computing surface fluxes.
     • Land may need to be invoked inside Physics, since Land needs radiation and the surface layer needs Land.
     • Implicit solving would require iteration between Land and PBL/Moist.
     • How deep should ESMF go?

  24. Land (alternative 1b)
     [Diagram: the Atmosphere component contains Dynamics, Land (LIS-Noah, others), PBL/Moist, and Radiation. The color key is as before.]
     • Land is responsible for computing surface fluxes.
     • Land may need to be invoked inside Physics, since Land needs radiation and the surface layer needs Land.
     • Implicit solving would require iteration between Land and PBL/Moist.
     • How deep should ESMF go?

  25. Land (alternative 2)
     [Diagram: the Atmosphere component contains Dynamics and Physics, with Radiation, Land, PBL/Moist, and Aerosols/Chemistry under Physics. The color key is as before.]
     • Land is responsible for computing surface fluxes.
     • Land may need to be invoked inside Physics, since Land needs radiation and the surface layer needs Land.
     • Implicit solving would require iteration between Land and PBL/Moist.
     • How deep should ESMF go?

  26. Land (alternative 3)
     [Diagram: the Atmosphere component contains Dynamics and Physics, with Radiation, Surface, PBL/Moist, and Aerosols/Chemistry under Physics, and Land, Ice, Waves, and Ocean under Surface. The color key is as before.]
     • Land is responsible for computing surface fluxes.
     • Land may need to be invoked inside Physics, since Land needs radiation and the surface layer needs Land.
     • Implicit solving would require iteration between Land and PBL/Moist.
     • How deep should ESMF go?

  27. GEOS-5 AGCM Component Structure
     [Diagram: the GEOS-5 component hierarchy, with a CAP driving HISTORY, the AGCM, and an OGCM; the AGCM contains DYNAMICS (FVCORE) and column processes (MOIST, RADIATION with SOLAR and IR, CHEM, TURB, GWD) plus a SURFACE component covering LAKE, LAND ICE, land (CATCH, VEGDYN), and SALTWATER, with exchanged state variables such as (u, v, T, p), (qv, ql, qi, cl, ...), (O3, ..., Qdust, ...), (Ts, Fi, ...), (Tc, qc, Td, ...), (Ci, ...), and (Tskin, Hskin, ...).]
     Slide courtesy Max Suarez

  28. Ocean, waves and sea ice
     [Diagram: an Earth system model couples the Atmosphere (GFS, NEMS-GFS) to Waves (WW III), Ice (H-ice, N-ice), and Ocean (MOM4, HYCOM) through an Atm-Ocn coupler; the existing NCEP CFS is shown as a non-ESMF MPMD GFS-MOM4 coupling. The color key is as before.]
     • Under ESMF NEMS, there should not be much difference between the SPMD and MPMD codes. However, the MPMD code might not be as portable, though MPMD will allow more flexible optimizations.
     • Because of some rapidly interacting physics (sea ice thermodynamics and wave roughness), some coupling would have to be frequent.

  29. Ocean, waves and sea ice (2)
     [Diagram: the Atmosphere (GFS, NEMS-GFS) couples through a fast/slow Atm-Sfc coupler to a fast/slow Surface component containing Land, Waves (WW III), Ice (H-ice, N-ice), and Ocean (MOM4, HYCOM); the NCEP CFS non-ESMF MPMD GFS-MOM4 coupling is shown for comparison. The color key is as before.]
     • Under ESMF NEMS, there should not be much difference between the SPMD and MPMD codes. However, the MPMD code might not be as portable, though MPMD will allow more flexible optimizations.
     • Because of some rapidly interacting physics (sea ice thermodynamics and wave roughness), some coupling would have to be frequent and possibly iterative. The fast Land, Ice, and Waves components may need to run serially on the atmospheric component's processors.

  30. Ocean, waves and sea ice (3)
     [Diagram: the Atmosphere component (Dynamics, Physics with Radiation, Turbulence, etc.) contains a fast Surface component (Waves, Ice, Ocean, Land) and couples through a slow Atm-Sfc coupler to a slow Surface component (Waves, Ice, Ocean, Land). The color key is as before.]
     • Because of some rapidly interacting physics, some coupling would have to be frequent and possibly iterative. The fast-mode components may need to run serially on the atmospheric component's processors (a sketch of the calling pattern follows).
     • The fast and slow "modes" of model components may have different import and export requirements, and notably different VMs, but may share internal states.
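A minimal sketch of the fast/slow calling pattern this implies, using the 3-fast-steps-per-slow-step ratio assumed on the next three slides; every routine here is a hypothetical placeholder, not a NEMS interface:

    program fast_slow_sketch
      ! Hypothetical driver loop: three fast coupling steps for every slow one.
      implicit none
      integer, parameter :: n_slow = 4           ! slow (e.g. ocean) coupling steps
      integer, parameter :: n_fast_per_slow = 3  ! fast (surface/ice/wave) steps per slow step
      integer :: islow, ifast

      do islow = 1, n_slow
        ! Slow coupling: exchange time-averaged fluxes with the ocean component.
        print '(a,i0)', 'slow surface coupling, step ', islow
        do ifast = 1, n_fast_per_slow
          ! Fast coupling: surface/ice/wave physics run serially on the
          ! atmosphere's tasks, possibly iterating toward an implicit balance.
          print '(a,i0,a,i0)', '  fast surface coupling, step ', islow, '.', ifast
          ! ... advance the atmosphere by one internal timestep here ...
        end do
      end do
    end program fast_slow_sketch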

  31. Current CFS load-balance scenario, assuming 3 fast timesteps for every 1 slow timestep
      [Figure: processor-versus-time chart; not reproduced in the transcript.]

  32. Alternate CFS load-balance scenario, assuming 3 fast timesteps for every 1 slow timestep
      [Figure: processor-versus-time chart; not reproduced in the transcript.]

  33. Alternate Earth system model load-balance scenario, assuming 3 fast timesteps for every 1 slow timestep
      [Figure: processor-versus-time chart; not reproduced in the transcript.]

  34. Ionosphere
     [Diagram: a Whole Atmosphere component couples the Atmosphere (GFS-IDEA) to the Ionosphere (GIP, TBD) through an Atm-Ion coupler. The color key is as before.]
     • GFS-IDEA is the GFS extended up to 600 km (from 60 km). GIP is the Global Ionosphere-Plasmasphere model.
     • The coupling will be frequent and three-dimensional, which presents optimization challenges.

  35. Ensemble
     [Diagram: an Ensemble component contains multiple Atmosphere instances (GFS, NEMS-GFS, NEMS-any) joined by an Atm ens Coupler. The color key is as before.]
     • The atmosphere ensemble coupler provides stochastic forcing for the ensemble.
     • The current non-NEMS GEFS capability will be converted to NEMS.

  36. Data assimilation
     [Diagram: a Data Assimilation component couples the Model (GFS, NAM, Whole Earth System, Nested, Ensemble) to the Analysis (GSI) through a Mdl-Anl Coupler. The color key is as before.]
     • Frequent coupling of the atmosphere and the analysis will be necessary as hourly analyses are required.
     • The Whole Earth System, Nested, and Ensemble models may include any of the NEMS components from the previous slides.
     • 4DVAR is supported in this scenario. The inner loop of the variational system could sit entirely within the analysis component, or it could invoke adjoint methods with the NEMS atmosphere.
     • The model-analysis coupler may change grids and variables. When returning to the model state, only the analysis increments will be interpolated (a sketch follows).
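A hedged sketch of that last point, the incremental update: interpolate only the increment back to the model grid, so the model's own fine-scale structure is otherwise preserved. Every routine and variable name here is hypothetical, and the placeholder interpolation simply copies because the two grids are assumed identical:

    module increment_update_sketch
      ! Hedged sketch of returning an analysis to the model via increments only.
      implicit none
    contains
      subroutine apply_analysis_increment(model_state, background_on_anl, analysis_on_anl)
        real, intent(inout) :: model_state(:,:)        ! field on the model grid
        real, intent(in)    :: background_on_anl(:,:)  ! background on the analysis grid
        real, intent(in)    :: analysis_on_anl(:,:)    ! analysis on the analysis grid
        real :: increment_on_model(size(model_state,1), size(model_state,2))

        ! Form the increment on the analysis grid, map it to the model grid, add it.
        call interp_anl_to_model(analysis_on_anl - background_on_anl, increment_on_model)
        model_state = model_state + increment_on_model
      end subroutine apply_analysis_increment

      subroutine interp_anl_to_model(field_on_anl, field_on_model)
        ! Placeholder: a real coupler would regrid between grids; here the two
        ! grids are assumed identical, so this reduces to a copy.
        real, intent(in)  :: field_on_anl(:,:)
        real, intent(out) :: field_on_model(:,:)
        field_on_model = field_on_anl
      end subroutine interp_anl_to_model
    end module increment_update_sketch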

  37. NEMS Issues
     • ESMF version
       - NEMS will require ESMF 3.1.0r, which will be the latest public release in spring 2008. NEMS may later need ESMF 3.1.1.
       - All components must use the same version.
     • MAPL
       - GSFC ESMF wrapper
       - Used by MOM4, GOCART, FVCORE
       - Compatible with NEMS?
     • Code Repositories
       - Operationally secure
       - Collaboration friendly
       - Experiment Launcher compatible with operations
     • Documentation and Support
