Weather Research and Forecast Model: Software Development and Performance
John Michalakes
Mesoscale and Microscale Meteorology, National Center for Atmospheric Research
michalak@ucar.edu
Workshop on Air Quality Forecasting in Latin American Cities, 10 January 2006
Outline
• WRF Overview
• Applications
• Software
Weather Research and Forecast Model
Goals: develop an advanced mesoscale forecast and assimilation system, and accelerate research advances into operations.
• Large collaborative effort to develop a community model with a direct path to operations
• Advanced numerics, data assimilation, and model physics
• Advanced software architecture:
  • Modular, flexible, extensible
  • Portable and efficient
• Supports a broad range of applications and scales:
  • Large eddy simulation
  • Cloud modeling, storm simulation
  • Synoptic-scale research
  • Real-time NWP and hurricane prediction
  • Air quality research and prediction
  • Regional climate
Figure slides (captions only):
• WRF Development Teams and Working Groups
• PBL LES, Dx = 50 m (Chin-Hoh Moeng)
• Supercell thunderstorm, Dx = 1 km
• Hurricane Isabel
• Baroclinic wave, Dx = 100 km
• 27 km, 36-hour NO+NO2 forecast, 29-31 January 2005 (courtesy Georg Grell, WRF WG11): http://www-frd.fsl.noaa.gov/aq/wrf/
WRF Overview and Status
• Current release: WRFV2.1.1, November 2005
• Two dynamical cores, full physics, chemistry
• Variational data assimilation (released) and ensemble Kalman filter (in development)
• Rapid community growth:
  • More than 3,000 registered users
  • June 2005 Users Workshop: 219 participants, 117 institutions, 65 countries
  • 46 scientific papers: real-time NWP, atmospheric chemistry, data assimilation, climate, wildfires, mesoscale processes
• Operational capabilities implemented or planned:
  • Air Force Weather Agency
  • National Centers for Environmental Prediction
  • KMA (Korea), IMD (India), CWB (Taiwan), IAF (Israel), WSI (U.S.)
WRF Modeling System Components
• WRF Standard Initialization (SI)
• WRF 3D-Var
• WRF Model
• Postprocessing
WRF SI Function
• Define the simulation domain area and nests
• Produce WRF input for terrain, land use, soil type, etc. on the simulation domain ("static fields")
• De-grib and interpolate meteorological data (u, v, T, q, surface pressure, soil data, snow data, sea-surface temperature) to the WRF model grid, horizontally and vertically
WRF SI Function (continued)
• Support WRF nesting
• Three map projections: Lambert conformal, polar stereographic, Mercator
• Graphical user interface for running SI
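A minimal sketch of the kind of horizontal interpolation the SI performs, assuming a simple bilinear scheme on a regular lon/lat source grid (the real SI also handles map projections, grid staggering, and vertical interpolation; all names here are illustrative):

```python
import numpy as np

def bilinear(field, src_lons, src_lats, tgt_lon, tgt_lat):
    """Bilinearly interpolate a 2-D field on a regular lon/lat grid
    to one target point (illustrative only)."""
    i = np.searchsorted(src_lons, tgt_lon) - 1   # source cell to the west
    j = np.searchsorted(src_lats, tgt_lat) - 1   # source cell to the south
    tx = (tgt_lon - src_lons[i]) / (src_lons[i+1] - src_lons[i])
    ty = (tgt_lat - src_lats[j]) / (src_lats[j+1] - src_lats[j])
    return ((1-tx)*(1-ty)*field[j, i]   + tx*(1-ty)*field[j, i+1] +
            (1-tx)*ty    *field[j+1, i] + tx*ty    *field[j+1, i+1])

# Example: interpolate a synthetic temperature field to one model grid point
lons = np.linspace(-120.0, -60.0, 61)            # 1-degree source grid
lats = np.linspace(20.0, 60.0, 41)
temp = 288.0 + np.random.randn(41, 61)           # stand-in for a de-gribbed field
print(bilinear(temp, lons, lats, -97.3, 36.6))   # ~288 K
```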
WRF 3DVAR
3D variational data assimilation:
• Ingest observations into the WRF input analysis from SI:
  • Conventional surface and upper-air data, wind profilers
  • Remote sensing data: cloud-track winds, ATOVS thickness, ground-based GPS TPW, SSM/I, SSM/T1, SSM/T2, SSM/I brightness temperatures, QuikSCAT ocean surface winds, radar radial velocities
• Two background error covariance models: NCEP and UK/UCAR
• Cycling mode available
• Part of the integrated WRF Var system (next slide)
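For reference, 3DVAR schemes of this kind minimize the standard variational cost function (the generic textbook form; WRF 3DVAR's exact formulation may differ in detail):

```latex
J(x) = \frac{1}{2}(x - x_b)^T B^{-1} (x - x_b)
     + \frac{1}{2}\,(H(x) - y)^T R^{-1} (H(x) - y)
```

Here x_b is the background (the SI analysis), y the observations, H the observation operator, and B and R the background and observation error covariances; the "two background error covariance models" above are two choices of B.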
WRF Var System
• WRF 3D and 4D Var systems are being merged into a single conceptual and software framework (Barker & Huang, NCAR)
• Figure caption: "Generated from WRF NL by TAFF"
WRF Model
• Key features:
  • Multiple dynamical cores
  • Moving, feature-following two-way nesting
  • Advanced I/O and model coupling
• Future developments:
  • Global WRF
  • Software architecture and portable performance
WRF Model Dynamics
Two dynamical cores:
• ARW (Eulerian mass) core (Skamarock, Klemp, Wicker):
  • Formulation:
    • C-grid horizontal staggering
    • Terrain-following, mass-based vertical coordinate
    • Flux-form prognostic equations in terms of conserved variables (pressure and temperature diagnosed from the conserved variables)
    • Exact (to roundoff) conservation of mass, dry entropy, and scalars
  • Integration techniques:
    • Two time-level, 3rd-order Runge-Kutta, split-explicit time integration
    • 5th-order upwind advection (positive-definite option)
  • Combined, these (1) minimize the implicit and explicit numerical filtering needed for stable, robust integrations, and (2) maximize efficiency (solution accuracy versus cost); see the sketch below
• Nonhydrostatic Mesoscale Model (NMM) core (Z. Janjic, T. Black, NCEP):
  • E-grid (like Eta), terrain-following (sigma) vertical coordinate
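A minimal sketch of the three-stage Runge-Kutta step described above, applied to a toy 1-D advection problem. This follows the Wicker-Skamarock RK3 scheme as I understand ARW to use it, but omits the acoustic sub-stepping and substitutes 1st-order upwind fluxes for the real solver's 5th-order scheme:

```python
import numpy as np

def rhs(phi, u, dx):
    """Toy tendency: 1st-order upwind advection on a periodic domain
    (stands in for the real 5th-order upwind fluxes)."""
    return -u * (phi - np.roll(phi, 1)) / dx

def rk3_step(phi, u, dx, dt):
    """Three-stage RK3: dt/3, dt/2, dt substeps, each starting from phi^n."""
    phi1 = phi + (dt / 3.0) * rhs(phi,  u, dx)
    phi2 = phi + (dt / 2.0) * rhs(phi1, u, dx)
    return phi + dt * rhs(phi2, u, dx)

# Advect a Gaussian pulse once around a periodic domain
nx, u, dx = 100, 1.0, 1.0
dt = 0.5 * dx / u                      # CFL-safe time step
phi = np.exp(-0.05 * (np.arange(nx) - 50.0)**2)
for _ in range(int(nx * dx / (u * dt))):
    phi = rk3_step(phi, u, dx, dt)
```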
Nesting
Focus costly high-resolution computation over a region or feature of interest.
WRF nesting features:
• Fully parallelized and efficient: 5-8% overhead, well within the 15% target
• Two-way interacting
• Dynamically instantiable
• Telescoping to arbitrary depth
• Moving
Hurricane Katrina
• Two-way interacting nest: 400 x 301 x 35 parent grid, dt = 72 s, with a 331 x 352 x 35 nest, dt = 24 s
• Run time: 5.5 hours on a 128-processor IBM POWER4 (NCAR)
• http://wrf-model.org/plots/realtime_main.php
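A back-of-envelope check on the relative cost of that nest, assuming cost scales with grid points times steps taken (illustrative arithmetic only; the 15% figure on the previous slide refers to nesting infrastructure overhead, not the nest's own compute):

```python
# Parent: 400 x 301 x 35 at dt = 72 s; nest: 331 x 352 x 35 at dt = 24 s,
# so the nest takes 72/24 = 3 steps per parent step.
parent_pts = 400 * 301 * 35
nest_pts = 331 * 352 * 35
rel_cost = nest_pts * 3 / parent_pts
print(f"nest compute is ~{rel_cost:.1f}x the parent domain per parent step")
# ~2.9x: the high-resolution nest dominates total cost, which is exactly
# why nesting focuses computation on the feature of interest.
```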
WRF Model I/O System
• Multiple I/O streams allow I/O of various data sets at different user-specified intervals:
  • One main history stream plus five auxiliary output streams
  • One main input stream plus five auxiliary input streams (WRF-Chem uses auxiliary input stream 5 for emissions)
  • Restart and lateral-boundary streams, plus a dedicated stream for cycling 3DVAR
• Streams can be directed to files or to other models, depending on the format selected for the stream at run time (see the fragment below)
• Formats for I/O to files: NetCDF, HDF, GRIB, native binary
• Formats for I/O to other programs (coupling): Model Coupling Toolkit (MCT), Model Coupling Environment Library (MCEL), Earth System Modeling Framework (ESMF)
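For illustration, stream intervals and formats are selected at run time in namelist.input. The option names below follow WRF namelist conventions from later documentation and are an assumption for this release; check the version's own README:

```
&time_control
 history_interval     = 180,   ! main history stream, minutes (illustrative)
 restart_interval     = 360,
 auxinput5_interval_m = 60,    ! WRF-Chem emissions via auxiliary input 5
 io_form_history      = 2,     ! 2 = NetCDF
 io_form_restart      = 2,
 io_form_auxinput5    = 2,
/
```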
High Fidelity Simulation of Littoral Environments
Scenario: 25 November 1999 high-wind event and ferry boat accident; simulate wave heights and effects on diving conditions.
Four-model concurrent coupling on IBM:
• WRF atmosphere, 4 to 32 processors
• ADCIRC ocean, 4 to 8 processors
• SWAN wave, 4 to 8 processors
• LSOM optics, 1 processor
Forecast time: 15 minutes on 16 processors; WRF coupling overhead < 5%
WRF/HYCOM Coupling (MCEL*)
• WRF run in real time for the 2005 hurricane season: track and landfall well forecast; less skill forecasting intensity
• Coupling with a dynamic lower boundary: HYCOM ocean model; wave model (planned)
• Hurricane Katrina demonstration:
  • Generate WRF-to-HYCOM one-way forcing data for a two-week run-up period (finished)
  • Configure HYCOM GOMd0.04; spin up over the pre-Katrina period, forcing with WRF data (next)
  • Conduct initial two-way WRF/HYCOM Katrina simulation
  • Analysis, refinement; computational performance
  • Add a third model (SWAN or WW3)
• Architecture: n WRF processors and m HYCOM processors each connect through the MCEL client API (WRF via its I/O API) to shared-memory MCEL server processors with an MCEL cache
• Exchanged fields:
  • WRF sends (from the 4 km nest): surface winds and air temperatures, precipitation, radiation fluxes; WRF receives: SST, roughness length (later)
  • HYCOM sends: SSTs; HYCOM receives: winds, precipitation, radiation fluxes
* Model Coupling Environment Library, M. Bettencourt, AFRL. HYCOM image courtesy A. Wallcraft, NRL; HYCOM/MCEL courtesy P. Fitzpatrick & N. Tran, MSU
Global WRF
• Extend WRF to global simulation
• Consider candidate grids: lat/lon, icosahedral/hexagonal, cubed-sphere, yin-yang
• Develop prototypes; assess accuracy and efficiency
Global WRF (Mark Richardson, Cal Tech/NASA VPL, et al.*)
• Allow non-conformal projections and user-specifiable planetary parameters:
  • Separation of map scale factors into x- and y-directional components
  • Fourier polar filtering (see the sketch below)
  • Polar boundary conditions
• Working with MMM to parallelize and include as an option in the next community release of WRF
• Adapt to terrestrial weather and climate for preliminary testing of high-resolution global modeling
• Figure: 4-hour Global WRF Martian surface temperatures; four MPI tasks
* Mark I. Richardson and Claire E. Newman (Cal Tech), and Anthony D. Toigo (Kobe University), "Non-Conformal Projection, Global, and Planetary Versions of WRF," Proceedings of the 2005 WRF Users Workshop, Boulder, Colorado, June 2005. http://www.mmm.ucar.edu/wrf/users/workshops/WS2005/abstracts/Session7/1-Richardson.pdf
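A minimal numpy sketch of Fourier polar filtering as typically done on lat/lon grids: FFT each high-latitude row along longitude and damp the zonal wavenumbers that would otherwise shrink the effective zonal grid spacing toward the poles. This is the generic technique, not the actual Global WRF code:

```python
import numpy as np

def polar_filter(field, lats, lat0_deg=45.0):
    """Damp high zonal wavenumbers poleward of lat0 so the effective
    zonal resolution (and hence the stable dt) matches that at lat0."""
    nlon = field.shape[1]
    k = np.fft.rfftfreq(nlon, d=1.0 / nlon)   # zonal wavenumbers 0..nlon/2
    out = field.copy()
    for j, lat in enumerate(lats):
        if abs(lat) <= lat0_deg:
            continue                          # no filtering equatorward of lat0
        # Keep wavenumbers up to k_max scaled by cos(lat)/cos(lat0)
        kmax = (nlon / 2) * np.cos(np.radians(lat)) / np.cos(np.radians(lat0_deg))
        spec = np.fft.rfft(out[j, :])
        spec[k > kmax] = 0.0                  # sharp cutoff; real filters taper
        out[j, :] = np.fft.irfft(spec, n=nlon)
    return out

lats = np.linspace(-90.0, 90.0, 91)
field = np.random.randn(91, 180)              # synthetic 2-degree global field
filtered = polar_filter(field, lats)
```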
The WRF Development Effort: Engineering View
WRF Development Teams comprise 11 working groups, 90 individuals, and 33 institutions, spanning: land surface processes, data ingest, regional climate, physics, numerics, operational implementation, community support, atmospheric chemistry, data assimilation, analysis & visualization, and testing & verification.
WG 2: Software Architecture Standards and Implementation
WRF Software
Software infrastructure for a next-generation limited-area NWP model for atmospheric research and operations. Requirements:
• Flexibility over a range of platforms, applications, and users
• Efficient and portable over the range of computers deployed in the community
• Modular, flexible, extensible, and maintainable
• Full-function, including nesting for mesh refinement
WRF Software Framework Overview
Implementation of the WRF architecture:
• Hierarchical organization
• Multiple dynamical cores
• Plug-compatible physics
• Abstract interfaces (APIs) to external packages for I/O, model coupling, and parallelism
• Performance-portable
Layers:
• Driver layer: top-level control, memory management, nesting, parallelism, external APIs
• Mediation layer: ARW and NMM solvers, physics interfaces
• Model layer: plug-compatible physics packages
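A schematic of that layering in Python pseudocode (WRF itself is Fortran 90): the driver owns the domains and the parallel decomposition, the mediation layer dispatches over tiles, and model-layer routines see only one tile via explicit index ranges. All names here are illustrative:

```python
def driver(domains, num_steps):
    """Driver layer: owns domains, the time loop, nesting, parallel setup."""
    for _ in range(num_steps):
        for dom in domains:                 # coarse domain plus any nests
            solve_interface(dom)

def solve_interface(dom):
    """Mediation layer: selects the solver and dispatches over tiles
    (a shared-memory/OpenMP loop in the real code)."""
    for (is_, ie, js, je) in dom["tiles"]:
        advance_tile(dom["t"], is_, ie, js, je)

def advance_tile(t, is_, ie, js, je):
    """Model layer: a plug-compatible routine that works on one tile,
    passed explicit index ranges rather than whole-domain state."""
    for j in range(js, je):
        for i in range(is_, ie):
            t[j][i] += 0.1                  # stand-in for real tendencies

t = [[288.0] * 10 for _ in range(10)]
dom = {"t": t, "tiles": [(0, 10, 0, 5), (0, 10, 5, 10)]}
driver([dom], num_steps=3)
```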
Parallelism in WRF
• Computing has followed Moore's law since the 1960s, but single processors are still not enough for even modest-sized NWP problems today:
  • Single-processor performance is of order 10^9-10^10 flop/s
  • Required performance is of order 10^11-10^12 flop/s
• Parallel computing: employ tens, hundreds, or thousands of processors to achieve the required performance in aggregate (see the arithmetic below):
  • Shared-memory parallelism
  • Distributed-memory parallelism
  • Hybrid (distributed-memory clusters of SMPs)
• WRF was designed for parallelism at the outset
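The processor-count arithmetic implied by those orders of magnitude, sketched with an assumed parallel efficiency (the efficiency value is not from the slide):

```python
single_proc = 1e9     # sustained flop/s per processor (order 10^9, low end)
required = 1e12       # required flop/s (order 10^12, high end)
efficiency = 0.5      # assumed parallel efficiency at scale
procs = required / (single_proc * efficiency)
print(f"processors needed: ~{procs:,.0f}")   # ~2,000
```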
Multi-Level Parallelism
A single version of the code executes efficiently on distributed-memory machines, shared-memory machines, clusters of SMPs, and both vector and microprocessor architectures. Model domains are decomposed for parallelism on two levels:
• Patch: section of the model domain allocated to a distributed-memory node; inter-processor communication occurs at patch boundaries
• Tile: section of a patch allocated to a shared-memory processor within a node; this is also the scope of a model-layer subroutine
Distributed-memory parallelism is over patches; shared-memory parallelism is over tiles within patches.
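A minimal sketch of that two-level decomposition: split the domain into patches (one per MPI task), then cut each patch into tiles (one per thread). Illustrative only; WRF's own decomposition also manages the halo/ghost regions needed for inter-patch communication:

```python
def split(n, parts):
    """Split n points into nearly equal contiguous (start, end) ranges."""
    base, rem = divmod(n, parts)
    ranges, start = [], 0
    for p in range(parts):
        end = start + base + (1 if p < rem else 0)
        ranges.append((start, end))
        start = end
    return ranges

def decompose(nx, ny, px, py, tiles_per_patch):
    """Two-level decomposition: px*py patches (one per MPI task),
    each cut along y into tiles (one per shared-memory thread)."""
    out = []
    for ys, ye in split(ny, py):            # patch rows
        for xs, xe in split(nx, px):        # patch columns
            tiles = [(xs, xe, ys + a, ys + b)
                     for a, b in split(ye - ys, tiles_per_patch)]
            out.append({"patch": (xs, xe, ys, ye), "tiles": tiles})
    return out

# A 425 x 300 domain on 4 x 2 MPI tasks with 4 tiles per patch
print(decompose(425, 300, 4, 2, 4)[0])
```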
WRF Supported Platforms
(Table of supported platforms; footnotes: * dm-parallel not supported yet; ** experimental, not released)
WRF Performance Benchmark
• Benchmark case (WRF 2.0): continental U.S.; large-scale baroclinic cyclone, 24 October 2001; 12 km resolution, dt = 72 s; 29 Gflop per time step
• Computational benchmark: computational rate and scaling
• Platforms: SGI, IBM, Cray, Linux
• Figure: GOES-8, 10.7 μm, 2345Z (CIRA)
Performance (v2.0.x)
Charts of simulation speed and Gflop/s across platforms: www.mmm.ucar.edu/wrf/WG2/bench (see the worked example below)
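How the two reported metrics relate, using the benchmark's own 29 Gflop/step and dt = 72 s; the wall-clock time per step here is an assumed input, not a measured number:

```python
flop_per_step = 29e9    # from the benchmark definition
dt = 72.0               # model seconds advanced per step
wall_per_step = 2.0     # assumed seconds of wall clock per step

gflops = flop_per_step / wall_per_step / 1e9    # sustained Gflop/s
sim_speed = dt / wall_per_step                  # model time per wall time
print(f"{gflops:.1f} Gflop/s, simulation speed {sim_speed:.0f}x real time")
```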
WRF Software for WRF-Chem
• Enhanced support for four-dimensional tracers (see the sketch below)
• Auxiliary input and output data streams
• Memory and other code optimizations for large numbers of chemical species (underway)
• Enhanced/simplified Registry support
• Separation of WRF dynamics into operators (planned)
• Merge WRF-Chem into the community repository
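"Four-dimensional tracers" means chemistry species live in arrays with a species dimension added to the three spatial ones. A quick sketch of why the species count drives the memory optimization work (grid and species counts are illustrative, not from the slide):

```python
nx, ny, nz = 425, 300, 35            # illustrative CONUS-like grid
for nspecies in (1, 40, 120):        # chemistry mechanisms vary widely in size
    words = nx * ny * nz * nspecies
    print(f"{nspecies:4d} species: {words * 4 / 1e9:.2f} GB per 4-D tracer array")
# 120 species at 4 bytes/word is ~2.1 GB per array -- hence the memory
# optimization effort for large numbers of chemical species.
```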
Additional Information
• www.wrf-model.org
• Users: www.mmm.ucar.edu/wrf/users
• Software: www.mmm.ucar.edu/wrf/WG2/software_v2
• wrfhelp@ucar.edu
• michalak@ucar.edu