THE UBC Model-3/CMAQ AQ Forecast System. Luca Delle Monache & Roland Stull, Weather Forecast Research Team. NW-AIRQUEST ANNUAL MEETING, Monday, October 6th, 2003
OUTLINE • UBC AQ Forecast System • MC2 mesoscale model • MCIP • SMOKE • CMAQ • PAVE or GrADS? • Monster Cluster • Computational Domain and Forecast Period • Parallel Run Comparison • Future Work • Acknowledgements
Mesoscale Compressible Community Model (MC2) • Version 4.9.1 • Non-hydrostatic • Terrain-following Gal-Chen coordinate system • Polar stereographic projection • Semi-implicit semi-Lagrangian scheme • PBL: turbulent kinetic energy (implicit vertical diffusion) • Surface-layer scheme: similarity theory Note: MC2 output is converted into MM5 format (converter developed by RWDI)
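To illustrate the semi-Lagrangian idea, here is a toy 1-D advection step in Python: each grid point is traced back along a constant wind to its departure point, and the field is interpolated there. This is only a sketch of the concept; MC2's actual semi-implicit, 3-D formulation is far more involved, and all names and values below are illustrative.

```python
import numpy as np

def sl_step(q, u, dt, dx):
    """Advect q one step by tracing each grid point back to its
    departure point and interpolating linearly (periodic domain)."""
    n = len(q)
    x = np.arange(n) * dx
    x_dep = (x - u * dt) % (n * dx)       # departure points
    i = (x_dep // dx).astype(int)
    frac = x_dep / dx - i
    return (1 - frac) * q[i] + frac * q[(i + 1) % n]

# A Gaussian pulse advected with a Courant number of 15: stable for a
# semi-Lagrangian scheme, where Eulerian schemes would need CFL <= 1.
q = np.exp(-((np.arange(100) - 30.0) ** 2) / 50.0)
q_new = sl_step(q, u=5.0, dt=3.0, dx=1.0)
print(q_new.argmax())   # pulse peak moved from index 30 to 45
```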
Meteorology Chemistry Interface Processor (MCIP) • Links MM5 with CMAQ to provide a complete set of meteorological data • Version 2.1 • Physical and dynamical algorithms • Data format translation • Unit conversion of parameters • Diagnostic estimation of parameters not provided (PBL height, deposition parameters, cloud parameters) • Extraction of data for appropriate window domains • Reconstruction of meteorological data on different grid and layer structures • Mass consistency check
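As an example of the diagnostic estimations MCIP performs, below is a minimal Python sketch of one standard way to diagnose PBL height, using a bulk Richardson number. The method choice, the critical value of 0.25, and the profile data are assumptions for illustration, not MCIP's actual code.

```python
import numpy as np

G = 9.81        # gravitational acceleration [m s^-2]
RI_CRIT = 0.25  # assumed critical bulk Richardson number

def pbl_height(z, theta_v, u, v):
    """Return the first height [m] at which the bulk Richardson number
    computed from the lowest level exceeds RI_CRIT."""
    for k in range(1, len(z)):
        du2 = (u[k] - u[0])**2 + (v[k] - v[0])**2
        ri = (G / theta_v[0]) * (theta_v[k] - theta_v[0]) \
             * (z[k] - z[0]) / max(du2, 1e-6)
        if ri > RI_CRIT:
            return z[k]
    return z[-1]  # PBL top above the profile: return the highest level

# Idealized afternoon profile with a roughly 1 km mixed layer
z = np.array([10., 100., 300., 600., 900., 1200., 1500.])
theta_v = np.array([300., 300., 300.1, 300.2, 300.3, 302.5, 304.])
u = np.array([2., 4., 5., 6., 6.5, 8., 10.])
v = np.zeros_like(u)
print(pbl_height(z, theta_v, u, v))   # -> 900.0
```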
Sparse Matrix Operator Kernel Emissions Modeling System (SMOKE) • Emissions data processing methods integrated through high-performance sparse-matrix algorithms • Version 1.5.1 (modified by RWDI to treat mobile emissions as area sources) • Emissions types supported: • Area • Mobile • Point • Biogenic • Converts emissions inventory data into CMAQ-formatted emission files
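The sparse-matrix idea at SMOKE's core can be sketched as follows: a processing step (here, gridding) becomes a sparse matrix-vector product applied to an inventory vector. This is a toy Python illustration with made-up numbers, not SMOKE's implementation.

```python
import numpy as np
from scipy.sparse import csr_matrix

n_sources, n_cells = 4, 6

# Fraction of each source's emissions falling in each grid cell.
# Most entries are zero, hence the sparse representation; each
# source's fractions sum to 1.
rows = [0, 1, 1, 2, 3, 3]                 # grid-cell indices
cols = [0, 0, 1, 2, 2, 3]                 # source indices
frac = [0.7, 0.3, 1.0, 0.5, 0.5, 1.0]
gridding = csr_matrix((frac, (rows, cols)), shape=(n_cells, n_sources))

inventory = np.array([10.0, 20.0, 5.0, 8.0])  # tons/day per source

gridded = gridding @ inventory                # tons/day per grid cell
print(gridded)                                # [ 7. 23. 2.5 10.5 0. 0.]
```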
Community Multiscale Air Quality Modeling System (CMAQ) • Version 4.2.1 (parallel version modified by RWDI and UC Riverside) • Horizontal and vertical transport: Piecewise Parabolic Method and Bott scheme • Horizontal diffusion: spatially varying eddy diffusivity; vertical diffusion: K-theory • Cloud transport • Chemical mechanism: cb4_ae2_aq (43 species and 96 reactions) • Chemistry solver: Modified Euler Backward Iterative (EBI) method • Aqueous-phase chemistry: explicit, 1-section • Particle size: modal • One-way nesting • Wet and dry deposition
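To illustrate the Euler Backward Iterative idea named above, here is a minimal Python sketch of one backward-Euler step solved by fixed-point iteration for a toy two-species system; the species, rates, and coupling are assumptions, not the cb4 mechanism.

```python
import numpy as np

def ebi_step(c, dt, k_prod, k_loss, n_iter=5):
    """Advance concentrations c by dt with the backward-Euler update
    c_new = (c + dt*P) / (1 + dt*L), iterated because production P
    generally depends on the (unknown) new concentrations."""
    c_new = c.copy()
    for _ in range(n_iter):
        p = k_prod * c_new[::-1]   # toy coupling: each species produced from the other
        c_new = (c + dt * p) / (1.0 + dt * k_loss)
    return c_new

c = np.array([1.0, 0.0])           # initial mixing ratios: all mass in species A
print(ebi_step(c, dt=60.0,
               k_prod=np.array([0.0, 0.01]),    # B produced from A
               k_loss=np.array([0.01, 0.0])))   # A lost at 0.01 1/s
# -> [0.625 0.375]; note the sum stays 1.0 at convergence
```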
Why CMAQ Version 4.2.1? • Extensively tested (by RWDI and UC Riverside) on Linux clusters • Enhancements and bug fixes in the latest version: • Modified vertical diffusion module to improve data locality and speed up computation • Fixed a bug in building the model without aerosols • Improved robustness for inexact I/O API (netCDF) file header data • Fixed a bug in the heterogeneous N2O5 reaction in the aerosol module: the rate constant calculation now uses the effective radius instead of the diameter • Fixed a bug in the contribution of N2O5 to total initial HNO3
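For context on the radius-vs-diameter fix, the standard first-order heterogeneous uptake rate constant has the form k = A / (r/Dg + 4/(v·γ)); the Python sketch below shows how substituting the diameter for the radius changes k. All parameter values are plausible assumptions for illustration, not CMAQ's.

```python
Dg = 1.0e-5     # gas-phase diffusivity of N2O5 [m^2 s^-1] (assumed)
gamma = 0.1     # assumed uptake coefficient
v = 240.0       # mean molecular speed of N2O5 at ~298 K [m s^-1]
A = 1.0e-4      # aerosol surface area density [m^2 per m^3 of air]

def k_het(r):
    """First-order loss rate [1/s] for particles of radius r [m]."""
    return A / (r / Dg + 4.0 / (v * gamma))

r_eff = 0.1e-6              # effective radius, 0.1 micron
print(k_het(r_eff))         # correct: uses the radius
print(k_het(2.0 * r_eff))   # the old bug: diameter used in place of radius
```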
Package for Analysis and Visualization of Environmental data (PAVE) • Supports the I/O API (netCDF) format, i.e., the Model-3/CMAQ I/O format • Version 2.1.1 • Run by scripts (not completely automatic yet…) on an IRIX machine • Types of plots (example figures not reproduced)
Grid Analysis and Display System (GrADS) • Already implemented by the UBC Weather Forecast Research Team to produce high-quality post-processing plots/loops • Requires converting CMAQ output (netCDF I/O API format) into GrADS format • Extensively tested on our system • Types of plots [MC2 12 km grid: 90 hPa winds and RH (left), surface T (right)]
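A minimal sketch of the required conversion, assuming the netCDF4 Python library and a hypothetical output file and variable name: read one CMAQ I/O API variable and write the flat 32-bit binary plus the descriptor file that GrADS expects. The grid numbers in the descriptor follow the 70 x 89 x 15, 12 km domain described later, but the map projection details are omitted.

```python
import numpy as np
from netCDF4 import Dataset

with Dataset("CCTM_ACONC.nc") as nc:        # hypothetical file name
    o3 = np.asarray(nc.variables["O3"][:])  # dims (TSTEP, LAY, ROW, COL)

# GrADS expects 32-bit floats ordered x fastest, then y, z, t, which
# matches the C-order layout of the I/O API variable.
o3.astype(np.float32).tofile("o3.dat")

# A matching GrADS control file must describe the same grid; the grid
# and time metadata below are assumptions based on the forecast domain.
ctl = """dset ^o3.dat
undef -9.99e33
xdef 70 linear 1 1
ydef 89 linear 1 1
zdef 15 linear 1 1
tdef 51 linear 10z06oct2003 1hr
vars 1
o3 15 99 ozone mixing ratio [ppmV]
endvars
"""
open("o3.ctl", "w").write(ctl)
```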
Grid Analysis and Display System (GrADS) • Types of plots (MC2 2 km grid: wind vectors and convergence)
High-Performance Computing Linux Super-Cluster (Monster) • Owner: Geophysical Disaster Computational Fluid Dynamics Centre (GDCFD), UBC • Computational processors: 256 dual Pentium III nodes (1 GHz, 1 GB RAM, 256 KB cache) • Network: • Management: 10 Mb Ethernet and serial • Inter-processor communication: Myrinet 2000 • I/O: 100 Mb Ethernet • Parallel computing: Myricom implementation of MPICH, PVM, etc. • Queuing system: OpenPBS with Maui scheduler
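A minimal sketch of the inter-processor communication pattern such a cluster supports, written with mpi4py as a stand-in for the C MPI (MPICH) interface the models actually use; the slab sizes and the 1-D decomposition are illustrative assumptions.

```python
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

# Each processor owns one horizontal slab of the 3-D domain (toy size).
slab = np.full((10, 89, 15), float(rank))

# Neighbors in the 1-D decomposition; MPI.PROC_NULL makes the exchange
# a no-op at the domain edges.
up = rank + 1 if rank + 1 < size else MPI.PROC_NULL
down = rank - 1 if rank > 0 else MPI.PROC_NULL

halo_lo = np.empty_like(slab[0])   # ghost slice arriving from below
halo_hi = np.empty_like(slab[0])   # ghost slice arriving from above

# Shift up: send my top slice to `up`, receive my lower ghost from `down`.
comm.Sendrecv(sendbuf=slab[-1], dest=up, recvbuf=halo_lo, source=down)
# Shift down: send my bottom slice to `down`, receive my upper ghost from `up`.
comm.Sendrecv(sendbuf=slab[0], dest=down, recvbuf=halo_hi, source=up)

print(f"rank {rank}: halo exchange complete")  # run with: mpirun -np 4 ...
```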
Computational Domain and Forecast Period • 12 km grid centered over the LFV • 70 x 89 x 15 grid points • Forecast period starts at 10 UTC • Run length: 50 hours • Computational time required with 80 processors: ~1.5 hours
CMAQ Parallel Run Comparison • Note: • Run length: 50 hours • The 40-, 60-, and 80-processor runs used slower I/O settings than the 20-processor run, so better scaling is expected in future tests with optimal I/O settings
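For reference, the standard scaling metrics behind such a comparison can be computed as below; the wall-clock times are placeholders, not the measured Monster results.

```python
# Relative speedup and parallel efficiency against the smallest run.
timings = {20: 300.0, 40: 170.0, 60: 130.0, 80: 110.0}  # minutes (assumed)

base_procs = min(timings)
base_time = timings[base_procs]
for procs, t in sorted(timings.items()):
    speedup = base_time / t                      # relative to 20 procs
    efficiency = speedup * base_procs / procs    # 1.0 = perfect scaling
    print(f"{procs:3d} procs: speedup {speedup:4.2f}, efficiency {efficiency:4.2f}")
```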
Future Work • Operational mode: • More performance tests with different numbers of processors • Run at 4 km • Run at 2 km • CMAQ driven by MM5 (at 12, 4, and 2 km) • CMAQ driven by WRF (at 12, 4, and 2 km) • …Ensemble forecast! • Results on the web • Implementation of a more recent CMAQ version • Research mode: • Ensemble forecast with the “Multi-Emission approach”
Acknowledgements • Grant support • Environment Canada (Colin di Cenzo) • Canadian Natural Science and Engineering Research Council • British Columbia Ministry of Water, Land and Air Protection • Canadian Foundation for Climate and Atmospheric Science • Canadian Foundation for Innovation • BC Knowledge Development Fund • Parks Canada
MC2 => MM5 Conversion • Converter developed by RWDI • Converts MC2 output to pseudo-MM5 format • Purpose: allow MCIP to process MC2 data • MC2: polar stereographic projection on pressure levels • Interpolation (inverse-squared distance) onto a uniformly spaced MM5 grid in Lambert Conformal Conic projection, on sigma levels • Mapping and/or recalculation of parameters required by CMAQ but not included in the MC2 output • Room for improvement in mass conservation (partially handled by MCIP)
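A minimal Python sketch of the inverse-squared-distance interpolation named above, assuming scattered MC2 source points already transformed into the MM5 map coordinates; neighbor search, vertical interpolation to sigma levels, and projection handling are all omitted.

```python
import numpy as np

def idw2(x_src, y_src, val_src, x_tgt, y_tgt, eps=1e-12):
    """Inverse-distance-squared interpolation of scattered source
    points onto a single target point."""
    d2 = (x_src - x_tgt)**2 + (y_src - y_tgt)**2
    if d2.min() < eps:                 # target coincides with a source point
        return val_src[np.argmin(d2)]
    w = 1.0 / d2                       # weights fall off as 1/d^2
    return np.sum(w * val_src) / np.sum(w)

# Example: four surrounding MC2 points, one MM5 target at the center
xs = np.array([0.0, 12.0, 0.0, 12.0])        # km
ys = np.array([0.0, 0.0, 12.0, 12.0])        # km
vals = np.array([280.0, 282.0, 281.0, 283.0])  # e.g. temperature [K]
print(idw2(xs, ys, vals, 6.0, 6.0))          # -> 281.5
```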