The Brazilian Effort on BRAMS and OLAM - Pedro L. da Silva Dias, LNCC/MCT and IAG/USP - Workshop on Weather and Seasonal Climate Modeling at INPE, 08-10 December 2008
RAMS origin: Colorado State University • William Cotton and Greg Tripoli, late 1970s / early 1980s • Cloud microphysics • Cloud dynamics • Mesoscale model - Roger Pielke, University of Virginia • Fusion of both models => RAMS (Regional Atmospheric Modeling System)
Mesoscale Modeling at IAG/USP • 1980s - role of the sea breeze in São Paulo - regional climate - impact of air pollution => led to fairly complex physics: urban processes, vegetation, topography (numerical challenges) • Fundamental problem: lack of computer power • Theoretical studies (1980s) - instability lines - heat sources • Late 1980s - Elmar Reiter's hydrostatic PE model used at CPTEC and USP • Semi-Lagrangian models: remote impact of hurricanes • Andes effect: blocking effect - eta coordinate, role of LLJs
Computer power limitation -> more emphasis on observational studies • RADASP (IPMET-UNESP/USP/INPE): July/81, Jan/Feb 82, Jan/89: PBL and convection - mesoscale systems • ABLE 2a (1985) and 2b (1987) => atmospheric chemistry - Amazon, biomass burning • ABRACOS (land-use change) - FLUAMAZON ---> LBA (93) • ABLE was a turning point: beginning of integrated model activities - concept of tracers (radon, CO, aerosols); beginning of trajectory analysis • ABLE led to more observational studies on urban aerosols and urban chemistry in the late 90s
At the end of the 80s (CONVEX computer at USP - vector, 2 processors) • Falling behind in modeling activities… • Decision: use a complete model and work on modules: • MM5, the JMA mesoscale model, … RAMS? • 1989: Bill Cotton visits FUNCEME, CPTEC and USP • Decision: implement RAMS • Strong connection with observational work: model validation (ABRACOS, LBA) and later a strong connection with urban air quality issues • 1995 - beginning of regional forecasting at 40 km resolution (CPTEC ran Eta at 80 km at the time); an IBM SP2 with 16 processors in 1997 boosted operational capability - CPTEC seasonal climate downscaling in 1999
Modeling with RAMS: hurricanes, local circulations in São Paulo, the São Francisco Valley, NE Brazil, instability lines in the Amazon, intense cyclones, land-use change, impact of pollution sources, convective parameterizations, vegetation (SiB)… • Parallelism: end of the 90s -> FINEP project (hardware - PC cluster) - CPTEC role • Large number of students -> use of RAMS spread to several universities in Brazil (UFRJ, UFPb, FURGS, UFPA, …) • FUNCEME begins operational use of RAMS for climate downscaling and weather forecasting • SIMEPAR - surface data assimilation (FINEP)
FINEP - BRAMS (Brazilian Developments in RAMS) • Version 3.0 • Based on RAMS 5.04 - ASTER • Maintained by CPTEC • New functionalities: • Shallow cumulus • Deep cumulus - "Grell ensemble" • Soil moisture initialization • SiB2 in addition to LEAF • Surface data assimilation with data quality control • CATT - biomass-burning emission and transport module (plus urban sources)
BRAMSNET • Network of BRAMS users and developers • Initial partners: • UFCG • UFRJ • CPTEC • USP • SOMAR • ITAUTEC • FURG • GBRAMS GRID - UFRGS, CPTEC, IAG - climate applications
New implementations: • Precipitation assimilation - required by some users (SIVAM) - better short-range forecasts • Urban energy balance and transfer - TEB: required to improve model validation against surface observations • Calibration of the "Grell ensemble" with precipitation data • New options for radiative processes in the presence of gases and aerosols - space and time variation - CARMA • Cloud/radiation interaction - shortwave (parameterized shallow clouds) - need to improve the validation metric based on fit to surface radiation measurements • New options for dry turbulence (need to improve the Td diurnal cycle) • New data assimilation module - based on PSAS/CPTEC • Simplified photochemistry - 2004 (product of a research project with CETESB) • Full photochemistry module (CPTEC - see other presentation) • Coupling with dynamical vegetation GEMTM - in the future, IBIS • Coupling with an ocean model (POM) and more recently with a mixed-layer model • Coupling with surface hydrology - São Francisco, Rio Grande, Uruguay - Pantanal
Code robustness (Jairo Panetta and team): • Code originally developed by researchers • Fundamental rules of software engineering • Parallel efficiency • Challenge: efficiency on vector computers; massive parallelism - shared and distributed memory… • The BRAMS community is growing
NEC SX-6 with 12 nodes (96 processors) at CPTEC • Challenge: efficient use of High Performance Computing (HPC) • New generation of HPC machines: • architecture • massively parallel and vector • Visualization of large data sets (3D animation) • Assistance to "poor mortal" users… • Larger clusters: 1100 processors at CPTEC
Towards Production Code: Effective Portability among Vector Machines and Microprocessor-Based Architectures - Alvaro Luiz Fazenda, Eduardo Hidenori Enari, Luiz Flavio Rodrigues, Jairo Panetta (INPE/CPTEC)
[Figure: initial vs. final code performance - contribution of Jairo Panetta, 2007]
What is behind the success of BRAMS? • Link to observational work! • e.g., LBA, air pollution programs, micromet tower program • Operational use for regional forecasting
[Figure: measurements with the INPE Bandeirantes aircraft, 11-13 August 1999 - CO, O3, and CCN over the MRSP]
Project funded by FAPESP, coordinated by Dr. H. Rocha (IAG/USP). Participants: Inst. Agronômico de Campinas and UNICAMP
[Figure: Rocha 2001 - four numbered panels]
Modelling the Earth System: atmosphere; ocean + hydrology + ice; soil; vegetation; gases, aerosols
RAMS/BRAMS is not flux-conservative due to:
1. Boussinesq approximation
2. Advection operator
3. Failure to average divergence over small time-split steps
4. Grid nesting applied to primary variables rather than to fluxes
This is not a major problem if lateral boundaries are open. Several users require global domains. The Ocean-Land-Atmosphere Model (OLAM): for a global domain, full conservation is required, so the governing equations are re-cast in conservation-law form (a sketch of the two forms follows).
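To make the distinction concrete, here is a minimal sketch (my notation, not taken from the RAMS/OLAM documentation) of the two forms of a scalar transport equation. Only the flux form, once integrated over a grid cell, conserves the total amount of ρφ up to boundary fluxes:

```latex
% Advective (non-conservative) form, as used in RAMS/BRAMS:
\[
\frac{\partial \phi}{\partial t} + \mathbf{u}\cdot\nabla\phi = S
\]
% Flux (conservation-law) form, as adopted in OLAM:
\[
\frac{\partial (\rho\phi)}{\partial t} + \nabla\cdot(\rho\,\mathbf{u}\,\phi) = \rho S
\]
```

The two are equivalent for continuous fields (they differ by φ times the continuity equation), but discretizations of the advective form do not, in general, conserve ρφ.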
The Ocean-Land-Atmosphere Model (OLAM): a re-formulation of RAMS for global modeling. Based on a presentation by Robert Walko (Duke University), BRAMS Workshop, May 2006, CPTEC
OLAM equations: • Momentum conservation • Mass conservation • Energy conservation • Scalar mass conservation • Equation of state
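The equation images did not survive extraction. As a hedged reconstruction (generic symbols, not necessarily OLAM's exact variables: ρ density, u velocity, θ an energy variable such as ice-liquid potential temperature, s a scalar mixing ratio), the conservation-law set has roughly this shape:

```latex
\begin{align*}
\frac{\partial (\rho\mathbf{u})}{\partial t} + \nabla\cdot(\rho\,\mathbf{u}\otimes\mathbf{u})
  &= -\nabla p + \rho\,\mathbf{g} + \mathbf{F} && \text{momentum}\\
\frac{\partial \rho}{\partial t} + \nabla\cdot(\rho\,\mathbf{u}) &= 0 && \text{mass}\\
\frac{\partial (\rho\theta)}{\partial t} + \nabla\cdot(\rho\,\mathbf{u}\,\theta) &= Q && \text{energy}\\
\frac{\partial (\rho s)}{\partial t} + \nabla\cdot(\rho\,\mathbf{u}\,s) &= S && \text{scalar mass}\\
p &= \rho R_d T && \text{equation of state}
\end{align*}
```

Every prognostic variable is a density (amount per unit volume), which is what makes the finite-volume conservation on the next slide possible.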
Apply the Gauss divergence theorem and integrate over finite volumes to obtain the discretized equations. [The equations on this slide were images and are not recoverable; a sketch of the step follows.]
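A minimal sketch of the step implied here, in my notation: integrating a flux-form equation over a cell of volume V and applying the divergence theorem replaces the volume integral of the divergence with a sum of fluxes through the cell faces:

```latex
\[
\frac{d}{dt}\int_V \rho\phi\, dV = -\oint_{\partial V} \rho\phi\,(\mathbf{u}\cdot\hat{\mathbf{n}})\, dA
\;\;\Longrightarrow\;\;
\frac{(\overline{\rho\phi})^{\,n+1} - (\overline{\rho\phi})^{\,n}}{\Delta t}\,V
= -\sum_{f\,\in\,\text{faces}} (\rho\phi\,u_\perp)_f\, A_f
\]
```

Because the flux through a face is subtracted from one cell and added to its neighbor, the discrete scheme conserves ρφ to machine precision.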
Discretized equations are applied on a Cartesian grid with origin at the Earth's center. Grid cell surfaces are not aligned with the (x, y, z) coordinates, but each grid cell surface is parallel or perpendicular to local gravity (see the sketch below).
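As an illustration of that geometry (a spherical-Earth sketch, not OLAM code; names and constants are my own), the local vertical at any grid point is the outward radial direction from the Earth-centered origin, which in general does not coincide with the global z-axis:

```python
import numpy as np

EARTH_RADIUS = 6.371e6  # mean Earth radius in meters (spherical approximation)

def to_earth_centered_cartesian(lat_deg, lon_deg, height_m=0.0):
    """Map (lat, lon, height) to Cartesian (x, y, z) with origin at the
    Earth's center, assuming a spherical Earth; z points to the North Pole."""
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    r = EARTH_RADIUS + height_m
    return np.array([r * np.cos(lat) * np.cos(lon),
                     r * np.cos(lat) * np.sin(lon),
                     r * np.sin(lat)])

def local_up(point_xyz):
    """Unit vector opposite to local gravity: radially outward for a sphere.
    Horizontal cell faces are normal to this direction."""
    return point_xyz / np.linalg.norm(point_xyz)

# Example: away from the poles the local vertical differs from the global z-axis.
p = to_earth_centered_cartesian(lat_deg=-23.5, lon_deg=-46.6)  # near São Paulo
print(local_up(p))  # not (0, 0, 1): cell faces tilt relative to global (x, y, z)
```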
Other features… • Coded using F90 modules and data structures like RAMS 6.0 • Build/compilation procedure same as RAMS 6.0 • 'OLAMIN' namelist file in same form as RAMSIN (a hypothetical fragment follows this list) • Refined-mesh areas specified by location, size, and shape in OLAMIN • Global spherical, limited-area spherical, and limited-area Cartesian geometry options • Vertical K at W levels: most natural for evaluation and application • Implicit vertical diffusion solves for fluxes • Implicit surface momentum flux • Graphics done from the model itself: no separate REVU • 'PLOTONLY' run can loop through multiple files • Orthographic, lat/lon, and polar stereographic plot projection options • Vertical cross-section plots in any direction through a field
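For readers unfamiliar with the RAMSIN/OLAMIN format: it is a Fortran namelist file. Below is a hypothetical fragment, with group and variable names chosen for illustration only (not the actual OLAM schema), parsed here with the third-party f90nml package:

```python
import f90nml  # pip install f90nml -- a Fortran namelist parser

# Hypothetical OLAMIN fragment: the group and variable names are
# illustrative only, NOT the real OLAM namelist schema.
olamin_text = """
&model_grids
   runtype = 'INITIAL'
   nzp     = 50
   timmax  = 86400.
   mdomain = 0        ! 0 = global spherical geometry (hypothetical flag)
/
"""

with open('OLAMIN', 'w') as f:
    f.write(olamin_text)

nml = f90nml.read('OLAMIN')       # same &group ... / layout as a RAMSIN file
print(nml['model_grids']['nzp'])  # -> 50
```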
Where are we going: • Avoid the use of different models for each spatial scale • Multiscale modeling • Numerical challenge: efficiency/precision • Example: global grid structure in OLAM - successor of RAMS/BRAMS
[Figure: OLAM accumulated rainfall (kg/m^2) compared with a GOES satellite image, 2 January 1999, 17:45 UTC]
OLAM - in experimental operation at CPTEC • Ocean model: HYCOM (http://oceanmodeling.rsmas.miami.edu/hycom/) • Challenges: numerical efficiency • dynamical core - implicit schemes • Sharing physics with BRAMS • Transfer of BRAMS functionalities
Conclusions: • Definition of functionalities: users (research/operational) • No matter what model one chooses, it is critical to have close ties with experimental work • Friendly interface for users • Operational use (optimization function: computational cost, precision, evaluation against observations) • Parallel efficiency is critical! • Model validation needs well-defined metrics, appropriate for the scales • Team work - need to understand users' needs • Persistence! • … and resources.