Nesting Ocean Model for parallel vector processors
Miyazawa, Y. (IGCR/FRSGC) et al.
• Prediction and estimation of climate-variation effects on the coastal ocean: abnormal sea level, Kyucho
• Prediction of maritime disasters: oil spill
• High-resolution coastal ocean model nested in a basin-scale ocean model
Contents
1. Nesting model for Kuroshio simulation
2. The Earth Simulator
3. Modification of the model code for the Earth Simulator
Nesting POM driven by the NCAR daily data
• Low-resolution model: 1/4 deg. (30 km)
• High-resolution model: 1/12 deg. (10 km)
Nesting grid
• One-way nesting
• Bilinear interpolation
• 3x3 fine grids per coarse grid cell (refinement ratio 3)
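The coarse-grid fields are passed to the nest boundary by bilinear interpolation; with a refinement ratio of 3 (1/4 deg. to 1/12 deg.), each coarse cell supplies a 3x3 block of fine points. A minimal Python sketch of the interpolation (the function name and toy field are illustrative, not from the POM source):

```python
def bilinear(f, x, y):
    """Bilinearly interpolate field f (indexed f[j][i]) at fractional
    coarse-grid coordinates (x, y)."""
    i, j = int(x), int(y)
    dx, dy = x - i, y - j
    return ((1 - dx) * (1 - dy) * f[j][i]
            + dx * (1 - dy) * f[j][i + 1]
            + (1 - dx) * dy * f[j + 1][i]
            + dx * dy * f[j + 1][i + 1])

# With a 3:1 refinement ratio, fine point (i_f, j_f) maps to coarse
# coordinates (i_f / 3, j_f / 3).
coarse = [[0.0, 3.0], [3.0, 6.0]]          # toy 2x2 coarse field
fine_value = bilinear(coarse, 1 / 3, 1 / 3)  # one of the 3x3 fine points
```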
Prediction of the Kuroshio path (initialized by data assimilation of TOPEX/Poseidon altimetry in January 1993)
• Observation (March-April 1993) vs. simulation
Nesting of another coastal model (by a TIT group)
• Part of the Kuroshio intrudes into Tokyo Bay
• 1/48 deg. (2.5 km) grid
Earth Simulator
• Developed by JAERI/JAMSTEC/NASDA
• 1 node: 8 vector processors (8 x 8 Gflops) sharing 16 GB of main memory
• 640 nodes connected by a single-stage crossbar network
Speed-up of the POM code
• Vectorization: 99.7 %
• Parallelization by MPI (SPMD)
• Domain decomposition in the latitudinal direction
• MPI initialization and communication subroutines inserted into the POM
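Latitudinal domain decomposition splits the rows of the grid among MPI ranks. A hedged sketch of how each rank could compute its row range, assuming an even split with remainder rows given to the lowest ranks (the actual POM modification may partition differently):

```python
def local_rows(njm, nprocs, rank):
    """Return (j_start, j_end), 0-based and end-exclusive, for the
    latitude rows owned by `rank` when njm rows are split over nprocs."""
    base, rem = divmod(njm, nprocs)
    start = rank * base + min(rank, rem)
    end = start + base + (1 if rank < rem else 0)
    return start, end

# Low-resolution grid: 282 latitude rows split over, say, 8 ranks.
ranges = [local_rows(282, 8, r) for r in range(8)]
```

Each rank would then time-step only its own rows, exchanging halo rows with its north and south neighbours via MPI.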
Efficiency of the code (measured on NEC SX-5)
• Low-resolution model: 543 x 282 x 21 grid
• High-resolution model: 758 x 530 x 35 grid
Nesting by MPI
• Low-resolution model: mkbnd builds the nest-boundary buffer; send_nestbuf sends it over the network via MPI
• High-resolution model: recv_nestbuf receives the buffer via MPI; set_openbnd applies it at the open boundary
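The one-way coupling step can be sketched as follows. The routine names (mkbnd, send_nestbuf, recv_nestbuf, set_openbnd) come from the slide, but their internals here are hypothetical; a queue stands in for the MPI messages so the sketch runs in a single process:

```python
from queue import Queue

network = Queue()  # stands in for MPI point-to-point messages

def mkbnd(coarse_field):
    """Low-res side: extract the values needed on the nest boundary
    (here simply the first row, as a toy stand-in)."""
    return list(coarse_field[0])

def send_nestbuf(buf):
    network.put(buf)       # MPI_Send in the real code

def recv_nestbuf():
    return network.get()   # MPI_Recv in the real code

def set_openbnd(fine_field, buf):
    """High-res side: impose received values on the open boundary."""
    fine_field[0] = buf
    return fine_field

# one coupling step
coarse = [[1.0, 2.0], [3.0, 4.0]]
fine = [[0.0, 0.0], [0.0, 0.0]]
send_nestbuf(mkbnd(coarse))                # low-resolution model
fine = set_openbnd(fine, recv_nestbuf())   # high-resolution model
```

In the real two-executable setup the send and receive would run in separate MPI processes, which is where the asynchronous (non-blocking) communication listed under remaining problems would help overlap the exchange with computation.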
Remaining problems
• More tests of the nesting ocean model
• Asynchronous MPI communication
• Development of higher-resolution ocean models for the Earth Simulator
Another POM code for the Earth Simulator: ES-OGCM (by T. Kagimoto)
• Global model
• Coupled atmosphere/ice/ocean model
• Fortran 90/95
• Parallelization by MPI with asynchronous communication