IDC HPC USER FORUM Weather & Climate PANEL • September 2009 • Broomfield, CO • Panel questions: 1 response per question • Limit length to 1 slide
Panel Format • <insert panel format here> • Sequence – Alphabetical • A few bullet points for each question; each participant can address or supplement them • After each panel member has finished, we move on to the next question • Moderators can adjust depending on discussion and time constraints
Panel Members • <insert panel moderators here> • Steve Finn & Sharan Kalwani • <insert panel participants & affiliation here>
Q1. Relative Importance of data/resolution/micro-physics! • To a certain degree, data assimilation, resolution and physics are all important (non-linear system of equations) • Metric dependent: quantitative precipitation skill, low-level winds, clouds, forecast length (nowcast vs. climate) • For typical Navy metrics (winds, visibility, waves, clouds): • Data assimilation is essential (accurate synoptic and mesoscale initial state, spin-up of physics) • Physics: boundary layer, surface layer, cloud/convection • Resolution: sufficient to capture key geographic features, and high enough to avoid bulk parameterizations – convection (Δx ~ 2–4 km), turbulence (Δx ~ 20–200 m); see the cost-scaling sketch below • Predictability: tradeoffs between ensembles & Δx
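As a rough illustration of the resolution bullet above, the sketch below (a back-of-the-envelope assumption, not a figure from the panel) estimates how compute cost grows as the horizontal grid spacing Δx shrinks from synoptic to convection-permitting to turbulence-resolving scales, assuming cost scales with the two horizontal dimensions plus the CFL-limited time step.

```python
# Back-of-the-envelope cost scaling with horizontal resolution.
# Assumes cost ~ (1/dx)^2 for the horizontal grid times (1/dx) for the
# CFL-limited time step, with vertical levels held fixed.
# The exponents and grid spacings are illustrative assumptions only.

def relative_cost(dx_km, reference_dx_km=10.0):
    """Cost relative to a run at reference_dx_km grid spacing."""
    refinement = reference_dx_km / dx_km
    return refinement ** 3          # 2 horizontal dims + 1 time-step factor

for dx in (10.0, 4.0, 2.0, 0.2):    # synoptic -> convection-permitting -> LES-like
    print(f"dx = {dx:5.1f} km  ->  ~{relative_cost(dx):9.0f}x the cost of a 10 km run")
```

On those assumptions, halving Δx costs roughly an order of magnitude more compute, which is the crux of the ensembles-versus-Δx tradeoff in the last bullet.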
Q2. Adaptive Mesh or Embedded Grids: their impact… • Please discuss the use of adaptive mesh or embedded grids (urban areas, terrain impact on advection, …) and how future increased use of this would impact system requirements such as system interconnects. • Adaptive meshes are challenging • Time-step and run-time (for operations) issues • Physical parameterizations (resolution dependence) • Mesh refinement needs to consider complex multi-scale interactions (difficulty in determining high-resolution areas) • Nested grids currently used in Navy mesoscale model • Moving meshes to follow features (hurricanes), ships • Impact on system requirements (interconnects) • Load balance may be an issue (decomposition) • Cores as a function of grid points (communication); see the halo-exchange sketch below
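To illustrate the last bullet (communication growing relative to computation as cores increase for a fixed grid), here is a minimal sketch of the halo-to-interior ratio for a 2-D domain decomposition; the grid size, halo width, and near-square process layout are assumptions for the example, not panel figures.

```python
# Illustrative halo-exchange overhead for a 2-D domain decomposition.
# Grid size, halo width, and the near-square process layout are assumptions.
import math

def halo_fraction(nx, ny, cores, halo=2):
    """Fraction of a subdomain's points that must be exchanged as halos."""
    px = int(math.sqrt(cores))      # assume a near-square process grid
    py = cores // px
    sub_nx, sub_ny = nx // px, ny // py
    interior = sub_nx * sub_ny
    halo_points = 2 * halo * (sub_nx + sub_ny)   # strips on all four sides
    return halo_points / interior

nx = ny = 1000                                   # assumed horizontal grid
for cores in (16, 256, 4096):
    print(f"{cores:5d} cores: halo/interior ratio ~ {halo_fraction(nx, ny, cores):.2f}")
```

As the subdomains shrink, the halo points become a sizeable fraction of the work, which is why interconnect performance and decomposition strategy matter more at high core counts.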
Q3. Ratio of Data to Compute: Background… • Problem dimension: nNest · nx · ny · nz · nVariables · nTimeLevels · Precision • Today: 5 × 100 × 100 × 50 × 50 × 3 × 4 bytes • Future: 5 × 100 × 100 × 100 × 100 × 3 × 8 bytes (see the sketch below) • What are your Bytes per Flop for future requirements? • This is a relatively *low* ratio compared to some other benchmarks (Geerd Hoffmann of DWD cited 4 Bytes/Flop at the Oct 2007 HPC User Forum in Stuttgart)
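A minimal sketch that multiplies out the problem dimensions quoted above to show the implied model-state footprint for the "Today" and "Future" cases; the mapping of each number to a dimension simply follows the order on the slide and is an assumption.

```python
# Multiply out the problem dimensions quoted on the slide.
# Mapping of numbers to dimensions follows the slide's order
# (nNest, nx, ny, nz, nVariables, nTimeLevels, precision in bytes) -- an assumption.
from math import prod

today  = dict(nNest=5, nx=100, ny=100, nz=50,  nVars=50,  nTimeLevels=3, bytes_per_value=4)
future = dict(nNest=5, nx=100, ny=100, nz=100, nVars=100, nTimeLevels=3, bytes_per_value=8)

for label, dims in (("Today", today), ("Future", future)):
    size_bytes = prod(dims.values())
    print(f"{label:6s}: {size_bytes / 2**30:6.2f} GiB of model state")
```

On those assumed numbers the in-memory state grows from roughly 1.4 GiB today to about 11 GiB in the future case, which is the data side of the Bytes-per-Flop question.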
Q4. Open Source codes in the community… • What is the importance and impact of open source / community code applications such as CCSM, WRF, …? • Information assurance issues for DoD • Open source may be problematic for operations • Open source code can be useful to the Navy: • Physics (from other institutions, agencies) • Framework (Earth System Modeling Framework) • Post-processing, graphics, etc. • COAMPS code (> 350 users) • Fosters collaboration and leverages expertise (within and beyond CWO) among agencies and universities
Q5. Data and collaboration, formats, future needs… • What is the level of collaboration and standardization of data management and of observational & results databases (e.g., use of common file formats, web-based data)? What is needed in the future? • Common file standards for exchange among agencies (GRIB2 for operations, some NetCDF for research); see the NetCDF sketch below • Static databases (land characteristics, etc.) are commonly shared, but often not standardized • Standardized observational databases (a common format with other agencies is being considered) • Future: • Much larger databases • Larger need for standardized output (and input) for community shared projects (TIGGE, HFIP, etc.)
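As a small, hedged illustration of the NetCDF part of the file-format point above, the sketch below writes and reads a tiny file with the netCDF4 Python package; the variable name, dimension sizes, and values are invented for the example.

```python
# Minimal NetCDF write/read example using the netCDF4 package.
# Variable name, dimension sizes, and values are invented for illustration.
import numpy as np
from netCDF4 import Dataset

with Dataset("example.nc", "w") as ds:
    ds.createDimension("lat", 73)
    ds.createDimension("lon", 144)
    t2m = ds.createVariable("t2m", "f4", ("lat", "lon"))
    t2m.units = "K"
    t2m[:] = 288.0 + np.random.randn(73, 144)    # placeholder temperature field

with Dataset("example.nc") as ds:
    var = ds.variables["t2m"]
    print(var.units, float(var[:].mean()))
```

Self-describing formats like this (or GRIB2 on the operational side) are what make the cross-agency exchange and community projects in the bullets above practical.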
Q6. Ensemble model: your experiences… • Has the use of ensemble models had any positive or negative impact on reducing code scaling requirements? • Limiting factor is how well the deterministic model scales • Ensembles are embarrassingly parallel and should perform well on large multi-core clusters (see the sketch below) • Need efficient I/O to exchange state information between the model output and post-processing (DA) • Ensemble approaches present some challenges for post-processing (archival) and file management
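To make the "embarrassingly parallel" point concrete, here is a minimal sketch that dispatches independent ensemble members concurrently; run_member and the 20-member configuration are hypothetical stand-ins for launching an actual model executable.

```python
# Hypothetical sketch: ensemble members are independent, so they can be
# dispatched concurrently with no inter-member communication.
from concurrent.futures import ProcessPoolExecutor

def run_member(member_id, perturbation_seed):
    """Stand-in for launching one forecast member (e.g. a batch submission)."""
    # In practice this would run the model with perturbed initial conditions.
    return f"member {member_id:02d} finished (seed {perturbation_seed})"

if __name__ == "__main__":
    member_ids = range(20)                        # assumed 20-member ensemble
    with ProcessPoolExecutor() as pool:
        for result in pool.map(run_member, member_ids, member_ids):
            print(result)
```

Because the members never talk to each other, throughput is limited by per-member scaling and by the I/O and file-management issues noted in the last two bullets, not by inter-member communication.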
Q7. Obligatory Question (no pun intended!): Cloud computing: your views (unfiltered)… • What is your current / future interest in grid or cloud computing? • Grid / cloud computing may potentially work well for ensembles, although there are obvious challenges (I/O) • Domain decomposition across the grid could present big challenges • Models require huge input datasets and produce large output datasets (persistent data storage) • Model paradigm may have to be revisited (communication and latency between nodes might not be consistent) • Information assurance could be an issue (particularly for DoD operations)