IDC HPC USER FORUM
Weather & Climate Panel
September 2009, Broomfield, CO
• Panel questions: 1 response per question
• Limit length to 1 slide
Panel Format
• <insert panel format here>
• Sequence – alphabetical
• A few bullet points for each question; each participant can address or supplement them
• After each panel member has finished, we move on to the next question
• Moderators may adjust depending on discussion and time constraints
Panel Members
• Moderators: Steve Finn & Sharan Kalwani
• <insert panel participants & affiliation here>
Q1. Relative importance of data / resolution / micro-physics
• Please quantify the relative importance of improvements in observational data, grid resolution, and cloud micro-physics for future forecast accuracy.
• For prediction
  • Observations and understanding of observations
  • Data assimilation
  • Ensembles
• Physics
  • Scale-appropriate physics
  • Sensitivities
  • Superparameterizations
• Resolution
  • Explicitly resolve scales
  • Convergence studies, which feed back to prediction
Q2. Adaptive mesh or embedded grids: their impact
• Please discuss the use of adaptive mesh or embedded grids (urban areas, terrain impact on advection, ...), and how future increased use would affect system requirements such as interconnects.
• Nesting
  • Domains interact sequentially
  • Scatter/gather 3D fields between domains (see the sketch below)
• Spatial refinement
  • In place, adding cells
  • Temporal refinement (future)
  • Adaptivity (future)
• Coupling
  • Load balancing, bandwidth
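A minimal Python/NumPy sketch of the scatter/gather exchange named above, assuming an illustrative 3:1 parent-to-nest refinement ratio (common in WRF-style nesting). The function names, grid sizes, and piecewise-constant interpolation are invented for the example and stand in for a real model's interpolation and feedback operators; fields are 2D here for brevity, though the slide's fields are 3D.

```python
# Sketch (not model code) of a 2-way nesting exchange: the parent "scatters"
# a sub-region to a finer nest, the nest runs, then results are "gathered"
# (block-averaged) back onto the parent grid.
import numpy as np

RATIO = 3                                # parent-to-nest refinement ratio

def scatter_to_nest(parent, i0, j0, ni, nj):
    """Interpolate a parent sub-region onto the finer nest grid
    (piecewise-constant for brevity; real models interpolate smoothly)."""
    sub = parent[i0:i0 + ni, j0:j0 + nj]
    return np.kron(sub, np.ones((RATIO, RATIO)))

def gather_from_nest(parent, nest, i0, j0):
    """Feed the nest solution back by block-averaging onto parent cells."""
    ni, nj = nest.shape[0] // RATIO, nest.shape[1] // RATIO
    blocks = nest.reshape(ni, RATIO, nj, RATIO).mean(axis=(1, 3))
    parent[i0:i0 + ni, j0:j0 + nj] = blocks
    return parent

parent = np.random.rand(100, 100)        # coarse-domain field
nest = scatter_to_nest(parent, 40, 40, 20, 20)
# ... advance the nest here with its own, shorter timestep ...
parent = gather_from_nest(parent, nest, 40, 40)
```

The sequential interact/scatter/gather pattern is what drives the interconnect requirement on the slide: every exchange moves whole 3D sub-volumes between the processors owning the parent and nest domains.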
Q3. Ratio of data to compute: background
• What are your bytes per flop for future requirements?
• Assuming the question means "bytes of main memory per sustained flop/s" (D. H. Bailey)
• Current – lots of headroom
  • ~2000 ops per cell per second, ~800 bytes (4-byte floats) per cell = 0.4 bytes per op
• Future
  • Resolution follows the 3/4 rule (2/3 in practice)
  • Adding physics or chemistry should not upset this ratio
• This is a relatively *low* ratio compared to some other benchmarks (Geerd Hoffmann of DWD cited 4 bytes/flop at the Oct 2007 HPC User Forum in Stuttgart)
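A back-of-envelope check of the figures above. The per-cell numbers are the slide's own estimates; the reading of the 3/4 rule as compute ~ r^4 versus memory ~ r^3 under a CFL-limited timestep is a common interpretation, stated here as an assumption.

```python
# Bytes-per-op arithmetic from the slide's own estimates.
bytes_per_cell = 800        # ~200 four-byte floats of state per cell
ops_per_cell_s = 2000       # sustained ops per cell per second
print(bytes_per_cell / ops_per_cell_s)   # -> 0.4 bytes per op

# 3/4 rule (assumed reading): refining resolution by factor r grows memory
# ~ r**3 (more cells) but compute ~ r**4 (more cells * shorter timestep),
# so the required bytes-per-flop ratio *drops* as resolution increases.
r = 2.0
print((r**3) / (r**4))      # -> 0.5: ratio halves when resolution doubles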
Q4. Open source codes in the community
• What is the importance and impact of open source / community code applications such as CCSM, WRF, ...?
• A common modeling tool fosters interaction, outreach, and ultimately advancement of the science
• Such codes also serve as relevant HPC application benchmarks
Q5. Data and collaboration, formats, future needs
• What is the level of collaboration and standardization in data management and in observational and results databases, such as use of common file formats, web-based data, etc.? What is needed in the future?
• Scientific and technical interoperability for multi-model simulation systems
• Metadata formalisms, conventions, infrastructure (a minimal sketch follows below):
  • Earth System Curator (www.earthsystemcurator.org)
  • Earth System Grid (www.earthsystemgrid.org)
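To make the "common file formats and metadata conventions" point concrete, here is a minimal sketch of writing a self-describing NetCDF file with CF-convention attributes via the netCDF4 Python package. The file name, variable, and values are invented for illustration; the CF attribute names are the real convention.

```python
# A small self-describing NetCDF file: the CF metadata lets any compliant
# tool interpret the variable without out-of-band documentation.
from netCDF4 import Dataset

ds = Dataset("example_t2m.nc", "w")
ds.Conventions = "CF-1.6"                    # declares the metadata convention
ds.title = "2 m temperature, single point (illustrative)"

ds.createDimension("time", None)             # unlimited record dimension
t = ds.createVariable("time", "f8", ("time",))
t.units = "hours since 2009-09-01 00:00:00"  # CF time encoding
t.calendar = "standard"

t2m = ds.createVariable("t2m", "f4", ("time",))
t2m.standard_name = "air_temperature"        # CF standard name
t2m.units = "K"

t[:] = [0.0, 1.0, 2.0]
t2m[:] = [288.1, 288.4, 287.9]
ds.close()
```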
Q6. Ensemble models: your experiences
• Has the use of ensemble models had any positive or negative impact on reducing code scaling requirements?
• Ensembles have a positive effect on scaling because they are trivially scalable: each member is an independent run (see the sketch below)
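A toy sketch of the "trivially scalable" point: ensemble members are independent jobs with no inter-member communication, so throughput grows linearly with workers. The stub model, run_member, and its parameters are invented for illustration.

```python
# Members run concurrently with zero communication between them; only the
# final mean/spread reduction touches all members.
from multiprocessing import Pool
import random

def run_member(seed):
    """Stand-in for one forecast run with perturbed initial conditions."""
    rng = random.Random(seed)
    state = 288.0 + rng.gauss(0.0, 0.5)      # perturbed initial temperature
    for _ in range(100):                     # toy "model integration"
        state += rng.gauss(0.0, 0.1)
    return state

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        members = pool.map(run_member, range(16))   # 16 independent members
    mean = sum(members) / len(members)
    spread = (sum((m - mean) ** 2 for m in members) / len(members)) ** 0.5
    print(f"ensemble mean {mean:.2f} K, spread {spread:.2f} K")
```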
Q7. Obligatory question (no pun intended!): cloud computing, your views (unfiltered)
• What is your current / future interest in grid or cloud computing?
• Computational grids are not feasible for tightly coupled parallel applications
• Reproducibility across platforms is also an issue (illustrated below)
• Data and observing grids are useful
• WRF is used in LEAD (portal.leadproject.org)
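One concrete illustration of the reproducibility point, assuming it refers (at least in part) to bitwise reproducibility of floating-point reductions: addition is not associative, so a sum reduced in a different order, as happens across platforms or domain decompositions, can give a different answer. The values below are arbitrary.

```python
# Floating-point addition is not associative: reordering a reduction
# changes the rounded result, so heterogeneous grid/cloud resources
# cannot guarantee bitwise-identical sums.
vals = [1e16, 1.0, -1e16, 1.0]

left_to_right = sum(vals)        # -> 1.0 on IEEE-754 doubles
reordered = sum(sorted(vals))    # -> 0.0: same values, different order
print(left_to_right, reordered)
```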