
TeraGrid: ASTA Program Status June 2006


Presentation Transcript


  1. TeraGrid: ASTA Program Status June 2006 • Sites: PSC, PU, UC/ANL, NCAR (mid-2006), IU, NCSA, ORNL, TACC, SDSC

  2. Liquid Rocket Combustion Instability Analysis with Coaxial Injector Modeling • PI Heister (PU), TG Embed Kim (NCSA) • Completed 03/31/06 • Enabled shear/swirl coaxial injector modeling; designed a TeraGrid portal framework. • Used DTF (NCSA+SDSC) • The TeraGrid will allow the model's user community to launch realistic what-if simulations on many systems, including migration of jobs from one system to another.

  3. CMS on the TeraGrid • PI Newman (Caltech), TG Embed Milfeld (TACC) • Ongoing to 9/30/06 • Completing the Physics Technical Design Report for the CMS experiment at the LHC • Complex workflow consisting of multiple execution stages running thousands of serial jobs via GridShell on DTF and Lonestar, with very large datasets stored on SDSC HPSS and staged to execution sites prior to job runs. • Winner, CLADE 2006 Best Paper

  4. ENZO: Cosmic Simulator • PI Norman (UCSD), TG Embed Harkness (SDSC) • Completed 3/31/06 • Largest-ever simulations of metal enrichment and turbulent shock fronts in intergalactic matter. Pioneering AMR run sets the stage for design verification of the Large Synoptic Survey Telescope. • Generated approx. 100 TB on DTF, DataStar, Cobalt, and TCS; transferred to the SDSC digital library via GridFTP, DMOVER, and GPFS-WAN.

  5. Simulated Pore Interactive Computing Experiment • PI Coveney (UCL), TG Lead O'Neal (PSC) • Completed 3/31/06 • WINNER, SC05 HPC ANALYTICS CHALLENGE • Understanding the translocation of biomolecules across protein pores via interactive molecular dynamics simulations, using haptic devices to estimate force values and determine suitable constraints to place. • Checkpointing and cloning of simulations across TG systems: TCS, DTF, Lonestar. • Real-time visualization at UC/ANL

  6. Web-Based Distributed Computing Environment for Nanotechnology • PI Jakobsson (UIUC), TG Embed Parker (NCSA) • Completed 3/31/06 • A simulation portal and data-archiving service built on top of Grid services • Used DTF, Tungsten [Figure: small-molecule channel]

  7. Fuel From Plants -- NREL • PI Brady (Cornell), TG Embed Chukkapali (SDSC) • Completed 3/31/06 • Enhanced parallel CHARMM to simulate >800K atoms • Used IBM P655, P690, and DTF to make significant progress in understanding the enzymatic breakdown of cellulose.

  8. Cross-Site Runs and Computational Steering on the TeraGrid • PI Karniadakis (Brown), TG Embed O'Neal (PSC) • Completed 3/31/06 • Enables simulations of the full human arterial tree via a novel latency-hiding algorithm. • Detailed 3D simulations at arterial bifurcations conducted across DTF, TCS, and Lonestar using MPICH-G2, with real-time visualization at UC/ANL.

  9. CyberShake • PI Olsen (SDSU) / David Okaya (USC), TG Embeds Cui (SDSC), Reddy (GIG/PSC) • Continuing until 3/31/07 • Largest and most detailed earthquake simulation of the southern San Andreas fault: a large-scale simulation of magnitude-7.7 seismic wave propagation, generating more than 47 TB of output • First calculation of physics-based probabilistic hazard curves for Southern California using full waveform modeling rather than traditional attenuation relationships • So far used DTF and DataStar; WAN-GPFS, GridFTP, Condor-G, and RLS • Workflow tools enable work at a scale previously unattainable by automating the very large number of programs and files that must be managed [Figure: Major earthquakes on the San Andreas Fault, 1680-present: 1680 M 7.7, 1857 M 7.8, 1906 M 7.8]

  10. BIRN and Telescience • PI Ellisman (UCSD), TG Embed Majumdar (SDSC) • Ongoing to 9/30/06 • Parallel code for image reconstruction algorithms using DTF and DataStar: Expectation/Maximization (EM) Segmentation (Brigham and Women's Hospital); Transform-Based Backprojection for Volume Reconstruction (TxBR) (University of California, San Diego); Large Deformation Diffeomorphic Metric Mapping (LDDMM) (Johns Hopkins University)

  11. VORTONICS • PI Boghosian (Tufts), TG Embeds Guiang (TACC), Insley (UC/ANL), O'Neal (GIG/PSC) • Completed 3/31/06 • Physical challenges: reconnection and dynamos. Vortical reconnection governs the establishment of steady state in Navier-Stokes turbulence; magnetic reconnection governs heating of the solar corona; the astrophysical dynamo problem. Exact mechanisms and space/time scales are unknown and represent important theoretical challenges. • Computational challenges: enormous problem sizes, memory requirements, and long run times; requires relaxation on a space-time lattice of 5-15 terabytes; requires geographically distributed domain decomposition (GD3): DTF, TCS, Lonestar • Real-time visualization at UC/ANL [Figure: homogeneous turbulence driven by a force of Arnold-Beltrami-Childress (ABC) form]

  12. Searching for New Crystal Structures on the TeraGrid • PI Michael Deem (Rice) • TG Leads Walker (TACC) and Cheeseman (Purdue) • Ongoing to 09/30/06 • Searching for new 3-D zeolite crystal structures in crystallographic space • Requires running tens of thousands of serial jobs on the TeraGrid • Using MyCluster/GridShell to aggregate computational capacity across the TeraGrid to accelerate the search.
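The search pattern above is a classic bag of tasks: many independent serial evaluations whose best results are collected at the end. The following is a minimal Python sketch of that pattern, not the actual MyCluster/GridShell tooling; the candidate list, worker count, and toy scoring function are all invented for illustration (a real run would launch a structure-optimization code per candidate on grid batch nodes):

```python
from concurrent.futures import ThreadPoolExecutor

def score_candidate(seed: int) -> tuple[int, int]:
    # Stand-in for one independent serial job, e.g. evaluating one
    # candidate zeolite framework. The scoring formula is a toy:
    # a deterministic hash-like value in [0, 1000).
    return seed, (seed * 2654435761) % 1000

def sweep(num_candidates: int, workers: int = 8) -> list[tuple[int, int]]:
    # Because every task is independent, they can be spread across
    # whatever workers (or, on a grid, whatever hosts) are available.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(score_candidate, range(num_candidates)))
    # Keep the five best-scoring candidates, as a search would.
    return sorted(results, key=lambda r: r[1])[:5]

if __name__ == "__main__":
    print(sweep(10_000))
```

The design point is that throughput scales with however many workers can be aggregated, which is exactly what MyCluster/GridShell provided at TeraGrid scale for tens of thousands of serial jobs.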

  13. Mixing Across the Turbulent Shear Layer in a Star • PI: Woodward and Porter (U MN), TG: R Reddy (PSC/GIG) • Pioneering explorations of the large-scale effects of small-scale turbulence • PPM code on 1025 PEs of the XT3 at PSC sustains 1 TFLOPS • Application sustained ~120 MB/s over one lambda using PDIO for visualization/steering

  14. List of ASTA Projects by PI
  • Arterial: Karniadakis (Brown U), CTS, completed; TG: O'Neal (GIG/PSC)
  • Vortonics: Boghosian (Tufts U), CTS, completed; TG: O'Neal (GIG/PSC)
  • SPICE: Coveney (UCL), CHE, completed; TG: O'Neal (GIG/PSC)
  • ENZO: Norman (UCSD), AST, completed; TG: Harkness (SDSC)
  • Injector: Heister (Purdue), ASC, completed; TG: Kim (NCSA)
  • MD-Data: Jakobsson (UIUC), BIO, completed; TG: Parker (NCSA)
  • NREL: Brady (Cornell), BIO, completed; TG: Chukkapali (SDSC)
  • SCEC: Olsen (SDSU), GEO, in progress; TG: Cui (SDSC), Reddy (GIG/PSC)
  • BIRN: Ellisman (UCSD), BIO, in progress; TG: Majumdar (SDSC)
  • CMS: Newman (Caltech), PHY, in progress; TG: Milfeld (TACC)
  • CIG: Gurnis (Caltech), GEO, in progress; TG: Gardner (PSC)
  • EarthScope: Pavlis (IU), GEO, in progress; TG: Sheppard (IU)
  • Crystal: Deem (Rice), PHY, in progress; TG: Walker (TACC)
  • Tstorms: Droegemeier (OU), ATM, in progress; TG: O'Neal (GIG/PSC)
  • Turbulence: Woodward (U Minn), ASC, in progress; TG: Reddy (PSC)
  • Nemo3D: Klimeck (Purdue), ENG, proposed; TG: Raymond (GIG/PSC)
  • Epidemiology: Barrett (VA Polytechnic), Cuticchia (Duke), BCS, proposed; TG: Marcusiu (NCSA)
  • Pulmonary Immunity: Benos (Pitt), BIO, proposed; TG: Raymond (GIG/PSC)
  • Demography: Lansing (U Arizona), DBS, proposed; TG: Majumdar (SDSC), Gomez (PSC)
  • Multidimensional Microscope Imaging: Luby-Phelps (UT Southwestern Medical Center at Dallas), BIO, proposed; TG: Hempel (TACC)
