
Report on CSU HPC (High-Performance Computing) Study
Ricky Yu–Kwong Kwok, Co-Chair, Research Advisory Committee, ISTeC
Ricky.Kwok@colostate.edu
August 18, 2008
Presented to CSU CRAD




Presentation Transcript


  1. Report on CSU HPC (High-Performance Computing) Study • Ricky Yu–Kwong Kwok • Co-Chair, Research Advisory Committee, ISTeC • Ricky.Kwok@colostate.edu • August 18, 2008 • Presented to CSU CRAD

  2. Outline • Motivation for the Study • Interpretations of Findings • Summary of Findings • Representative HPC Activities • CoGrid Job Submission System • HPC Forum (9/3/2008)

  3. Motivation for the Study • HPC is a third way of conducting research and discovery • To assess current activities and future opportunities • To examine current systems at CSU • To make CSU more competitive in the research arena

  4. Interpretations of Findings • 37 responses from 26 departments, spanning 31 research areas • HPC is “hot” at CSU! • 1/3 of users have no immediate access to HPC resources: they rely on other departments within CSU or on organizations outside CSU, and may be subject to strict resource restrictions

  5. Interpretations of Findings • Scattered resources • A majority of users are pessimistic about their HPC needs being met in the future • Advanced visualization needs require enhancement • Need to “advertise” the GRAD 510/511 courses • Space limitations for HPC hardware? • Consultancy? • Affinity groups?

  6. Summary of Findings • 37 responses from 26 departments, spanning 31 research areas • Besides the CoGrid node, the Bioinformatics cluster, and the HP Half Dome system, there are 8 additional HPC resources (see Table 2 in the Report), e.g.: • CIRA – 32-processor cluster • Chemical and Biological Engineering – 32-node Linux cluster • Mathematics – 42-node cluster • More new resources to come (e.g., new HP donations)

  7. Usage of HPC (survey results chart)

  8. Nature of Usage (survey results chart)

  9. Algorithms Used (survey results chart)

  10. HPC Resource Used (survey results chart)

  11. HPC Needs Met? (survey results: current and future)

  12. Visualization Needs Met? (survey results: current and future)

  13. Compiler Used (survey results chart)

  14. Language Used (survey results chart)

  15. Be Involved in Future HPC Activities? (survey results chart)

  16. Aware of GRAD 510/511? (survey results chart)

  17. Required Follow-Up? (survey results chart)

  18. Presentation Outline • Motivation for the Study • Interpretations of Findings • Summary of Findings • Representative HPC Activities • CoGrid Job Submission System • HPC Forum

  19. Small-angle X-ray Scattering (SAXS) • To understand biological systems • Determine 3-D structure • Monte Carlo simulations for data modeling • Computationally intensive → HPC
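The Monte Carlo workload on the SAXS slide can be illustrated with a minimal, generic sketch (this is not the study's actual code; the function and its parameters are hypothetical). The key point is the scaling: a Monte Carlo estimate converges only as 1/√N, so each extra digit of accuracy costs roughly 100× more samples, which is what pushes such data-modeling work onto HPC resources.

```python
import random

def monte_carlo_pi(n_samples: int, seed: int = 42) -> float:
    """Estimate pi by sampling random points in the unit square.

    A toy stand-in for the kind of Monte Carlo sampling used in
    SAXS data modeling: accuracy improves only as 1/sqrt(N), so
    realistic models need very many samples -- hence HPC.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:  # point falls inside the quarter circle
            inside += 1
    return 4.0 * inside / n_samples

estimate = monte_carlo_pi(100_000)
```

Because the samples are independent, this kind of loop parallelizes almost perfectly across cluster nodes, which is why Monte Carlo modeling is such a natural fit for the HPC systems surveyed here.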

  20. Genomic Analysis • Study the evolution of functional DNA sequences • Comparative phylogenomic analysis • Highly computationally intensive → cluster of Linux machines with 8–32 nodes

  21. Proteomics • Large-scale study of proteins • Searching methods to analyze mass spectrometry data • Sequence matching • Access to large databases

  22. Pattern Analysis • Investigate unexplained phenomena in high-dimensional, massive data sets • Mathematical theory → efficient algorithms → exploring, understanding, and modeling massive data sets • PAL (Pattern Analysis Laboratory) → 42-node cluster with 156 GB total memory and over 5 TB of disk storage

  23. CoGrid Job Submission System • Sharing of geographically distributed computational resources • Batch processing of jobs • Job description • Resource requirements • Execution • Monitoring

  24. HPC Forum (9/3/2008) • Introduction and ISTeC Overview: HJ • Background on CSU’s HPC History and Activities: Pat Burns • Findings from ISTeC’s HPC Survey: Ricky • Advanced Networking Capabilities at CSU: Scott Baily, Director of ACNS • Educational Activities in HPC: Sanjay Rajopadhye and Wim Bohm (Computer Science) • Plan for an ISTeC Visualization Facility: HJ • NERSC and the INCITE Program as a Viable Alternative: Josh Ladd (Mathematics) • Open Discussion – Faculty Needs and Directions

  25. Summary • Motivation: to examine current systems at CSU and future HPC needs • Presented interpretations and findings of the survey • Discussed representative HPC activities • Introduced the CoGrid job submission system as a possible campus-wide resource-management model • HPC Forum for further discussions
