Current Efforts of SPEC HPG: Application Benchmarks for High Performance Computing
IPSJ SIGMPS 2003
Matthias Mueller, High Performance Computing Center Stuttgart, mueller@hlrs.de
Kumaran Kalyanasundaram, G. Gaertner, W. Jones, R. Eigenmann, R. Lieberman, M. van Waveren, and B. Whitney
SPEC High Performance Group
Outline • What is SPEC and SPEC HPG? • Why do we need benchmarks? • Benchmarks currently produced by SPEC HPG • What do we need for the future?
What is SPEC? The Standard Performance Evaluation Corporation (SPEC) is a non-profit corporation formed to establish, maintain and endorse a standardized set of relevant benchmarks that can be applied to the newest generation of high-performance computers. SPEC develops suites of benchmarks and also reviews and publishes submitted results from our member organizations and other benchmark licensees. For more details see http://www.spec.org
SPEC Members • Members: 3DLabs * Advanced Micro Devices * Apple Computer, Inc. * ATI Research * Azul Systems, Inc. * BEA Systems * Borland * Bull S.A. * Dell * Electronic Data Systems * EMC * Encorus Technologies * Fujitsu Limited * Fujitsu Siemens * Fujitsu Technology Solutions * Hewlett-Packard * Hitachi Data Systems * IBM * Intel * ION Computer Systems * Johnson & Johnson * Microsoft * Mirapoint * Motorola * NEC - Japan * Network Appliance * Novell, Inc. * Nvidia * Openwave Systems * Oracle * Pramati Technologies * PROCOM Technology * SAP AG * SGI * Spinnaker Networks * Sun Microsystems * Sybase * Unisys * Veritas Software * Zeus Technology *
SPEC HPG = SPEC High-Performance Group • Founded in 1994 • Mission: To establish, maintain, and endorse a suite of benchmarks that are representative of real-world high-performance computing applications. • SPEC HPG includes members from both industry and academia. • Benchmark products: • SPEC OMP (OMPM2001, OMPL2001) • SPEC HPC2002, released at SC 2002
Currently active SPEC HPG Members • Fujitsu • HP • IBM • Intel • SGI • Sun • Unisys • Purdue University • University of Stuttgart
Where is SPEC Relative to Other Benchmarks? There are many metrics, each with its own purpose, spanning the range from raw computer hardware to full applications: • Raw machine performance: Tflops • Microbenchmarks: STREAM • Algorithmic benchmarks: Linpack • Compact apps/kernels: NAS benchmarks • Application suites: SPEC • User-specific applications: custom benchmarks (a STREAM-style sketch follows below)
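To make the "microbenchmark" level concrete, here is a minimal STREAM-style triad kernel in C. It is only an illustrative sketch under assumed parameters (the array length N and the simple clock()-based timing are arbitrary choices), not the official STREAM benchmark code:

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N 10000000L  /* array length; chosen large enough to exceed typical caches */

int main(void) {
    double *a = malloc(N * sizeof *a);
    double *b = malloc(N * sizeof *b);
    double *c = malloc(N * sizeof *c);
    const double scalar = 3.0;
    if (!a || !b || !c) return 1;

    /* initialize the operand arrays */
    for (long i = 0; i < N; i++) { b[i] = 1.0; c[i] = 2.0; }

    /* STREAM-style "triad": a[i] = b[i] + scalar * c[i] */
    clock_t t0 = clock();
    for (long i = 0; i < N; i++)
        a[i] = b[i] + scalar * c[i];
    clock_t t1 = clock();

    double seconds = (double)(t1 - t0) / CLOCKS_PER_SEC;
    /* three arrays of N doubles are touched: two loads and one store per iteration */
    double gbytes = 3.0 * N * sizeof(double) / 1e9;
    printf("Triad: %.3f s, ~%.2f GB/s\n", seconds, gbytes / seconds);

    free(a); free(b); free(c);
    return 0;
}
```

A kernel like this measures sustainable memory bandwidth in isolation, which is exactly the narrow, machine-property focus that distinguishes microbenchmarks from the application-level suites SPEC produces.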
Why do we need benchmarks? • Identify problems: measure machine properties • Time evolution: verify that we make progress • Coverage: help the vendors to have representative codes • Increase competition through transparency • Drive future development (see SPEC CPU2000) • Relevance: help the customers to choose the right computer
SPEC OMP • Benchmark suite developed by SPEC HPG • Performance testing of shared-memory multiprocessor systems • Uses OpenMP versions of SPEC CPU2000 benchmarks • Mixes integer and floating-point codes in one suite • OMPM is focused on 4-way to 16-way systems • OMPL targets 32-way and larger systems (see the OpenMP sketch below)
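For readers unfamiliar with OpenMP, the sketch below shows the kind of directive-based, shared-memory parallelism the SPEC OMP codes rely on. It is a generic illustration in C, not code taken from any benchmark in the suite:

```c
#include <stdio.h>
#include <omp.h>

#define N 1000000

int main(void) {
    static double x[N], y[N];
    double sum = 0.0;

    /* serial initialization */
    for (int i = 0; i < N; i++) { x[i] = i * 0.5; y[i] = 0.0; }

    /* work-sharing construct: loop iterations are divided among the
       threads of the team, and the partial sums are combined safely */
    #pragma omp parallel for reduction(+:sum)
    for (int i = 0; i < N; i++) {
        y[i] = 2.0 * x[i];
        sum += y[i];
    }

    printf("threads available: %d, sum = %g\n", omp_get_max_threads(), sum);
    return 0;
}
```

Compiled with OpenMP support (e.g. an -fopenmp or equivalent flag), the same source runs on 4-way, 16-way, or larger systems, which is what allows OMPM and OMPL to target different system sizes with the same programming model.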
SPEC OMP Applications

Code      Application                          Language   Lines
ammp      Molecular dynamics                   C          13500
applu     CFD, partial LU                      Fortran     4000
apsi      Air pollution                        Fortran     7500
art       Image recognition / neural networks  C           1300
fma3d     Crash simulation                     Fortran    60000
gafort    Genetic algorithm                    Fortran     1500
galgel    CFD, Galerkin FE                     Fortran    15300
equake    Earthquake modeling                  C           1500
mgrid     Multigrid solver                     Fortran      500
swim      Shallow water modeling               Fortran      400
wupwise   Quantum chromodynamics               Fortran     2200
SPEC OMP Results • 66 submitted results for OMPM • 24 submitted results for OMPL
SPEC HPC2002 Benchmark • Full application benchmarks (including I/O) targeted at HPC platforms • Currently three applications: • SPECenv: weather forecasting • SPECseis: seismic processing, used in the search for oil and gas • SPECchem: computational chemistry (GAMESS), used in the chemical and pharmaceutical industries • Serial and parallel (OpenMP and/or MPI) execution models (a hybrid sketch follows below) • All codes include several data sizes
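Because the HPC2002 codes can be run serially, with OpenMP, with MPI, or in hybrid mode, the following minimal C sketch illustrates the hybrid pattern (MPI between processes, OpenMP threads within each process). It is a generic illustration under assumed sizes, not code from the suite:

```c
#include <stdio.h>
#include <mpi.h>
#include <omp.h>

int main(int argc, char **argv) {
    /* request thread support suitable for OpenMP regions inside MPI ranks */
    int provided;
    MPI_Init_thread(&argc, &argv, MPI_THREAD_FUNNELED, &provided);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* each MPI rank works on its own slice; OpenMP threads split that slice */
    const int n_local = 1000000;
    double local_sum = 0.0;
    #pragma omp parallel for reduction(+:local_sum)
    for (int i = 0; i < n_local; i++)
        local_sum += (rank * (double)n_local + i) * 1e-6;

    /* combine the per-rank results on rank 0 */
    double global_sum = 0.0;
    MPI_Reduce(&local_sum, &global_sum, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("%d ranks x %d threads, global sum = %g\n",
               size, omp_get_max_threads(), global_sum);

    MPI_Finalize();
    return 0;
}
```

Which combination performs best (pure MPI, pure OpenMP, or hybrid) depends on the machine, the program, and the data set size, which is exactly what the Sun Fire 6800 results later in the talk show.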
SPEC ENV2002 • Based on the WRF weather model, a state-of-the-art, non-hydrostatic mesoscale weather model, see http://www.wrf-model.org • The WRF (Weather Research and Forecasting) Modeling System development project is a multi-year project being undertaken by several agencies. • Members of the WRF Scientific Board include representatives from EPA, FAA, NASA, NCAR, NOAA, NRL, USAF and several universities. • 25,000 lines of C and 145,000 lines of F90
SPEC ENV2002 • Medium data set: SPECenvM2002 • 260x164x35 grid over the continental United States • 22 km resolution • Full physics • I/O associated with startup and the final result • Simulates weather for a 24-hour period starting Saturday, November 3rd, 2001 at 12:00 A.M. • SPECenvS2002 is provided for benchmark researchers interested in smaller problems • Test and train data sets for porting and feedback
SPECenv execution models on a Sun Fire 6800 (scaling chart): the medium data set scales better; OpenMP is best for the small size, MPI is best for the medium size.
SPECseis execution models on a Sun Fire 6800 (scaling chart): the medium data set scales better; OpenMP scales better than MPI.
SPECchem execution models on a Sun Fire 6800 (scaling chart): the medium data set shows better scalability; MPI performs better than OpenMP.
Current and Future Work of SPEC HPG • SPEC HPC: • Update of SPECchem • Improving portability, including tools • Larger datasets • New release of SPEC OMP: • Inclusion of alternative sources • Merge OMPM and OMPL on one CD
Adoption of new benchmark codes • Remember that we need to drive future development! • Updates and new codes are important to stay relevant • Possible candidates: • Should represent a type of computation that is regularly performed on HPC systems • We are currently examining CPU2004 for candidates • Applications from Japan are very welcome! Please contact SPEC HPG or me (mueller@hlrs.de) if you have a code for us.
Conclusion and Summary • Results of OMPL and HPC2002: • Many programs scale to 128 CPUs • Larger data sets show better scalability • The best choice of programming model (MPI, OpenMP, hybrid) depends on: • Hardware • Program • Data set size • SPEC HPC will continue to update and improve the benchmark suites in order to remain representative of the work you do with your applications!
SPEC Members • Members: 3DLabs * Advanced Micro Devices * Apple Computer, Inc. * ATI Research * Azul Systems, Inc. * BEA Systems * Borland * Bull S.A. * Dell * Electronic Data Systems * EMC * Encorus Technologies * Fujitsu Limited * Fujitsu Siemens * Fujitsu Technology Solutions * Hewlett-Packard * Hitachi Data Systems * IBM * Intel * ION Computer Systems * Johnson & Johnson * Microsoft * Mirapoint * Motorola * NEC - Japan * Network Appliance * Novell, Inc. * Nvidia * Openwave Systems * Oracle * Pramati Technologies * PROCOM Technology * SAP AG * SGI * Spinnaker Networks * Sun Microsystems * Sybase * Unisys * Veritas Software * Zeus Technology * • Associates: Argonne National Laboratory * CSC - Scientific Computing Ltd. * Cornell University * CSIRO * Defense Logistics Agency * Drexel University * Duke University * Fachhochschule Gelsenkirchen, University of Applied Sciences * Harvard University * JAIST * Leibniz Rechenzentrum - Germany * Los Alamos National Laboratory * Massey University, Albany * NASA Glenn Research Center * National University of Singapore * North Carolina State University * PC Cluster Consortium * Purdue University * Queen's University * Seoul National University * Stanford University * Technical University of Darmstadt * Tsinghua University * University of Aizu - Japan * University of California - Berkeley * University of Edinburgh * University of Georgia * University of Kentucky * University of Illinois - NCSA * University of Maryland * University of Miami * University of Modena * University of Nebraska - Lincoln * University of New Mexico * University of Pavia * University of Pisa * University of South Carolina * University of Stuttgart * University of Tsukuba * Villanova University * Yale University *
SPEC ENV2002 – data generation • The WRF datasets used in SPEC ENV2002 are created using the WRF Standard Initialization (SI) software and standard sets of data used in numerical weather prediction. • The benchmark runs use restart files that are created after the model has run for several simulated hours. This ensures that cumulus and microphysics schemes are fully developed during the benchmark runs.