SPEC HPG Benchmarks for HPC Systems
Kumaran Kalyanasundaram, PhD
Chair, SPEC High-Performance Group; Manager, Performance Engineering, SGI


Transcript

SPEC HPG Benchmarks for HPC Systems
Kumaran Kalyanasundaram, PhD
Chair, SPEC High-Performance Group
Manager, Performance Engineering, SGI
SPEC HPG's Purpose
The High-Performance Group focuses on the development of application benchmarks for high-performance computers.
SPEC HPG
Founded in 1994 (the Perfect Benchmarks initiative became HPG).
Members from industry and academia.
Two active benchmark suites: SPEC OMP and SPEC HPC2002.
New MPI2006 benchmark currently under development.
SPEC HPG Benchmark Suites (timeline)
Jan 1994:  Founding of SPEC HPG
Oct 1995:  HPC96
June 2001: OMP2001
June 2002: OMPL2001
Jan 2003:  HPC2002
2006:      MPI2006
SPEC OMP
Benchmark suite developed by SPEC HPG (High-Performance Group).
Benchmark suite for performance testing of shared-memory multiprocessor systems.
Uses OpenMP versions of SPEC CPU2000 benchmarks and candidates.
Why Did SPEC Choose OpenMP?
The benchmark suite is focused on SMP systems.
OpenMP is a standard, and is applicable to Fortran, C, and C++.
Directive-based OpenMP allows the serial version to remain largely intact.
Quickest path to parallel code conversion.
OMP/CPU2000 Similarities
Same tools used to run the benchmarks.
Similar run and reporting rules.
Uses the geometric mean to calculate overall performance relative to a baseline system.
Similar output format.
SPEC OMP Benchmark Principles
Source-code based.
Limited code and directive modifications.
Focused on SMP performance.
Requires a base run:
  with no source modifications
  a single set of compiler flags for all benchmarks
SPEC-supplied tools required to run the benchmark.
OMPM2001 Benchmarks

Benchmark      Lang  What it does
310.wupwise_m  F90   SU(3) HE physics
312.swim_m     F90   Shallow water model from NCAR
314.mgrid_m    F90   Multigrid for EM problems
316.applu_m    F90   Fluid dynamics, partial LU decomposition
318.galgel_m   F90   Fluid dynamics, Galerkin FE
320.equake_m   C     Earthquake dynamics
324.apsi_m     F90   Lake weather model
326.gafort_m   F90   Genetic algorithm
328.fma3d_m    F90   Finite element mechanics
330.art_m      C     Image recognition
332.ammp_m     C     Molecular dynamics
OMP vs CPU2000

Characteristic     CPU2000           OMPM2001                 OMPL2001
Max. working set   200 MB            1.6 GB                   6.5 GB
Memory needed      256 MB            2 GB                     8 GB
Benchmark runtime  30 min @ 300 MHz  5 hrs @ 300 MHz          9 hrs @ 300 MHz
Language           C, C++, F77, F90  C, F90, OpenMP           C, F90, OpenMP
Focus              Single CPU        < 16 CPU system          > 16 CPU system
System type        Cheap desktop     MP workstation           Engineering MP sys
Runtime            24 hours          34 hours                 75 hours
Runtime 1 CPU      24 hours          140 hours                1000 hours
Run modes          Single and rate   Parallel                 Parallel
Number benchmarks  26                11                       9
Iterations         Median 3 or more  Worst of 2, median of 3  Worst of 2, median of 3 or more
Source mods        Not allowed       Allowed                  Allowed
Baseline flags     Max of 4          Any, same for all        Any, same for all
Reference system   1 CPU @ 300 MHz   4 CPU @ 350 MHz          16 CPU @ 300 MHz
Program Memory Footprints

Program  OMPM2001 (Mbytes)  OMPL2001 (Mbytes)
wupwise  1480               5280
swim     1580               6490
mgrid    450                3490
applu    1510               6450
galgel   370                —
equake   860                5660
apsi     1650               5030
gafort   1680               1700
fma3d    1020               5210
art      2760               10670
ammp     160                —

(galgel and ammp are not part of OMPL2001.)
SPEC HPC2002 Benchmark
Full application benchmarks (including I/O) targeted at HPC platforms.
Serial and parallel (OpenMP and/or MPI).
Currently three applications:
  SPECenv: weather forecast
  SPECseis: seismic processing, used in the search for oil and gas
  SPECchem: computational chemistry, used in the chemical and pharmaceutical industries (GAMESS)
All codes include several data sizes.
SPEC MPI2006
An application benchmark suite that measures CPU, memory bandwidth, interconnect, compiler, and MPI performance.
The benchmark search program is open until March 31, 2006.
Candidate codes in the areas of computational chemistry, weather forecasting, HE physics, oceanography, CFD, etc.
Future Goals
Very large data sets for MPI2006.
Follow-on to SPEC OMPM2001 and OMPL2001.
Update the SPEC HPC2002 suite.