From EarlJoseph@aol.com Tue May 29 08:54:57 2001
Date: Sat, 26 May 2001 19:34:03 EDT
From: EarlJoseph@aol.com
To: Hpc@idc.com
Subject: The IDC HPC Balanced Rating Benchmark


Hi,

We would like to thank the numerous HPC users and vendors for their
excellent comments, ideas, and suggestions for improving the new User
Forum HPC benchmark initiatives. We did our best to incorporate
everyone's input into the IDC Balanced Rating benchmark or into the more
extensive user benchmark tool that SDSC is taking the lead on developing.

Attached is the updated bulletin that describes our plan for the IDC
Balanced Rating benchmark.

Some of the improvements in the new version include:

1) SUGGESTION - Include a more complete and robust measurement of actual
memory system performance, one that takes into account more of the
attributes of the memory subsystem.
SOLUTION - Added the STREAM Machine Balance benchmark metric.
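As an illustrative sketch (the exact definition will appear in the bulletin as the metrics are finalized), machine balance in the STREAM sense is commonly computed as peak floating-point rate divided by sustained memory bandwidth expressed in words per second. The function name and inputs below are hypothetical, not part of the IDC metric:

```python
# Illustrative sketch of the STREAM "machine balance" idea: peak FLOP
# rate divided by sustained memory bandwidth in 8-byte words per second.
# Names and inputs are hypothetical, not the official IDC definition.
def machine_balance(peak_gflops, stream_triad_gbytes_per_sec):
    """Return peak FLOPs per sustained 64-bit memory word."""
    words_per_sec = stream_triad_gbytes_per_sec / 8.0  # 8 bytes/double
    return peak_gflops / words_per_sec

# Example: a node with 4 GFLOPS peak sustaining 1 GB/s on STREAM triad
# has a balance of 4 / (1/8) = 32 FLOPs per memory word.
```

A higher balance number means the processor can issue many more floating-point operations than the memory system can feed, which is exactly the imbalance the metric is meant to expose.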

2) SUGGESTION - Bisection bandwidth needs to be carefully defined, and it
tells only part of the story; total system bandwidth also needs to be
examined.
SOLUTION - Added total system bandwidth, and complete definitions for
each metric will be added as they are finalized.
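To make the contrast concrete, here is a hypothetical sketch of the two interconnect metrics for a simple system of identical links; the function names are illustrative and not the finalized IDC definitions:

```python
# Hypothetical sketch contrasting the two interconnect metrics for a
# system whose links all have the same bandwidth (GB/s each).
def total_system_bandwidth(num_links, link_gbytes_per_sec):
    """Sum of the bandwidth of every link in the system."""
    return num_links * link_gbytes_per_sec

def bisection_bandwidth(links_crossing_cut, link_gbytes_per_sec):
    """Bandwidth across the narrowest cut splitting the system in half."""
    return links_crossing_cut * link_gbytes_per_sec

# Example: a 4x4 2D mesh with 1 GB/s links has 24 links in total, but
# only 4 cross the bisection, so the two metrics differ sharply.
```

This is why looking at either number alone can mislead: a machine can have a large total bandwidth yet a narrow bisection, or vice versa.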

3) SUGGESTION - Reduce the need for vendors to run LINPACK and SPEC on
large systems.
SECOND SUGGESTION - SPEC works well to show the strength of one or a few
processors, but is less relevant on very large systems.
SOLUTION - Run LINPACK and SPEC on only one processor, then multiply by
the number of processors.
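The shortcut above can be sketched in a few lines; the names are illustrative, and note that linear scaling gives an upper bound since it ignores interconnect and memory contention on the full system:

```python
# Sketch of the proposed shortcut: benchmark one processor, then scale
# linearly by processor count. Names are illustrative; linear scaling is
# an upper bound that ignores interconnect and memory contention.
def scaled_rating(single_processor_result, num_processors):
    """Extrapolate a single-processor benchmark result to the full system."""
    return single_processor_result * num_processors

# Example: a 0.5 GFLOPS single-processor LINPACK result on a
# 512-processor system yields a scaled rating of 256 GFLOPS.
```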

4) SUGGESTION - Clarify the difference between the two benchmark
activities and better explain the reasons behind the IDC Balanced Rating.
SOLUTION - Added a better description of the two approaches and explained
that the IDC Balanced Rating is intended to be put into place within a
few months, so it needs to be based initially on currently available data.

5) SUGGESTION - We received many outstanding suggestions for new
approaches to creating HPC benchmarks, as well as suggestions for using
previous benchmarks. We also received many examples of benchmarks that
users have tested and that have proven to work well for specific user
organizations.
SOLUTION - These will be carefully reviewed and explored as part of the
HPC user benchmark tool being developed by SDSC. Users requested that 90%
(or more) of our benchmark resources be applied to the SDSC user tool --
your input is greatly appreciated!

6) SUGGESTION - We also received a number of very different suggestions
on how to consolidate the different metrics, alternative approaches for
normalizing the data, and different ways to present the results.
SOLUTION - We will explore different approaches as we evolve the ratings,
and we will include many of the suggestions in the more extensive user
benchmark tool.

7) SUGGESTION - Many users said that the approach is a good first step
and that we should get to work and implement it ASAP.
SOLUTION - We plan to publish the first set of rating tables by midsummer.

Thank you,

Earl Joseph
HPC User Forum Executive Director

P.S.  Please plan on joining us at the fall HPC User Forum meeting in
Princeton, New Jersey, from Wednesday, September 12, through midday
Friday, September 14. The focus will be on the life sciences, genome,
biology, pharmaceutical, and chemistry industries; the new benchmark
approaches; and many other interesting HPC user topics.


    [ Part 2, Application/PDF  68KB. ]

