
Welcome To The 31st HPC User Forum Meeting
October 16, 2008
Special Thanks To:
Imperial College London
Simon Burbidge
Sue Pritchett
Thank You To Our Sponsors:
Altair
Fujitsu
HP
IBM
Panasas
Steve Finn And Steve Conway
HPC User Forum Update
Introduction: HPC User Forum Mission
Assist HPC users in solving their ongoing
computing, technical and business problems
A forum for exchanging information, identifying
areas of common interest, and developing unified
positions on requirements
Provide members with a continual supply of
information on:
 Uses of high end computers, high end best
practices, market dynamics, computer systems
and tools, vendor activities and strategies
Provide members with a channel to present their
achievements and requirements to outside
interested parties
Introduction: HPC User Forum Mission
European Meeting Goal:
Maintain a dialogue between U.S. and European HPC users/buyers
• Recognizing that European market dynamics
are not identical to U.S. market dynamics
• And that dynamics in Europe vary from
country to country and region to region
Introduction:
HPC User Forum Steering Committee
Steve Finn, BAE Systems, Chairman
Paul Buerger, Ohio Supercomputer Center
Paul Muzio, City University of New York
Sharan Kalwani, General Motors Corporation, Vice Chairman
Steve Conway, IDC Research Vice President
Michael Resch, HLRS, University of Stuttgart
Jack Collins, National Cancer Institute
Vince Scarafino, Industry Expert
James Kasdorf, Pittsburgh Supercomputing Center
Suresh Shukla, The Boeing Company
Earl Joseph, IDC, Executive Director
Vijay Agarwala, Penn State University
Alex Akkerman, Ford Motor Company
Doug Ball, The Boeing Company
Doug Kothe, Oak Ridge National Laboratory
Robert Singleterry, NASA/Langley
Allan Snavely, San Diego Supercomputer Center
Important Dates For Your Calendar
HPC User Forum Meetings:
 October 13 and 14, 2008
– HLRS/University of Stuttgart
 October 16, 2008
– Imperial College London
 April 20 to 22, 2009
– The Hotel Roanoke & Conference Center,
Roanoke, VA
 September 8 to 10, 2009
– Omni Interlocken Resort, Broomfield, CO
SC08 in Austin, Texas, November 15 to 21, 2008
ISC09, Hamburg, June 23 to 26, 2009
IDC HPC
Market Update
IDC’s HPC Team
Earl Joseph: IDC HPC research studies, HPC User Forum, and strategic consulting
Steve Conway: HPC User Forum, consulting, primary user research and events
Richard Walsh: In-depth technical analysis, special studies, processor trends, and data center issues
Jie Wu: HPC research specialist, census and forecasts, China research, interconnects and grids
Lloyd Cohen: Director, Worldwide Market Analysis; data analysis, workstations
Beth Throckmorton: Government account support, special projects
Charlie Hayes: Government HPC issues, DOE, and special studies
Mary Rolph: Conference planning and logistics
Top Trends in HPC
HPC continues to show strong growth: 10% this quarter
 19% yearly growth over the last 4 years
 We are forecasting 9.2% growth for the next 5 years
Blades are making inroads into all segments
Major challenges for datacenters:
 Power, cooling, real estate, system management
 Storage and data management continue to grow in
importance
Software hurdles will rise to the top for most users
 Driven heavily by multi-core processors and hybrid
systems
Why Has Technical Computing Grown
So Quickly?
1. Price and price/peak performance of clusters has
redefined the cost of technical computing
– >6x better than RISC, >70x better than vectors
2. At the same time, “live” science and “live” engineering
costs have escalated
– Plus time-to-solution is months faster with simulations
3. Global competitiveness is driving R&D and better
product designs
4. At the same time, x86 performance on technical
applications is weak
– Driving buyers to purchase a much larger number of
processors
5. New materials and approaches require rewriting the
“books and tables” which takes years – making
simulations a faster solution
2007 HPC Market Size By Competitive Segments
HPC servers total: $10B
Supercomputers (over $500K): $2.7B
Divisional ($250K-$500K): $1.6B
Departmental ($100K-$250K): $3.4B
Workgroup (under $100K): $2.4B
Vendor HPC Market Shares In 2Q08:
All HPC Segments
HP: 36.7%
IBM: 26.8%
Dell: 15.7%
Other: 10.2%
Sun: 5.4%
SGI: 2.1%
Cray: 1.3%
Dawning: 1.1%
Bull: 0.6%
NEC: 0.1%
HPC Revenue by Processor Type
[Chart: total HPC revenue share by processor type (x86-64, x86-32, Vector, RISC, Proprietary, EPIC), 2000-2007]
Why Is Commodity Hot? Price!
HPC All Servers Processor Summary, 2007 (averages):
x86: 18 CPUs/system, $39K system ASP, $2,151/CPU, 465 CPUs per $1M
RISC: 9 CPUs/system, $70K system ASP, $7,869/CPU, 127 CPUs per $1M
EPIC: 7 CPUs/system, $65K system ASP, $8,966/CPU, 112 CPUs per $1M
Vector: 11 CPUs/system, $605K system ASP, $54,177/CPU, 18 CPUs per $1M
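As a quick sanity check (not from the slides), the last two columns follow from the first two: dollars per CPU is the average system price spread over the average CPU count, and CPUs per $1M is its reciprocal. A minimal Python sketch using the rounded figures above; small differences from the slide come from rounding of the published averages:

# Rough check of the 2007 processor-price table above (assumed arithmetic).
systems = {
    # family: (average CPUs per system, average system price in $K)
    "x86":    (18, 39),
    "RISC":   (9, 70),
    "EPIC":   (7, 65),
    "Vector": (11, 605),
}
for family, (cpus, asp_k) in systems.items():
    dollars_per_cpu = asp_k * 1000 / cpus            # spread the system price over its CPUs
    cpus_per_million = 1_000_000 / dollars_per_cpu   # how many CPUs $1M buys
    print(f"{family:>6}: ~${dollars_per_cpu:,.0f}/CPU, ~{cpus_per_million:,.0f} CPUs per $1M")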
HPC
Cluster Update
HPC Cluster Revenue Growth Rates
Growth has averaged over 74%/year since 2002
[Chart: year-over-year HPC cluster revenue growth, 2001-2007]
Growth In HPC Clusters
[Chart: cluster vs. non-cluster share of HPC revenue]
Cluster Revenue Share by Processor
[Chart: cluster revenue share by processor type (x86, EPIC, RISC), 2001-2007]
HPC Cluster Processor Shipments
HPC cluster processors are now shipping at a rate of
over 2.8 million a year
 Average yearly growth has been 44%
HPC cluster processor shipments by competitive segment:

Competitive Segment    2003       2004        2005        2006        2007        CAGR
cs1-Supercomputer      157,384    178,982     277,224     375,553     535,338     36%
cs2-Divisional         96,929     241,254     367,002     516,137     610,142     58%
cs3-Departmental       189,091    273,344     968,013     1,545,991   1,442,150   66%
cs4-Workgroup          209,980    406,452     229,601     189,048     224,974     2%
Grand Total            653,384    1,100,032   1,841,840   2,626,728   2,812,604   44%
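The 44% figure is the compound annual growth rate implied by the Grand Total row (2003 to 2007 is four growth years). A minimal sketch of that arithmetic, assuming the standard CAGR formula (not IDC's own calculation):

# CAGR check on total cluster processor shipments, 2003 vs. 2007.
start, end = 653_384, 2_812_604   # Grand Total shipments in 2003 and 2007
years = 2007 - 2003               # four growth years
cagr = (end / start) ** (1 / years) - 1
print(f"2003-2007 shipment CAGR: {cagr:.0%}")   # prints roughly 44%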
HPC Market
Forecasts
HPC Forecast: Strong Growth Over
Next Five Years ($ Millions)
Segment                    2007       2012       CAGR
Supercomputer              $2,682     $3,512     5.5%
Technical Divisional       $1,610     $3,092     13.9%
Technical Departmental     $3,384     $5,763     11.2%
Technical Workgroup        $2,400     $3,193     5.9%
Total                      $10,076    $15,617    9.2%
Source: IDC, 2008
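For reference, the CAGR column is simply the five-year compound annual growth rate implied by the 2007 and 2012 revenue figures; the same check applies to the application-segment forecast on the next slide. A minimal Python sketch, assuming the standard CAGR formula:

# Recompute the forecast CAGRs from the 2007 and 2012 revenue figures ($ millions).
forecast = {
    "Supercomputer":          (2_682,  3_512),
    "Technical Divisional":   (1_610,  3_092),
    "Technical Departmental": (3_384,  5_763),
    "Technical Workgroup":    (2_400,  3_193),
    "Total":                  (10_076, 15_617),
}
for segment, (rev_2007, rev_2012) in forecast.items():
    cagr = (rev_2012 / rev_2007) ** (1 / 5) - 1     # five growth years, 2007-2012
    print(f"{segment:<23} {cagr:.1%}")              # matches the 5.5% ... 9.2% column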
HPC Application Forecast, 2007 - 2012 ($ Thousands)

HPC Application Segment            2007          2012          5-yr CAGR
Bio-Sciences                       $1,558,368    $2,454,715    9.5%
CAE                                $1,268,038    $2,321,580    12.9%
Chemical Engineering               $259,506      $367,561      7.2%
DCC & Distribution                 $585,391      $1,081,443    13.1%
Economics/Financial                $305,325      $510,675      10.8%
EDA                                $717,481      $931,569      5.4%
Geosciences and Geo-engineering    $589,343      $1,001,070    11.2%
Mechanical Design and Drafting     $139,851      $262,105      13.4%
Defense                            $917,577      $1,413,607    9.0%
Government Lab                     $1,376,058    $1,657,796    3.8%
University/Academic                $1,858,705    $2,764,522    8.3%
Weather                            $399,228      $732,349      12.9%
Other                              $101,559      $118,179      3.1%
Total Revenue                      $10,076,430   $15,617,170   9.2%
Summary Thoughts
Major Customer Pain Points
Clusters are still hard to use and manage:
 System management and growing cluster complexity
 Power, cooling, and floor space are major issues
 Third-party software costs
 Weak interconnect performance at all levels
 Applications and programming: hard to scale beyond a node
 RAS is a growing issue
 Storage and data management are becoming new bottlenecks
 Lack of support for heterogeneous environments and accelerators
Major Customer Pain Points
Software is becoming the #1 roadblock
 Better management software is needed
– HPC clusters are hard to set up and operate
– New buyers require “ease of everything”
 Parallel software is lacking for most users
– Many applications will need a major redesign
– Multi-core will cause many issues to “hit the wall”
Software – ISV Scaling Limitations
Typical Number of Processors the ISV Applications Use for Single Jobs (Table 20)

CPU Range    Number of Applications    Percent
1            19                        24.4%
2-8          25                        32.1%
9-32         20                        25.6%
33-128       9                         11.5%
129-1024     4                         5.1%
Unlimited    1                         1.3%
Total        78                        100.0%
New Challenges Affecting IT Datacenters
The increase in CPUs and server units is creating
significant IT challenges in:
 Managing complexity
– How to best manage a complex cluster
– How to install/set up a new cluster without having to buy a large number of separate pieces
 Power, cooling, and space
 Application scaling and hardware utilization
– How to deliver strong performance to users on your applications
– How to make optimal use of new processor and
system designs
Questions?
Please email:
[email protected]
Or check out:
www.hpcuserforum.com
Agenda: Thursday Morning
9:00   Imperial College Welcome
9:15   HPC User Forum Welcome/Introductions, Steve Finn and Steve Conway
9:30   Jamil Appa, BAE Systems, HPC in Aerospace
10:00  Doug Ball, Boeing, HPC Trends in Aerospace
10:30  IBM Technology Update
10:45  Isabella Weger, ECMWF, HPC and Weather Prediction
11:15  Break
11:30  Gerard Gorman, Imperial College, Software Engineering & Support
12:00  Dr. Frank Baetke, HP, HP's Scalable Computing Strategy
12:15  Lunch
Lunch Logistics
Welcome To The 31st HPC User Forum Meeting
Agenda: Thursday Afternoon
13:15  Panasas Technology Update
13:30  Terry Hewitt, EDS, Automotive Work for Rolls-Royce
14:00  Fujitsu R&D Technology Update, Motoi Okuda
14:30  Andrew Jones, NAG, HPC Trends and Issues
14:45  Peter Haynes, Imperial College, Materials and Physics
15:15  Vince Scarafino, HPC User Forum HPC Technology Panel Results
15:45  Irene Qualters, SGI, Industrial Strength Linux
16:15  Break
16:30  Mark Parsons, EPCC Site Update and HECToR Program
17:00  Bill Butcher, Altair Engineering, Update
17:15  HPC Trends at NASA, Robert Singleterry
17:45  Wrap-up and plans for future HPC User Forum meetings, Steve Conway and Steve Finn
18:00  Guided Tour of Imperial College
Important Dates For Your Calendar
HPC User Forum Meetings:
 October 13 and 14
– In Stuttgart, Germany
 October 16
– At the Imperial College, London
 April 20 to 22, 2009
– The Hotel Roanoke & Conference Center,
Roanoke, VA
 September 8 to 10, 2009
– Omni Interlocken Resort, Broomfield, CO
SC08 in Austin, Texas, November 15 to 21, 2008
ISC09, Hamburg, June 23 to 26, 2009
Thank You For Attending The 31st HPC User Forum Meeting
Questions?
Please email:
[email protected]
Or check out:
www.hpcuserforum.com