Supercomputing at the University of Arkansas
Amy Apon, Ph.D.
Oklahoma Supercomputing Symposium
October 5, 2005
Outline of Talk
• What is the status of supercomputing at the
University of Arkansas?
– Also in relation to other institutions
• Why do supercomputing at our institution?
• How did we get this far?
– Acquiring Red Diamond
• What comes next?
The Status of Supercomputing at the
University of Arkansas
• Red Diamond supercomputer
– Number 379 on the Top 500 list, June 2005
– 128 nodes (256 processors)
– 1.349 TFlops (trillion floating-point operations per second)
– First supercomputer in Arkansas
– $213K from NSF MRI grant, 08/04, Apon PI
• Co-PIs Pulay, Fu, Bellaiche, Deaton, Selvam, Mattioli,
Thompsons, Johnston
– Substantial match from the University
– Substantial gift from Dell
Significance of Red Diamond
• Places the University of Arkansas among roughly 40 peer
academic institutions, public and private, that hold a
resource of this quality
– As measured by the Top 500 list of the fastest computers in the
world, released every June and November
Other Academic Supercomputing Sites
(partial list only – the map on this slide shows the members of the Coalition
for Academic Scientific Computation)
Academic supercomputing sites also include Kansas, Nebraska, Missouri, Alabama,
South Carolina, Minnesota, Wisconsin, Maryland, Delaware, and Oregon.
Ranking in Top 500 list, academics in the U.S. by state
(Source: http://www.top500.org)
Jun '05 Rank (29):
Virginia: 14
Illinois: 20, 38, 47, 48
Penn: 33, 68, 395
California: 37, 43, 63, 66, 71, 108, 162
Utah: 53
*Oklahoma: 54
Mass: 59
Texas: 74, 427, 441
New York: 117, 200, 242, 326
Tennessee: 129
*Alaska: 136, 342
*Louisiana: 147
Maryland: 166
Arizona: 249
Minnesota: 307
*Mississippi: 367
*Arkansas: 379
*Kentucky: 373, 468
Florida: 394, 498
(The slide's table also has Nov '04 Rank (34), Jun '04 Rank (39), and Nov '03 (41)
columns, covering these states plus Indiana, Georgia, Ohio, *Nebraska, New Jersey,
Delaware, North Carolina, and Michigan.)
*EPSCoR states
Why do supercomputing?
An opportunity for funding!
Federal Research Funding Rank by Year
(Source: http://thecenter.ufl.edu/research_data.html)
[Chart: federal research funding rank (y-axis, 60 to 180) versus year (1998-2002),
annotated with each institution's Top 500 debut: U. Ken, 302/500 in 11/95;
Miss St, 359/500 in 11/96; LSU, 400/500 in 11/95; UN-L, 107/500 in 6/02;
U. OK, 197/500 in 11/02; Uark, Top 500 in 6/05]
Continuing supercomputing capability and federal funding levels are correlated!!
Federal Funding Directions
• The President's Information Technology Advisory
Committee (PITAC) encourages the growth of
"computational science," or the use of
computers to complement experiments and
theoretical research.
• The committee calls for more federal spending on
supercomputing (Source: Chronicle Daily
News 04-15-2005)
Benefits to Campus Users of a
Supercomputing Center
• Over time, we can refocus existing resources into a
high-quality, centrally managed facility, avoiding
duplication of resources on campus
– Eliminates the need for departmental and research group
clusters
– Reduces costs for software licenses and startup funds
– Serves as a focal point for supercomputing activity on campus
• Can be an attraction in recruiting top faculty and
Ph.D. students
• We have infrastructure to support a larger system
– We can work on larger problems
• We become more competitive in grant applications
How did we get this far?
Acquiring Red Diamond
• MRI – Major Research Instrumentation grant
from the National Science Foundation
• Only three MRI proposals can be submitted
from an institution
• The first year we tried, we did not make the
campus cut
– Amazingly, not everyone believes that we need
supercomputing!!
NSF MRI Proposal
• Funding is granted based on the quality of
research
• Geeky computer science types need not apply
– Just evaluating the benefits of a high-performance
network, multi-core processors, or even compiler
optimizations is probably not enough
• You need to demonstrate the need for computing
power in science and engineering research
Computational Research at the
University of Arkansas
• Development of middleware tools (Array
Files) for managing, locating, and indexing
data for large-scale out-of-core computational
chemistry applications.
– This research is inherently interdisciplinary and
results are applicable to many other projects in
this proposal.
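As a rough illustration of the out-of-core pattern behind middleware like Array
Files (this is not the Array Files API; the function name, block size, and file
layout below are assumptions), a computation streams a disk-resident array
through memory one block at a time:

    /* Illustrative out-of-core sketch only (not the Array Files interface):
     * a large double-precision array lives in a disk file and is read one
     * block at a time, so working memory stays far smaller than the data set. */
    #include <stdio.h>
    #include <stdlib.h>

    #define BLOCK_ELEMS 1048576   /* elements per in-memory block (assumed size) */

    /* Sum an n_elems-element array stored in "file" without loading it all into RAM. */
    double out_of_core_sum(const char *file, long n_elems)
    {
        FILE *fp = fopen(file, "rb");
        if (!fp) { perror("fopen"); exit(1); }

        double *block = malloc(BLOCK_ELEMS * sizeof(double));
        double total = 0.0;
        long remaining = n_elems;

        while (remaining > 0) {
            size_t want = remaining < BLOCK_ELEMS ? (size_t)remaining : BLOCK_ELEMS;
            size_t got = fread(block, sizeof(double), want, fp);
            for (size_t i = 0; i < got; i++)
                total += block[i];
            remaining -= (long)got;
            if (got < want) break;       /* short read: stop at end of file */
        }

        free(block);
        fclose(fp);
        return total;
    }

    int main(int argc, char **argv)
    {
        if (argc != 3) { fprintf(stderr, "usage: %s file n_elems\n", argv[0]); return 1; }
        printf("sum = %f\n", out_of_core_sum(argv[1], atol(argv[2])));
        return 0;
    }

The point of such middleware is to hide the blocking, indexing, and file
management so that the chemistry codes see something closer to an ordinary
array.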
Computational Research at the
University of Arkansas
• Computational chemistry in two major areas:
– The development of a parallel Coupled-Cluster
Singles and Doubles (CC-SD) code that will run
efficiently on a distributed memory system, and
– The development of an efficient parallel version of
our Fourier Transform Coulomb (FTC) method for
large-scale density functional calculations.
New formulas for new drugs!
Computational Research at the
University of Arkansas
• Materials science, using a state-of-the-art
first-principles density-functional theory (DFT)
computational approach.
• The research includes the study of novel
nanostructure materials that possess unusual
properties of technological importance, in particular:
– Nanostructures of ferroelectric (FE) and piezoelectric
oxides which exhibit many electrical, mechanical, and
structural properties that are not shared by other materials
Semiconductor nanomaterials!
Computational Research at the
University of Arkansas
• DNA sequence design and analysis of
large sets of sequences for biotechnology and
nanotechnology applications.
– The computing equipment proposed here will
accelerate the search for large sets of
non-crosshybridizing DNA sequences (a rough
sketch of the screening idea follows below).
– These sequences will form a large library for use
in DNA computations or nanotechnology.
New ways to store huge amounts of information!
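As a rough illustration of the kind of screening involved (this is not the
project's actual algorithm; the strand length, distance threshold, and candidate
strands below are assumptions), the sketch accepts a candidate strand into the
library only if both the candidate and its reverse complement differ from every
accepted strand by at least a minimum Hamming distance:

    /* Illustrative sketch only: grow a library of fixed-length DNA strands,
     * keeping a candidate only if it and its reverse complement are at Hamming
     * distance >= MIN_DIST from every accepted strand -- a crude proxy for
     * "non-crosshybridizing". */
    #include <stdio.h>
    #include <string.h>

    #define LEN      8        /* strand length (assumed) */
    #define MIN_DIST 4        /* minimum pairwise Hamming distance (assumed) */

    static char complement(char b)
    {
        switch (b) { case 'A': return 'T'; case 'T': return 'A';
                     case 'C': return 'G'; case 'G': return 'C'; }
        return 'N';
    }

    static int hamming(const char *a, const char *b)
    {
        int d = 0;
        for (int i = 0; i < LEN; i++) d += (a[i] != b[i]);
        return d;
    }

    static int accept(const char *cand, char lib[][LEN + 1], int n)
    {
        char rc[LEN + 1];                 /* reverse complement of the candidate */
        for (int i = 0; i < LEN; i++) rc[i] = complement(cand[LEN - 1 - i]);
        rc[LEN] = '\0';

        for (int i = 0; i < n; i++)
            if (hamming(cand, lib[i]) < MIN_DIST || hamming(rc, lib[i]) < MIN_DIST)
                return 0;                 /* too similar: would cross-hybridize */
        return 1;
    }

    int main(void)
    {
        const char *candidates[] = { "ACGTACGT", "TTTTAAAA", "ACGTACGA", "GGGCCCAT" };
        char lib[16][LEN + 1];
        int n = 0;

        for (int i = 0; i < 4; i++)
            if (accept(candidates[i], lib, n))
                strcpy(lib[n++], candidates[i]);

        for (int i = 0; i < n; i++) printf("accepted: %s\n", lib[i]);
        return 0;
    }

The pairwise check grows quadratically with library size, which is why building
large libraries quickly becomes a supercomputing problem.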
Computational Research at the
University of Arkansas
• Multiscale modeling, including:
– The computation of electronic and optical
properties of nanodevices and the investigation of
issues in multiscale modeling
– Multiscale modeling of crack propagation in alloys
and metals
Models of tornados!
Computational Research at the
University of Arkansas
Other projects
• Models of volcanoes
• Next-generation networking
• Geospatial databases
• Data mining
Observations from an MRI panel
• Base the equipment request on research
drivers
• Request an appropriately sized resource
– For the problem
– And with appropriate subcomponents
• An error in the resource description is more
easily forgiven than perceived deficiencies in
the research, but either can kill the proposal
What Comes Next?
UofA Current Challenges
• Education of researchers, faculty & students
– Some of our best scientists still need education on
how to use a distributed memory parallel
computer, including MPI and compiler tools (a
minimal MPI sketch follows below)
• System administration
– Don't underestimate the amount of time it takes to
administer a large system – it does not scale
linearly!!
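To give a flavor of what that education covers, here is a minimal MPI example
(a generic textbook-style sketch, not course material from the UofA program):
each process sums its own slice of a range, and a single collective call
combines the partial results.

    /* Minimal MPI example: each process sums part of 1..N, then MPI_Reduce
     * combines the partial sums on rank 0.  Compile with mpicc, run with mpirun. */
    #include <stdio.h>
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        const long N = 1000000;
        int rank, size;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Each rank sums its own stride of the range 1..N. */
        long local = 0;
        for (long i = rank + 1; i <= N; i += size)
            local += i;

        long total = 0;
        MPI_Reduce(&local, &total, 1, MPI_LONG, MPI_SUM, 0, MPI_COMM_WORLD);

        if (rank == 0)
            printf("sum of 1..%ld = %ld (computed on %d processes)\n", N, total, size);

        MPI_Finalize();
        return 0;
    }

The collective MPI_Reduce replaces a hand-written loop of point-to-point
messages, which is exactly the kind of idiom new users need to learn.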
What Comes Next?
UofA Current Challenges
• Supercomputing operations
– Keep the AC on
– Power
– UPS
– Space
• Usage policies and administration
– How do you accommodate usage by new faculty?
– How do you partition usage fairly?
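One common answer, sketched below purely as an illustration (the group names,
shares, and numbers are assumptions, not the policy used on Red Diamond), is
fair-share scheduling: a group's priority rises when its recent usage falls
below its allocated share and drops when it exceeds it.

    /* Sketch of a fair-share priority calculation (illustrative policy only):
     * groups that have used less than their allocated share of recent CPU hours
     * get a priority boost; heavy users are penalized. */
    #include <stdio.h>

    struct group {
        const char *name;
        double share;        /* allocated fraction of the machine, 0..1 */
        double used_hours;   /* CPU hours consumed in the recent window */
    };

    int main(void)
    {
        const double window_hours = 10000.0;   /* CPU hours in the window (assumed) */
        struct group groups[] = {
            { "chemistry", 0.40, 5200.0 },
            { "materials", 0.30, 1800.0 },
            { "bioinfo",   0.30, 2500.0 },
        };

        for (int i = 0; i < 3; i++) {
            double actual = groups[i].used_hours / window_hours;  /* fraction used */
            /* Priority > 1 means under-served (boost), < 1 means over-served. */
            double priority = groups[i].share / (actual > 0.0 ? actual : 1e-9);
            printf("%-10s share=%.2f used=%.2f priority=%.2f\n",
                   groups[i].name, groups[i].share, actual, priority);
        }
        return 0;
    }

Production schedulers refine this with decaying usage windows and per-user
limits, but the underlying share-versus-usage ratio is the same idea.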
What Comes Next?
UofA Current Challenges
• Future grant applications
– The lifespan of a supercomputer is about three years!
• Funding models for ongoing operations
– How will basic systems administration and the
project director position be funded?
What Comes Next?
We are
• Increasing campus-level support for HPC
• Expanding our computational science and
engineering activities
– New researchers and domain areas
• Collaborating (via grid computing)
– Within the state
– Regionally (OU, GPN, SURA)
• Expanding access to National LambdaRail
Questions?
Contact information:
http://hpc.uark.edu
http://comp.uark.edu/~aapon
[email protected]