JIID2006 - L'Irfu, Institut de Recherche sur les lois fondamentales de l'Univers


Numerical Simulations in Astrophysics
The COAST Project
Daniel Pomarède
CEA/DAPNIA/SEDI/LILAS
The COAST Project at DAPNIA
• The COAST "Computational Astrophysics" Project is dedicated to the study of structure formation in the Universe:
– large-scale cosmological structures and galaxy formation
– turbulence in molecular clouds and star formation
– stellar MHD
– protoplanetary systems
• The project relies on numerical simulations performed on high-performance, massively parallel mainframes, and on software tools supporting the development, optimization, and validation of the numerical simulation codes, as well as the processing and exploitation of the results:
– visualization
– numerical algorithms
– databases
– code management
The COAST Project at DAPNIA
• This transverse DAPNIA project involves:
– the SAp “Service d’Astrophysique”
• 7 FTE permanent positions
• 6 PhDs and 2 postdocs
– the SEDI "Service d'Électronique, des Détecteurs et de l'Informatique"
• 3 FTE software engineers from the LILAS
Laboratory
• Master students
The COAST Project members
• Astrophysics:
– E. Audit: ISM, star formation
– F. Bournaud: galactic dynamics
– S. Brun: solar modeling
– S. Charnoz: planet formation
– S. Fromang: MHD turbulence
– F. Masset: planetary migration
– R. Teyssier: cosmology, galaxy formation
• Software development:
– V. Gautard: numerical algorithms
– J.P. Le Fèvre: databases
– D. Pomarède: visualization
– B. Thooris: data format, code management
The COAST Project numerical simulation codes
• Four simulation codes are developed at DAPNIA:
– RAMSES, a hybrid N-body and hydrodynamical AMR code, simulates the dark matter and the baryon gas
– HERACLES, a radiation hydrodynamics code used to study turbulence in interstellar molecular clouds
– ASH (in collaboration with the University of Colorado), dedicated to the study of stellar MHD
– JUPITER, a multi-resolution code used in the study of protoplanetary disk formation
• These codes are written in F90 or C, and parallelized with MPI
• They rely on numerical algorithms: equation solvers (Godunov, Riemann), adaptive mesh refinement techniques, CPU load balancing (Peano-Hilbert space-filling curves), … (see the sketch below)
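
To make the Godunov approach concrete, here is a minimal sketch of a first-order Godunov finite-volume scheme for the 1D Burgers equation, with the exact Riemann solution evaluated at each cell interface. This is an illustration in Python, not code from RAMSES or HERACLES; all names and parameters are ours.

    import numpy as np

    def riemann_burgers(ul, ur):
        # Exact Riemann solver for Burgers' equation u_t + (u^2/2)_x = 0:
        # pick the interface state, then return the Godunov flux.
        if ul > ur:                          # shock
            s = 0.5 * (ul + ur)              # Rankine-Hugoniot shock speed
            u = ul if s > 0.0 else ur
        elif ul > 0.0:                       # rarefaction moving right
            u = ul
        elif ur < 0.0:                       # rarefaction moving left
            u = ur
        else:                                # transonic rarefaction
            u = 0.0
        return 0.5 * u * u

    def godunov_step(u, dx, dt):
        # One first-order finite-volume update with periodic boundaries:
        # f[i] is the flux through the interface between cells i and i+1.
        f = np.array([riemann_burgers(a, b) for a, b in zip(u, np.roll(u, -1))])
        return u - dt / dx * (f - np.roll(f, 1))

    # Example: evolve a sine wave until it steepens into a shock.
    n = 200
    x = np.linspace(0.0, 1.0, n, endpoint=False)
    u = 1.0 + 0.5 * np.sin(2.0 * np.pi * x)
    t, t_end, dx = 0.0, 0.3, 1.0 / n
    while t < t_end:
        dt = 0.5 * dx / np.abs(u).max()      # CFL stability condition
        u = godunov_step(u, dx, dt)
        t += dt

The production codes solve the full Euler or MHD systems with more elaborate Riemann solvers, but the interface-flux structure is the same.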
The COAST Project Computing Resources
• Local DAPNIA resources, used for development and post-processing:
– funded by DAPNIA (120 k€):
• DAPHPC:
– a 96-core 2.6 GHz Opteron cluster (24 nodes with 8 GB of memory and an InfiniBand interface)
– 2/3 of computing time allocated to COAST
– funded by universities:
• SAPPEM: an 8-processor Xeon platform with 32 GB of memory
– funded by the ANR (Agence Nationale de la Recherche):
• 4 visualization stations with 16 to 32 GB of RAM, ~1 TB of disk, 4 processors, graphics cards with 1 GB of memory, and 30-inch screens
• CEA resources at the CCRT (CEA National Supercomputing Center) for massive simulations: 4 million CPU hours in 2007
– Platine, ranked 12th in the TOP500 world supercomputer list (June 2007): 7456 Itanium cores, 23 TB of total memory, 47.7 teraflops
– Tantale: an HP/Linux AMD Opteron cluster with 552 cores
The COAST Project Computing Resources
• Other resources for massive simulations: 2 million CPU hours for 2007
– DEISA Extreme Computing Initiative
– MareNostrum at the Barcelona Supercomputing Center, ranked 9th in the TOP500 world supercomputer list (June 2007): 10240 IBM PowerPC 2.3 GHz cores, 94.2 teraflops, 20 TB of main memory
The COAST Project software development pool
• Data handling
– migration of the HERACLES code to the HDF5 Hierarchical Data Format developed at NCSA (National Center for Supercomputing Applications, USA); see the sketch below
– massively parallel I/O schemes
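
As an illustration of what the HDF5 migration buys (the actual HERACLES file layout is not documented here, so the group and dataset names below are hypothetical), a minimal write/read sketch using the h5py Python bindings:

    import numpy as np
    import h5py

    # Write one output: a group per physics module, a dataset per field.
    rho = np.random.rand(64, 64, 64)              # stand-in density cube
    with h5py.File("output_0001.h5", "w") as f:
        f.attrs["time"] = 0.125                   # metadata travels with the data
        hydro = f.create_group("hydro")
        dset = hydro.create_dataset("density", data=rho, compression="gzip")
        dset.attrs["unit"] = "g/cm^3"

    # Read back only a sub-volume, without loading the whole cube in memory.
    with h5py.File("output_0001.h5", "r") as f:
        sub = f["hydro/density"][0:32, 0:32, 0:32]

A self-describing hierarchical format of this kind lets post-processing and visualization tools stay independent of the simulation code's internal memory layout.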
• Numerical algorithms (see the two-grid sketch below)
– development of a multigrid scheme for the HERACLES code
– radiation transfer/photo-ionization scheme
– MHD/Godunov schemes
– Poisson solver with multigrid and AMR
– load-balancing schemes
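
As a reminder of how a multigrid Poisson solver works, here is a generic textbook V-cycle for the 1D model problem u'' = f with zero Dirichlet boundaries, written in Python; it is not the HERACLES or RAMSES implementation.

    import numpy as np

    def smooth(u, f, h, iters=3):
        # Weighted-Jacobi relaxation (weight 2/3) of the discrete u'' = f.
        for _ in range(iters):
            u[1:-1] = (1.0 / 3.0) * u[1:-1] + (1.0 / 3.0) * (
                u[:-2] + u[2:] - h * h * f[1:-1])
        return u

    def residual(u, f, h):
        r = np.zeros_like(u)
        r[1:-1] = f[1:-1] - (u[:-2] - 2.0 * u[1:-1] + u[2:]) / (h * h)
        return r

    def restrict(r):
        # Full-weighting restriction onto every other grid point.
        rc = np.zeros((len(r) + 1) // 2)
        rc[1:-1] = 0.25 * r[1:-2:2] + 0.5 * r[2:-1:2] + 0.25 * r[3::2]
        return rc

    def prolong(ec, n_fine):
        # Linear interpolation of the coarse-grid correction.
        e = np.zeros(n_fine)
        e[::2] = ec
        e[1::2] = 0.5 * (ec[:-1] + ec[1:])
        return e

    def v_cycle(u, f, h):
        # Smooth, solve the residual equation on a coarser grid, correct,
        # smooth again; recursion down to a trivially solvable grid.
        if len(u) <= 3:                       # one unknown: solve exactly
            u[1] = -0.5 * h * h * f[1]
            return u
        u = smooth(u, f, h)
        rc = restrict(residual(u, f, h))
        ec = v_cycle(np.zeros_like(rc), rc, 2.0 * h)
        u += prolong(ec, len(u))
        return smooth(u, f, h)

    # Example on 2**7 + 1 points; each cycle reduces the error by a
    # roughly grid-independent factor, which is the point of multigrid.
    n = 2**7 + 1
    x = np.linspace(0.0, 1.0, n)
    f = np.sin(np.pi * x)                     # exact solution: -sin(pi*x)/pi**2
    u = np.zeros(n)
    for _ in range(10):
        u = v_cycle(u, f, x[1] - x[0])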
• Database development:
– the HORIZON Virtual Observatory, a relational database storing the results of the "galaxy formation" simulations (a schematic layout is sketched below):
• halos, sub-halos, galaxy catalogs
• merger trees
– ODALISC (Opacity Database for Astrophysics, Lasers experiments and Inertial Fusion Science), a database of opacities and equations of state useful to the astrophysics and plasma/laser interaction communities
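
To illustrate what a relational layout for halo catalogs and merger trees can look like (HORIZON's actual schema is not reproduced here; every table and column name below is hypothetical), a minimal sketch using Python's built-in sqlite3:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE halo (
            halo_id   INTEGER PRIMARY KEY,
            snapshot  INTEGER NOT NULL,   -- simulation output number
            mass      REAL,               -- e.g. in Msun/h
            x REAL, y REAL, z REAL,       -- comoving position
            parent_id INTEGER REFERENCES halo(halo_id)  -- NULL for a main
                                          -- halo, set for a sub-halo
        );
        -- A merger tree is the set of progenitor -> descendant links
        -- between halos of successive snapshots.
        CREATE TABLE merger_link (
            progenitor_id INTEGER REFERENCES halo(halo_id),
            descendant_id INTEGER REFERENCES halo(halo_id),
            PRIMARY KEY (progenitor_id, descendant_id)
        );
    """)

    # Example query: the progenitors of halo 42, most massive first.
    rows = conn.execute("""
        SELECT h.halo_id, h.mass
        FROM merger_link m JOIN halo h ON h.halo_id = m.progenitor_id
        WHERE m.descendant_id = 42
        ORDER BY h.mass DESC
    """).fetchall()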
The COAST Project software development pool
• Visualization: development of SDvision, the Saclay/DAPNIA Visualization Interface
– IDL Object Graphics framework
– interactive 3D navigation and analysis
– visualization of RAMSES, HERACLES, JUPITER, and ASH data
– visualization of complex scenes with scalar fields (volume projection, 3D isosurfaces, slices), vector fields (streamlines), and particle clouds (a projection sketch follows below)
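
Of these scalar-field renderings, the maximum-intensity projection (used below for the HERACLES density cube) is the easiest to sketch. This minimal numpy version is an illustration only, unrelated to the IDL implementation:

    import numpy as np

    def max_intensity_projection(field, axis=2):
        # Keep, for each line of sight, the largest value encountered
        # along the chosen axis; the result is a 2D image.
        return field.max(axis=axis)

    # Example: a Gaussian blob standing in for a simulated density cube.
    n = 128
    ax = np.linspace(-1.0, 1.0, n)
    X, Y, Z = np.meshgrid(ax, ax, ax, indexing="ij")
    density = np.exp(-(X**2 + Y**2 + Z**2) / 0.1)
    image = max_intensity_projection(np.log10(density + 1e-12))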
• Interactive visualization of huge datasets on desktops
• Example: MareNostrum output #97:
– 100 GB of data
– 2048 processors
• interactive selection of a subvolume (10% in each direction)
• data extraction through the Peano-Hilbert space-filling curve (illustrated below)
• projection of the AMR up to level 13 onto an 800³ Cartesian grid (4 GB of memory), suitable for interactive navigation
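
The sketch below shows why a Peano-Hilbert ordering makes such sub-volume extraction (and the MPI domain decomposition mentioned earlier) efficient: cells adjacent along the curve are adjacent in space, so a compact region maps to a few contiguous index ranges. For legibility this is the classic 2D construction; RAMSES uses the 3D curve.

    def _rot(n, x, y, rx, ry):
        # Rotate/flip a quadrant so each sub-square is traversed in the
        # orientation the Hilbert curve requires.
        if ry == 0:
            if rx == 1:
                x, y = n - 1 - x, n - 1 - y
            x, y = y, x
        return x, y

    def hilbert_index(n, x, y):
        # Index of cell (x, y) along the Hilbert curve filling an n x n
        # grid (n a power of two); the standard bitwise construction.
        d, s = 0, n // 2
        while s > 0:
            rx = 1 if x & s else 0
            ry = 1 if y & s else 0
            d += s * s * ((3 * rx) ^ ry)
            x, y = _rot(n, x, y, rx, ry)
            s //= 2
        return d

    # Example: order an 8 x 8 grid along the curve, then hand each of 4
    # hypothetical MPI ranks one contiguous chunk of the ordering.
    cells = sorted(((x, y) for x in range(8) for y in range(8)),
                   key=lambda c: hilbert_index(8, *c))
    chunks = [cells[i * 16:(i + 1) * 16] for i in range(4)]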
• HERACLES 1200×1200×1200, 256-processor simulation of turbulence in the interstellar medium (size ~20 pc)
• maximum-intensity projection of the density field
Navigation in the RAMSES AMR: synchronous spatial and resolution zooms
Visualization of temporal evolution: galaxy mergers
Highlights of recent COAST milestones
• The HORIZON Grand Challenge Simulation at CEA/CCRT on Platine:
– the largest N-body cosmological simulation ever, performed with RAMSES
– 6144 cores and 18 TB of RAM used for 2 months to simulate 70 billion particles
– used to simulate future weak-lensing surveys such as DUNE or LSST
• The HORIZON "galaxy formation" simulation at MareNostrum:
– 1024³ dark matter particles, 4 billion AMR cells, box size 50 Mpc/h, spatial resolution 2 kpc
– 2048 processors for computing, 64 processors dedicated to I/O; 3 weeks of computation so far, down to z = 1.9; 20 TB of data generated and stored
– from large-scale filaments to galactic discs
Highlights of a few recent publications
• About 40 refereed publications over 2006-2007. A few examples:
• In Astronomy & Astrophysics:
– “On the role of meridional flows in flux transport dynamo models”, L. Jouve
and A.S. Brun, A&A 474 (2007) 239
– “On the structure of the turbulent interstellar atomic hydrogen”, P. Hennebelle
and E. Audit, A&A 465 (2007) 431
– “Simulating planet migration in globally evolving disks”, A. Crida, A.
Morbidelli, and F. Masset, A&A 461 (2007) 1173
– “A high order Godunov scheme with constrained transport and adaptive
mesh refinement for astrophysical magnetohydrodynamics”, S. Fromang, P.
Hennebelle, R. Teyssier, A&A 457 (2006) 371
• In The Astrophysical Journal:
– “Simulations of turbulent convection in rotating young solarlike stars:
differential rotation and meridional circulation”, J. Ballot, A.S. Brun, and S.
Turck-Chièze, ApJ 669 (2007) 1190
– “On the migration of protogiant solid cores”, F. Masset, G. D’Angelo, and W.
Kley, ApJ 652 (2006) 730
– “Disk surface density transitions as protoplanet traps”, F. Masset, A.
Morbidelli, and A. Crida, ApJ 642 (2006) 478
• In Journal of Computational Physics:
– “Kinematic dynamos using constrained transport with high order Godunov
schemes and adaptive mesh refinement”, R. Teyssier, S. Fromang, and E.
Dormy, J. Comp. Phys. 218 (2006) 44
Publications in conferences
• Organization of the ASTRONUM-2007 conference "Numerical Modeling of Space Plasma Flows" in Paris, June 11-15, 2007; 80 participants
– 5 presentations by COAST members
• Supercomputing Conferences:
– SC06 (Tampa), ISC07 (Dresden), SC07 (Reno)
• Visualization Conferences:
– CGIV07 Computer Graphics, Imaging and Visualization, Bangkok, August 2007, IEEE Computer Society
– International Workshop on Visualization of High-resolution 3D Turbulent Flows, École Normale Supérieure, Paris, June 2007
• Computational Physics
– CCP2006 (Gyeongju, South Korea), CCP2007 (Brussels)
– ASTRONUM-2006 (first edition, Palm Springs)
• Modeling and Simulation
– MSO2006, Botswana, September 2006
– EUROSIM2007, Ljubljana, September 2007
• Software
– ADASS XVI (Tucson, 2006)
• Astrophysics:
– “Protostars and planets V”, IAU Symposia, …
External funding for the COAST Project
• Successful applications to the ANR (Agence Nationale de la Recherche):
– HORIZON
• the objective is to federate numerical simulation activities within a program focused on galaxy and large-scale structure formation
• budget = 500 k€
• DAPNIA leadership
– SYNERGHY
• a cross-disciplinary project focusing on simulations in astrophysics, hot
dense matter and inertial confinement fusion
• budget = 600 k€
• DAPNIA leadership
– MAGNET
• development of MHD numerical codes, and study of the generation and structure of magnetic fields in astrophysics
• budget = 400 k€
Perspectives for the COAST Project
• Computational astrophysics has a bright future, building on the ever-increasing performance of massively parallel mainframes
• Recipe for success: synergy between astrophysicists, software developers, local computing resources, and access to supercomputers
• Many similar projects and initiatives are competing; a few examples:
– FLASH Center at the U. of Chicago, organized in 6 groups, 41 members (Year 9 activities report, 2006): code (6), computational physics and validation (3), astrophysics (15), computer science (7), visualization (3), basic science (7)
– ASTROSIM European Network for Computational Astrophysics: 12 member organizations
– Applied Numerical Algorithms Group at Lawrence Berkeley, home of the Chombo Adaptive Mesh Refinement library and the ChomboVis visualization tool
– Laboratory for Computational Science and Engineering, U. of Minnesota
– VIRGO Consortium for Cosmological Supercomputer Simulations: 20-25 scientists, heavy hardware resources at Durham (792 Opteron CPUs + 500 UltraSPARC processors) and Garching (816 POWER4 processors)
• To keep pace in this competition, the COAST Project needs:
– adequate local computing resources for development and post-processing: typically 32 processors per permanent scientist => 256 processors (versus 64 currently)
– additional strength in computer science (cluster management), data handling & visualization, and computational physics and validation
Backup slides
RAMSES: parallel graded octree AMR
The code is freely available, including the MHD extension.
Domain decomposition using space-filling curves
Fully Threaded Tree (Khokhlov 98)
Cartesian mesh refined on a cell-by-cell basis
octs: small grids of 8 cells, each pointing towards:
• 1 parent cell
• 6 neighboring parent cells
• 8 child octs
Coarse-fine boundaries: buffer zone two cells thick
Time integration using recursive sub-cycling (see the structural sketch below)
Parallel computing using the MPI library with a
domain decomposition based on the Peano-Hilbert
curve.
Algorithm inspired by TREE codes:
locally essential tree.
Tested and operational up to 6144 cores.
Scaling depends on problem size and complexity.
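
A structural sketch of the two ideas above: the oct with its Fully Threaded Tree pointers, and recursive time sub-cycling. This is schematic Python, not the actual F90 data layout; advance_one_level is a hypothetical placeholder for the per-level hydro update.

    class Oct:
        # One oct: a group of 2**3 = 8 sibling cells, carrying the
        # pointers of the Fully Threaded Tree described above.
        def __init__(self, level, parent_cell=None):
            self.level = level
            self.parent_cell = parent_cell            # 1 parent cell
            self.neighbor_parent_cells = [None] * 6   # 6 neighboring parent cells
            self.children = [None] * 8                # up to 8 child octs
            self.cells = [0.0] * 8                    # one value per cell

    def advance_one_level(oct_, dt):
        # Placeholder: update this oct's 8 cells over a time step dt.
        pass

    def subcycle(oct_, dt):
        # Recursive sub-cycling: each finer level takes two steps of half
        # the parent's time step, so every level meets its own CFL limit.
        advance_one_level(oct_, dt)
        for child in oct_.children:
            if child is not None:
                subcycle(child, dt / 2.0)
                subcycle(child, dt / 2.0)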
The AMR "Octree" data structure of the RAMSES code
[Figure: the octree mesh shown at refinement levels 2, 3, 5, 9, 11, and 14]
The basic element of the AMR structure is a group of 2^dim sibling cells called an "oct".
The RAMSES AMR
Levels 9 to 14: 4.1×10⁷ cells
A formal resolution of 2¹³ = 8192 cells in each direction is reached, amounting to a total of 8192³ ≈ 5.5×10¹¹ cells.
Thanks to this dynamic range, physical processes at very different scales are treated, from large-scale gravitational interaction to star formation in galaxies.