
PRISM Coupling and I/O System
The 5th International Workshop on
Next Generation Climate Models for
Advanced High Performance
Computing Facilities
March 3-5, 2003
G. Berti, P. Bourcier, A. Caubel, D. Declat, M.-A. Foujols, J. Latour, S. Legutke,
J. Polcher, R. Redler, H. Ritzdorf, T. Schoenemeyer, S. Valcke and R. Vogelsang
Presentation outline
1 – PRISM General Presentation
2 - PRISM Goals
3 - PRISM Scientific and Technical Standards
4 - PRISM Coupling and I/O System
5 - PRISM first coupler: Oasis 3.0
Slide 2
1 – PRISM General Presentation
- PRISM: PRogram for Integrated Earth System Modelling
- A European project, started December 2001,
funded by the European Commission (4.8 M€)
- Coordinators:
- Guy Brasseur (MPI, Hamburg)
- Gerbrand Komen (KNMI, Amsterdam)
- PRISM Director: Reinhard Budich (MPI)
Slide 3
1 – PRISM General Presentation
=> 22 partners: leading climate research institutes
and computer vendors
• MPG-IMET, Germany
• KNMI, Netherlands
• MPI-MAD
• Met-Office, UK
• UREADMY, UK
• IPSL, France
• Météo-France, France
• CERFACS, France
• DMI, Denmark
• SHMI, Sweden
• NERSC, Norway
• ETH Zurich, Switzerland
• ING, Italy
• MPI-BGC, Germany
• PIK, Germany
• ECMWF, Europe
• UCL-ASTR, Belgium
• NEC Europe
• FECIT/Fujitsu
• SGI Europe
• SUN Europe
Slide 4
2 - PRISM Goals
Help climate modellers spend more time on science:
- Provide software infrastructure to easily assemble Earth system coupled models based on existing state-of-the-art European component models, and to launch/monitor complex/ensemble Earth system simulations
- Define and promote technical and scientific standards for Earth System modelling
- Undertake a pilot infrastructure project toward the longer-term establishment of a European Climate and Earth System Modelling Supercomputer Facility
Slide 5
2 - PRISM Goals
Atmosphere:
• Météo-France (ARPEGE), MPG-IMET (ECHAM), IPSL (LMDZ), MetOffice (Unified Model), UREADMY, INGV
Atmospheric Chemistry:
• MPG-IMET, UREADMY, IPSL, MetOffice, Météo-France, KNMI
Land Surface:
• IPSL (Orchidée), MetOffice, MPG-IMET, UREADMY, Météo-France (ISBA)
Ocean:
• UREADMY, MetOffice (FOAM), MPI-M (HOPE), IPSL (OPA/ORCA)
Sea Ice:
• NERSC, UCL-ASTR, MetOffice, IPSL, MPG-IMET
Ocean Biogeochemistry:
• MPI-BGC, IPSL, MPG-IMET, MetOffice
Regional Climate:
• SHMI, DMI, MetOffice
Coupler:
• CERFACS, NEC, CCRLE, FECIT, SGI, MPI-MAD
Slide 6
3 - PRISM scientific and technical standards
Scientific standards:
• Physical interfaces
• Global Earth System parameters
Technical standards:
• Architecture and User Interface
• Coupler and I/O
• Data and grid format
• Coding and quality
Interaction with other groups (ESMF, ESG/NOMADS, CF...)
Slide 7
4- PRISM Coupling and I/O System
Keynotes - PSMILe – Driver - Transformer
PRISM Coupling and I/O System =
Model Interface Library (PSMILe) + Driver+ Transformer
- Programming languages: Fortran 90 and C
- All software produced is open source
- The coupling and I/O system uses freely available open-source external libraries, hidden from the model developer (MPICH, LAM-MPI, NetCDF, libXML, mpp_io, SCRIP, …)
- Vendors may provide optimized versions of specific independent software components (e.g. MPI, NetCDF)
Slide 8
4 - PRISM Coupling and I/O System
Keynotes - PSMILe – Driver - Transformer
• Ensures data coupling exchanges between any two component models, either directly or via the Transformer, including repartitioning with MPI1 or MPI2
• Ensures data input/output from/to files
=> The switch between coupling exchanges and input/output is:
• indicated by the user in a coupling configuration file (XML)
• transparent for the component model
• managed automatically by the PSMILe
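The XML coupling configuration file itself is not shown in the slides; as a rough sketch of what such a file could express, the fragment below routes one field between two models and another field to/from a file. Every element and attribute name here is invented for illustration, not the real PRISM schema:

```xml
<!-- Hypothetical sketch only: the actual PRISM/PSMILe XML schema is not
     given in the slides, so every name below is illustrative. -->
<coupling_configuration>
  <!-- A coupling exchange: SST goes from the ocean to the atmosphere -->
  <field name="SST" source="ocean" target="atmosphere" period="3600"/>
  <!-- The same mechanism can bind a field to a file instead;
       the component model code is unchanged either way. -->
  <field name="runoff" source="file:runoff.nc" target="ocean" period="86400"/>
</coupling_configuration>
```

In both cases the component model calls the same PSMILe routines; whether a call results in a coupling exchange or in file I/O is decided purely by this configuration.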
[Diagram: component models ("OB C" boxes) exchanging coupling fields either directly, via the Transformer (T) between partitions O1/O2, or from/to a file, all through the same PSMILe interface]
Slide 9
4 - PRISM Coupling and I/O System
Keynotes - PSMILe – Driver - Transformer
• Concise and extendable interface
• Coupling data produced by a source model can be consumed by more than one target model, each at a different frequency.
• Coupling data may be only partially consumed by the target model (extraction of subspaces, hyper-slabs or indexed grid points).
• Local operations on the coupling data can be performed before the exchange.
Slide 10
4 - PRISM Coupling and I/O System
Keynotes - PSMILe – Driver - Transformer
Different classes of primitives ensuring:
• The Startup phase
- Initialization: prism_init(…), …
• The Definition phase
- Definition of grids: prism_def_grid(…), prism_set_mask(…), …
- Definition of data: prism_def_var(…)
- Termination of definition: prism_enddef(…)
• The Transfer of fields
- Exchange or I/O of data: prism_put(…), prism_get(…)
• The Termination phase
- Termination of the PRISM application: prism_terminate(…)
Slide 11
4 - PRISM Coupling and I/O System
Keynotes - PSMILe – Driver - Transformer
• Launching of component models:
- static: all at beginning of simulation (MPMD or SPMD)
• Distribution of global information:
- initial date, calendar, coupling parameters, etc.
• Component model monitoring:
- centralises all logging messages sent by the models
during the simulation (timestep, restart saved, etc.)
Slide 12
4 - PRISM Coupling and I/O System
Keynotes - PSMILe – Driver - Transformer
Different transformations:
• Spatial interpolation: nearest-neighbour, bilinear, bicubic, conservative remapping, …
• Other spatial transformations: flux correction, merging, etc.
• General algebraic operations
• Coupling data produced by one model may have to be combined before the exchange
On different types of fields:
• 1D, 2D, 3D, 4D; scalar or vector; static or dynamic structure
On different types of grids:
• regular, gaussian, stretched, reduced, unstructured, etc.
Slide 13
[Diagram: configuring a coupled run in three phases.
- Definition phase: for each model Mi, a PMIOD describes its potential inputs and outputs with their metadata (e.g. Mi PMIOD: V1: out, V2: out, V3: in; Mj PMIOD: V1: in, V4: out; Mk PMIOD: V4: in, V5: in).
- Composition phase: the user writes the SCC connecting fields between models (e.g. V1: Mi -> Mj, Tli, Tnlij; V2: Mi -> Mj, Tij (+ V6); V4: Mj -> Mk) and, per model, an SMIOC binding the remaining fields to files (e.g. Mi SMIOC: V3: in, fileV3, Tli; Mk SMIOC: V5: in, fileV5, TnlV5k).
- Deployment phase: the models Mi, Mj, Mk, the Transformer T and the files (fileV3, fileV5, fileV6) exchange the fields as configured.
Legend: Mi: Model i; T: Transformer; PMIOD: Potential Model Input and Output Description; SMIOC: Specific Model Input and Output Configuration; SCC: Specific Coupling Configuration]
Slide 14
5 - PRISM First Coupler: Oasis 3.0
=> Available for beta testers
• New PRISM System Model Interface (PSMILe V.0):
- Uses MPI1 or MPI2 and conforms with the final PRISM coupler interface
- Direct communication between models with the same grid and partitioning
- Modularity: prism_put and prism_get may be called at each model time step; the exchange is performed or not by the PSMILe, depending on the user's specifications in the namcouple file
- Automatic time integration by the PSMILe, depending on the user's specification
- I/O, and combined I/O and coupling, functionalities
• New interpolations / interfacing with the SCRIP library:
- 1st and 2nd order conservative remapping for all grids
- Bilinear and bicubic interpolation for "logically-rectangular" grids
- Bilinear and bicubic interpolation for reduced atmospheric grids
• F90 rewriting (dynamic memory allocation)
• NetCDF format for grid and restart auxiliary files
Slide 15
Contact us:
http://www.enes.org
[email protected]
[email protected]
[email protected]
[email protected]
[email protected]
[email protected]
Slide 16