
The PRISM infrastructure for Earth system models
Eric Guilyardi, CGAM/IPSL
and the PRISM Team
• Background and drivers
• PRISM project achievements
• The future
The PRISM infrastructure
Why a common software infrastructure?
• Earth system modelling expertise widely distributed
– Geographically
– Thematically
– Scientific motivation: facilitate sharing of scientific expertise and of models
– Technical motivation: the technical challenges are large compared with the available effort
• Need to keep scientific diversity while increasing efficiency, both scientific and technical
• Need for a concerted effort in view of initiatives elsewhere:
– The Frontier Project, Japan
– The Earth System Modelling Framework, US
The PRISM infrastructure
PRISM concept
"Share Earth System Modelling software infrastructure across the community"
To:
• share development, maintenance and support
• aid performance on a variety of platforms
• standardize the model software environment
• ease use of different climate model components
The PRISM infrastructure
Expected benefits
• High-performance ESM software, developed by dedicated IT experts, available to institutes/teams at low cost:
- helps scientists to focus on science
- helps keep scientific diversity (survival of smaller groups)
• Easier to assemble ESMs based on community models
• shared infrastructure = increased scientific exchanges
• computer manufacturers inclined to contribute:
- efficiency (porting, optimisation) on variety of platforms
- next generation platforms optimized for ESM needs
- easier procurements and benchmarking
- reduced computing costs
The PRISM infrastructure
Software structure of an Earth System Model
[Diagram: running environment, coupling infrastructure, scientific core, supporting software]
The PRISM infrastructure
The long term view: towards standard ESM support library(ies)
[Diagram comparing Today and Tomorrow. Today: the Earth System model (science + support + environment) is shared as f90 code and sits directly on the Fortran compiler and hardware. Tomorrow: the modeller's climate science work (the model science alone) sits on a standard support library, including the environment (the PRISM infrastructure), maintained by IT experts, on top of the Fortran compiler and hardware.]
The PRISM project
• PRogram for Integrated Earth System Modelling
– 22 partners
– 3 years, Dec 2001 to Nov 2004
– 5 million € funding under FP5 of the EC (~80 person-years)
– Coordinators: G. Brasseur and G. Komen
The PRISM infrastructure
System specifications
The science:
- General principles
- Constraints from physical interfaces, …
The modelers/users:
- Requirements
- Beta testing
- Feedback
PRISM infrastructure
The technical developments:
- Coupler and I/O
- Compile/run environment
- GUI
- Visualisation and diagnostics
The community models
- Atmosphere
- Atmos. Chemistry
- Ocean
- Ocean biogeochemistry
- Sea-ice
- Land surface
-…
Let’s NOT re-invent the wheel!
The PRISM infrastructure
System specifications - the people
Reinhard Budich - MPI, Hamburg
Andrea Carril - INGV, Bologna
Mick Carter - Hadley Centre, Exeter
Patrice Constanza - MPI/M&D, Hamburg
Jérome Cuny - UCL, Louvain-la-Neuve
Damien Declat - CERFACS, Toulouse
Ralf Döscher - SMHI, Stockholm
Thierry Fichefet - UCL, Louvain-la-Neuve
Marie-Alice Foujols - IPSL, Paris
Veronika Gayler - MPI/M&D, Hamburg
The PRISM infrastructure
* Chair
Eric Guilyardi* - CGAM, Reading and LSCE
Rosalyn Hatcher - Hadley Centre, Exeter
Miles Kastowsky - MPI/BGC, Jena
Luis Kornblueh - MPI, Hamburg
Claes Larsson - ECMWF, Reading
Stefanie Legutke - MPI/M&D, Hamburg
Corinne Le Quéré - MPI/BGC, Jena
Angelo Mangili - CSCS, Zurich
Anne de Montety - UCL, Louvain-la-Neuve
Serge Planton - Météo-France, Toulouse
Jan Polcher - LMD/IPSL, Paris
René Redler - NEC CCRLE, Sankt Augustin
Martin Stendel - DMI, Copenhagen
Sophie Valcke - CERFACS, Toulouse
Peter van Velthoven - KNMI, De Bilt
Reiner Vogelsang - SGI, Grasbrunn
Nils Wedi - ECMWF, Reading
PRISM achievements (so far):
• Software environment (the tool box):
1. a standard coupler and I/O software, OASIS3 (CERFACS) and OASIS4
2. a standard compiling environment (SCE) at the scripting level
3. a standard running environment (SRE) at the scripting level
4. a Graphical User Interface (GUI) to the SCE (PrepIFS, ECMWF)
5. a GUI to the SRE for monitoring the coupled model run (SMS, ECMWF)
6. standard diagnostic and visualisation tools
• Adaptation of community Earth System component models
(GCMs) and demonstration coupled configurations
• A well co-ordinated network of expertise
• Community buy-in and trust-building
The PRISM infrastructure
The PRISM shells
[Diagram: outer shells = Standard Running Environment, Standard Compile Environment, PSMILe (coupling and I/O); inner shell = the scientific core with its historic I/O]
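The boundary between the inner shell and the outer shells is where a component calls the PSMILe. As a minimal sketch, assuming the OASIS3 PSMILe prism_* Fortran interface (module and routine names should be checked against the actual release; the component name 'toyatm', the field name 'SSTOCEAN' and the 96x72 grid are illustrative, not from the slides), a component's definition phase might look roughly like this:

  program toyatm_psmile_init
    use mod_prism_proto                   ! OASIS3 PSMILe constants (PRISM_Out, PRISM_Real, ...)
    use mod_prism_def_partition_proto     ! prism_def_partition_proto
    implicit none

    integer :: comp_id, local_comm, part_id, var_id, ierr
    integer :: ig_paral(3), var_nodims(2), var_shape(4)

    ! Declare this component to the coupler and retrieve its local communicator
    call prism_init_comp_proto(comp_id, 'toyatm', ierr)
    call prism_get_localcomm_proto(local_comm, ierr)

    ! Describe how this process holds the coupling grid (serial case, 96x72 points)
    ig_paral(1) = 0                       ! serial partition
    ig_paral(2) = 0                       ! offset in the global grid
    ig_paral(3) = 96*72                   ! number of local (= global) points
    call prism_def_partition_proto(part_id, ig_paral, ierr)

    ! Declare one coupling field produced by this component
    var_nodims(1) = 2                     ! rank of the field
    var_nodims(2) = 1                     ! number of bundles
    var_shape = (/ 1, 96, 1, 72 /)        ! min/max index in each dimension
    call prism_def_var_proto(var_id, 'SSTOCEAN', part_id, var_nodims, &
                             PRISM_Out, var_shape, PRISM_Real, ierr)

    ! End of the definition phase; the time loop (prism_put/prism_get) follows
    call prism_enddef_proto(ierr)
    ! ... time loop, then:
    call prism_terminate_proto(ierr)
  end program toyatm_psmile_init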
The PRISM infrastructure
Adapting Earth System Components to PRISM
[Diagram: levels of adaptation, from the PRISM Model Interface Library (PSMILe) and the Potential Model I/O Description (PMIOD), through the Standard Compile Environment (SCE) and Standard Running Environment (SRE), to the User Interface]
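To make the innermost level of adaptation concrete, here is a minimal sketch of a component time loop once the historic I/O of coupling fields has been replaced by PSMILe exchanges. It assumes the OASIS3 prism_put_proto/prism_get_proto calls and the field ids returned by the definition phase above; the subroutine name, field names, grid size and time stepping are illustrative only, not taken from the slides.

  subroutine coupled_time_loop(var_id_flux, var_id_sst)
    ! Time loop of an adapted component: the coupling fields are no longer read
    ! from or written to files (historic I/O) but exchanged through the PSMILe.
    use mod_prism_put_proto               ! prism_put_proto
    use mod_prism_get_proto               ! prism_get_proto
    implicit none

    integer, intent(in) :: var_id_flux, var_id_sst   ! ids from prism_def_var_proto
    integer             :: istep, isec, ierr
    integer, parameter  :: nsteps = 48, dt = 1800    ! illustrative run length / time step
    real                :: sst(96,72), flux(96,72)

    sst = 0.0                             ! placeholder initial state

    do istep = 0, nsteps - 1
       isec = istep * dt                  ! model time in seconds since the run start

       ! Receive the surface flux from the other component; whether data is really
       ! exchanged at this date is decided by the coupler configuration, not here
       call prism_get_proto(var_id_flux, isec, flux, ierr)

       ! ... scientific core: advance the model state using flux ...

       ! Send the SST to the coupler; regridding and other transformations are
       ! handled outside the component, by the coupler Driver/Transformer
       call prism_put_proto(var_id_sst, isec, sst, ierr)
    end do
  end subroutine coupled_time_loop

In OASIS-style coupling, whether a given call actually exchanges data, triggers an interpolation, or does nothing at all is decided by the coupler configuration rather than by the component code, so the scientific core stays free of coupling logic.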
The PRISM infrastructure
Configuration management and deployment
[Diagram: configuration management and deployment linking the User Interface, PSMILe + PMIOD, the SCE and SRE, the coupler Driver and Transformer, and the resulting binary executables]
The PRISM infrastructure
PRISM GUI remote functionality
[Diagram: a user connects over the Internet to the PRISM repositories (CSCS, MPI) and to instrumented sites in order to configure and deploy experiments; each site runs its own Driver and Transformer, with local disks, on architectures A, B and C]
The PRISM infrastructure
PrepIFS/SMS web services
Standard scripting environments:
• Standard Compiling Environment (SCE)
• Standard Running Environment (SRE)
The PRISM infrastructure
Data processing and visualisation
The PRISM infrastructure
Demonstration experiments
[Table: coupled models assembled and the platforms they run on; CGAM contribution by Jeff Cole]
The PRISM infrastructure
Development coordinators
• The coupler and I/O - Sophie Valcke (CERFACS)
• The standard environments - Stefanie Legutke (MPI)
• The user interface and web services - Claes Larsson (ECMWF)
• Analysis and visualisation - Mick Carter (Hadley Centre)
• The assembled models - Stefanie Legutke (MPI)
• The demonstration experiments - Andrea Carril (INGV)
The PRISM infrastructure
Community buy-in
• Growing!
– Workshops and seminars
– 15 pioneer models adapted (institutes' involvement)
– 9 test supercomputers instrumented
– Models distributed under the PRISM environment (ECHAM5, OPA 9.0)
– Community programmes relying on the PRISM framework (ENSEMBLES, COSMOS, MERSEA, GMES, NERC, …)
• To go further:
– PRISM perspective: maintain and develop the tool box
– Institute perspective: get the timing and involvement in the next steps right
The PRISM infrastructure
Collaborations
Active collaborations:
• ESMF (supporting software, PMIOD, MOM4)
• FLUME (PRISM software)
• PCMDI (visualisation, PMIOD)
• CF group (CF names)
• NERC (BADC & CGAM) (meta-data, PMIOD)
• M&D, MPI (data)
• Earth Simulator (install PRISM system V.0)
[Diagram repeated: running environment, coupling infrastructure, scientific core, supporting software]
PRISM has put Europe in the loop for community-wide
convergence on basic standards in ES modelling
The PRISM infrastructure
PRISM Final Project Meeting
De Bilt, October 7-8, 2004
The future
• PRISM has delivered a tool box, a network of
expertise and demonstrations
• Community buy-in growing
• Key need for sustainability of
– tool box maintenance/development (new features)
– network of expertise
PRISM sustained initiative
Set-up meeting held in Paris, Oct 27, 2004
The PRISM infrastructure