NSF NCAR / NASA GSFC / DOE LANL ANL / NOAA NCEP GFDL / MIT / U MICH
Introduction to the Earth System Modeling Framework
[Title graphic: ESMF spans climate, weather, and data assimilation applications, including NASA GSFC PSAS, NCAR/LANL CCSM, MITgcm, the GFDL FMS suite, the NCEP forecast system, and the NSIPP seasonal forecast system]
C. DeLuca/NCAR, J. Anderson/NCAR, V. Balaji/GFDL, B. Boville/NCAR, N. Collins/NCAR,
T. Craig/NCAR, C. Cruz/GSFC, A. da Silva/GSFC, R. Hallberg/GFDL, C. Hill/MIT, M. Iredell/NCEP,
R. Jacob/ANL, P. Jones/LANL, B. Kauffman/NCAR, J. Larson/ANL, J. Michalakes/NCAR,
E. Schwab/NCAR, S. Smithline/GFDL, Q. Stout/U Mich, M. Suarez/GSFC, A. Trayanov/GSFC,
S. Vasquez/NCAR, J. Wolfe/NCAR, W. Yang/NCEP, M. Young/NCEP and L. Zaslavsky/GSFC
Outline
• ESMF Overview
• ESMF Applications
• Related Projects
• ESMF Architecture
• Timeline
Background
NASA’s Earth Science Technology Office proposed the creation of an
Earth System Modeling Framework (ESMF) in the September 2000 NASA
Cooperative Agreement Notice:
“Increasing Interoperability and Performance of Grand Challenge
Applications in the Earth, Space, Life and Microgravity Sciences”
A large interagency collaboration with roots in the Common Modeling
Infrastructure Working Group proposed three interlinked projects to develop
and deploy the ESMF, all of which were funded:
Part I: Core ESMF Development (PI: Killeen, NCAR)
Part II: Modeling Applications (PI: Marshall, MIT)
Part III: Data Assimilation Applications (PI: da Silva, NASA GMAO)
Motivation
In climate research and NWP...
increased emphasis on detailed representation of individual physical processes requires many teams of specialists to contribute components to an overall modeling system
In computing technology...
increased hardware and software complexity in high-performance computing, as we shift toward the use of scalable computing architectures
In software...
development of frameworks, such as FMS, GEMS, CCA and WRF, that encourage software reuse and interoperability

The ESMF is a focused community effort to tame the complexity of models and the computing environment. It leverages, unifies and extends existing software frameworks, creating new opportunities for scientific contribution and collaboration.
ESMF Project Description
GOALS: To increase software reuse, interoperability, ease of use and performance portability in
climate, weather, and data assimilation applications
PRODUCTS:
• Core framework: Software for coupling geophysical components and utilities for building
components
• Applications: Deployment of the ESMF in 15 of the nation’s leading climate and weather
models, assembly of 8 new science-motivated applications
METRICS:
• Reuse: 15 applications use ESMF component coupling services and 3+ utilities
• Interoperability: 8 new applications comprised of never-before-coupled components
• Ease of Adoption: 2 codes adopt ESMF with < 2% of lines of code changed, or within 120 FTE-hours
• Performance: no more than 10% overhead in time to solution, no degradation in scaling
RESOURCES and TIMELINE: $9.8M over 3 years
ESMF Accelerates Advances
in Earth System Science
Eliminates software barriers to collaboration among organizations
• Easy exchange of model components accelerates progress in NWP and climate
modeling
• Independently developed models and data assimilation methods can be combined and
tested
• Coupled model development becomes a truly distributed process
• Advances from smaller academic groups easily adopted by large modeling centers
Facilitates development of new interdisciplinary collaborations
• Simplifies extension of climate models to upper atmosphere
• Accelerates inclusion of advanced biogeochemical components into climate models
• Develops clear path for many other communities to use, improve, and extend climate
models
• Many new model components gain easy access to power of data assimilation
ESMF Collaborators
NSF NCAR
Tim Killeen, PI
Jeff Anderson
Byron Boville
Nancy Collins
Cecelia DeLuca
Roberta Johnson
Al Kellie
John Michalakes
David Neckels
Earl Schwab
Robbie Staufer
Silverio Vasquez
Jon Wolfe
DOE ANL
Rob Jacob
Jay Larson
NOAA GFDL
Ants Leetmaa
V. Balaji
Robert Hallberg
Shep Smithline
NASA GMAO
Arlindo da Silva, PI
Michele Rienecker
Max Suarez
Atanas Trayanov
Christian Keppenne
Leonid Zaslavsky
Will Sawyer
Carlos Cruz
NOAA NCEP
Stephen Lord
Mark Iredell
Mike Young
Weiyu Yang
John Derber
MIT
John Marshall, PI
Chris Hill
DOE LANL
Phil Jones
University of Michigan
Quentin Stout
Outline
• ESMF Overview
• ESMF Applications
• Related Projects
• ESMF Architecture
• Timeline
Modeling Applications

GFDL:
• FMS B-grid atmosphere at N45L18
• FMS spectral atmosphere at T63L18
• FMS MOM4 ocean model at 2°x2°L40
• FMS HIM isopycnal C-language ocean model at 1/6°x1/6°L22
MIT:
• MITgcm coupled atmosphere/ocean at 2.8°x2.8°, atmosphere L5, ocean L15
• MITgcm regional and global ocean at 15kmL30
GMAO/NSIPP:
• NSIPP atmospheric GCM at 2°x2.5°L34 coupled with NSIPP ocean GCM at 2/3°x1.25°L20
NCAR/LANL:
• CCSM2, including CAM with Eulerian spectral dynamics and CLM at T42L26, coupled with POP ocean and data ice model at 1°x1°L40
Data Assimilation Applications

GMAO/DAO:
• PSAS interface layer with 200K observations/day
• CAM with finite-volume dynamics at 2°x2.5°L55, including CLM
NCEP:
• Global atmospheric spectral model at T170L42
• SSI analysis system with 250K observations/day, 2 tracers
• WRF regional atmospheric model at 22 km resolution, CONUS forecast, 345x569L50
GMAO/NSIPP:
• ODAS with OI analysis system at 1.25°x1.25°L20 resolution with ~10K observations/day
MIT:
• MITgcm 2.8° century/millennium adjoint sensitivity
Interoperability Experiments:
8 New Applications

• GFDL B-grid atm / MITgcm ocn: global biogeochemistry (CO2, O2), SI timescales
• GFDL MOM4 / NCEP forecast: NCEP seasonal forecasting system
• NSIPP ocean / LANL CICE: sea ice model for extension of SI system to centennial timescales
• NSIPP atm / DAO analysis: assimilated initial state for SI
• DAO analysis / NCEP model: intercomparison of systems for NASA/NOAA joint center for satellite data assimilation
• DAO CAM-fv / NCEP analysis: intercomparison of systems for NASA/NOAA joint center for satellite data assimilation
• NCAR CAM Eul / MITgcm ocn: improved climate predictive capability (climate sensitivity to large component interchange, optimized initial conditions)
• NCEP WRF / GFDL MOM4: development of hurricane prediction capability
Outline
• ESMF Overview
• ESMF Applications
• Related Projects
• ESMF Architecture
• Timeline
Related Projects

• PRISM is an ongoing European Earth system modeling infrastructure project
• Involves current state-of-the-art atmosphere, ocean, sea-ice, atmospheric chemistry, land-surface and ocean-biogeochemistry models
• 22 partners: leading climate researchers and computer vendors, including MPI, KNMI, UK Met Office, CERFACS, ECMWF, DMI
• ESMF is working with PRISM to merge frameworks and develop common conventions; for joint use with PRISM, ESMF developed a component database to store component import/export fields and component descriptions

• CCA is creating a minimal interface and sets of tools for linking high-performance components; CCA can be used to implement frameworks and standards developed in specific domains (such as ESMF)
• Collaborators include LANL, ANL, LLNL, ORNL, Sandia, the University of Tennessee, and many more; ongoing ESMF collaboration with CCA/LANL on language interoperability
• Working prototype demonstrating CCA/ESMF interoperability, to be presented at SC2003
ESMF Connections

[Diagram: ESMF at the center, connected to PRISM, WRF, CCA, FMS, SWMF, SciDAC, CCSM, and GEMS; points of contact include Larson/ANL, DeLuca/NCAR-SCD, Jones/LANL, Stout/U Mich, Killeen/NCAR, Drake/ORNL, Boville/NCAR-CGD, Michalakes/NCAR-MMM, Suarez/NASA Goddard, and Balaji/NOAA GFDL]

ESMF: Earth System Modeling Framework
CCA: DOE Common Component Architecture
SciDAC: DOE/NSF CCSM SciDAC Project
GEMS: Goddard Earth Modeling System
FMS: GFDL Flexible Modeling System
SWMF: Space Weather Modeling Framework
WRF: Weather Research and Forecast Model
CCSM: Community Climate System Model
PRISM: Program for Integrated Earth System Modeling
Outline
• ESMF Overview
• ESMF Applications
• Related Projects
• ESMF Architecture
• Timeline
Computational Characteristics of Weather/Climate Platforms
• Mix of global transforms and local communications
• Load balancing for diurnal cycle, event (e.g. storm) tracking
• Applications typically require 10s of GFLOPS and 100s of PEs, but can go to 10s of TFLOPS and 1000s of PEs
• Required Unix/Linux platforms span laptop to Earth Simulator
• Multi-component applications: component hierarchies, ensembles, and exchanges
• Data and grid transformations between components
• Applications may be MPMD/SPMD, concurrent/sequential, or combinations
• Parallelization via MPI, OpenMP, shmem, or combinations
• Large applications (typically 100,000+ lines of source code)

[Diagram: a seasonal forecast application as a component hierarchy, with a coupler joining ocean, sea ice, and assim_atm; assim_atm contains assim and atmland; atmland contains atm and land; atm contains physics and dycore]
Architecture
1. ESMF provides an environment for assembling geophysical components into applications.
2. ESMF provides a toolkit that components use to
   i. increase interoperability
   ii. improve performance portability
   iii. abstract common services

[Diagram: the ESMF "sandwich" architecture. ESMF Superstructure: Components Layer (Gridded Components, Coupler Components). User Code: Model Layer. ESMF Infrastructure: Fields and Grids Layer and Low Level Utilities, over External Libraries (BLAS, MPI, NetCDF, …)]
Becoming an ESMF Component
• Pack model import and export data into ESMF data structures and conform to a standard calendar. Use ESMF utilities internally as desired. Organize the model using the standard ESMF methods Initialize, Run, Finalize, ReadRestart, and WriteRestart. Methods may be multi-phase (Run phase=1, Run phase=2). Method interfaces are prescribed.
• Instantiate an ESMF Component with name, type, and config information. Register the standard model methods with the Component (see the sketch below). If desired, register data.
• Use the ESMF AppDriver to sequence and run Components.
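For concreteness, a minimal Fortran sketch of this registration pattern. The names follow ESMF conventions (ESMF_GridCompSetEntryPoint, the prescribed five-argument method interface), but treat the details as illustrative rather than exact: signatures varied across early releases, and the component module my_atm_mod is hypothetical.

! Illustrative only: names approximate ESMF conventions; exact
! signatures varied across releases. my_atm_mod is hypothetical.
module my_atm_mod
  use ESMF
  implicit none
contains
  subroutine SetServices(gcomp, rc)
    type(ESMF_GridComp)  :: gcomp
    integer, intent(out) :: rc
    ! Register the standard methods with the Component; a multi-phase
    ! Run would be registered twice, with phase=1 and phase=2.
    call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_INITIALIZE, userRoutine=my_init,  rc=rc)
    call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_RUN,        userRoutine=my_run,   rc=rc)
    call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_FINALIZE,   userRoutine=my_final, rc=rc)
  end subroutine

  ! Every registered method uses the same prescribed interface.
  subroutine my_init(gcomp, importState, exportState, clock, rc)
    type(ESMF_GridComp)  :: gcomp
    type(ESMF_State)     :: importState, exportState
    type(ESMF_Clock)     :: clock
    integer, intent(out) :: rc
    rc = ESMF_SUCCESS   ! pack the model's export data into exportState here
  end subroutine

  subroutine my_run(gcomp, importState, exportState, clock, rc)
    type(ESMF_GridComp)  :: gcomp
    type(ESMF_State)     :: importState, exportState
    type(ESMF_Clock)     :: clock
    integer, intent(out) :: rc
    rc = ESMF_SUCCESS   ! advance the model one coupling interval
  end subroutine

  subroutine my_final(gcomp, importState, exportState, clock, rc)
    type(ESMF_GridComp)  :: gcomp
    type(ESMF_State)     :: importState, exportState
    type(ESMF_Clock)     :: clock
    integer, intent(out) :: rc
    rc = ESMF_SUCCESS   ! write restarts, clean up
  end subroutine
end module

The AppDriver then sequences Initialize/Run/Finalize across all registered Components.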
Design Features
ESMF enables modeling applications to be:
• Scalable: models are built from modular components, and can be easily nested within larger applications
• Performance-portable: ESMF high-performance communication libraries offer a consistent interface across computer architectures
• Exchangeable: standard component interfaces enable interoperability
ESMF Class Structure

Superstructure:
• GridComp: land, ocean, atm, … model
• CplComp: transfers between GridComps
• State: data imported or exported

Infrastructure, Data:
• Bundle: collection of Fields
• Field: physical field, e.g. pressure
• Regrid: computes interp weights
• Grid: LogRect, Unstruct, etc.
  - PhysGrid: math description
  - DistGrid: grid decomposition
• Array: hybrid F90/C++ arrays

Infrastructure, Communications:
• DELayout: communications
• Route: stores comm paths

Utilities: Machine, TimeMgr, LogErr, I/O, Config, Base, etc.
(Classes are implemented in a hybrid of F90 and C++.)
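To make the data classes concrete, a short sketch of how they compose: a Field binds a physical quantity to a Grid, whose decomposition (DistGrid/DELayout) determines the underlying Array layout. The create calls below are named after a later ESMF release, and the grid size and field name are made up for illustration.

program field_example
  use ESMF
  implicit none
  type(ESMF_Grid)  :: grid
  type(ESMF_Field) :: pressure
  integer          :: rc

  call ESMF_Initialize(rc=rc)
  ! Grid = index space + PhysGrid (math description) + DistGrid (decomposition)
  grid = ESMF_GridCreateNoPeriDim(minIndex=(/1,1/), maxIndex=(/360,180/), rc=rc)
  ! Field = a named physical quantity; its storage is an Array distributed
  ! over the DEs of the Grid's DELayout
  pressure = ESMF_FieldCreate(grid, typekind=ESMF_TYPEKIND_R8, name="ps", rc=rc)
  ! Fields are gathered into Bundles and placed in States for coupling;
  ! Regrid computes interpolation weights between two Grids.
  call ESMF_Finalize(rc=rc)
end program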
Design Principle:
Local Communication
All inter-Component communication within ESMF is local: all communication is
handled within Components. This allows the architecture of the framework to be
independent of the communication strategy.

[Diagram: climate_comp contains atm2ocn_coupler, ocn_comp, and atm_comp; atm_comp in turn contains phys2dyn_coupler, atm_phys, and atm_dyn, each distributed across PEs]
As a consequence, Coupler
Components must be defined on the
union of the PEs of all the Components
that they couple.
In this example, in order to send data
from the atmosphere Component to the
ocean, the atm2ocn Coupler mediates
the send.
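A schematic driver sequence for this example, written in the deck's own ESMF_CompRun naming (illustrative, not a released API): the coupler, defined on the union of the atm and ocn PEs, is simply another Component that the parent runs between them.

! Schematic, using this deck's ESMF_CompRun naming (illustrative).
! Run from climate_comp, which spans the union of all child PEs:
call ESMF_CompRun(atm_comp)         ! atm fills its export State locally
call ESMF_CompRun(atm2ocn_coupler)  ! coupler regrids/moves atm export -> ocn import
call ESMF_CompRun(ocn_comp)         ! ocn consumes its import State locally

Because the only data motion happens inside the coupler's Run, the parent never needs to know how the send is implemented.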
Design Principle:
Scalable Applications
Since each ESMF application is also a Component, entire ESMF applications
can be treated as Gridded Components and nested within larger applications.
[Diagram: climate_comp contains ocn2atm_coupler, ocn_comp, and atm_comp; atm_comp in turn contains phys2dyn_coupler, atm_phys, and atm_dyn, each distributed across PEs]

Example: an atmospheric application, itself composed of multiple Components, may be run standalone or nested within a larger climate application.
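A sketch of what nesting looks like in code. The create/drive calls are named after a later ESMF release, and my_atm_mod/SetServices refer to the hypothetical component sketched earlier under Architecture.

program nest_example
  use ESMF
  use my_atm_mod, only: SetServices   ! hypothetical component from the earlier sketch
  implicit none
  type(ESMF_GridComp) :: atm
  type(ESMF_State)    :: atm_im, atm_ex
  type(ESMF_Clock)    :: clock
  integer             :: rc

  call ESMF_Initialize(rc=rc)
  ! From the climate parent: create the nested Component, register its
  ! methods, and drive it like any other Gridded Component.
  atm = ESMF_GridCompCreate(name="atmosphere", rc=rc)
  call ESMF_GridCompSetServices(atm, userRoutine=SetServices, rc=rc)
  call ESMF_GridCompInitialize(atm, importState=atm_im, exportState=atm_ex, &
                               clock=clock, rc=rc)
  ! Standalone, the AppDriver issues exactly the same sequence of calls.
  call ESMF_Finalize(rc=rc)
end program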
Design Principle: Modularity
Gridded Components don’t have access to the internals of other Gridded Components, and don’t store
any coupling information. Gridded Components may:
• pass their States to other Components through their argument list or
• receive user-defined methods through their argument list for transforming and transferring States.
These are called Transforms, and they contain a function pointer and attributes describing
frequency and validity criteria. Transforms may modify the data in States, receive States from other
Components, send States to other Components, etc.
EX 1: One-way coupling from atm to ocn (atm_xform is a send; ocn_xform a receive):

! From climate comp
call ESMF_CompRun(atm, atm_xform)
call ESMF_CompRun(ocn, ocn_xform)

! From atm comp
call ESMF_StateXform(atm_ex_state, atm_xform)

! From ocn comp
call ESMF_StateXform(ocn_im_state, ocn_xform)

EX 2: Standalone atm (atm_xform is a no-op):

! From atm comp
call ESMF_StateXform(atm_ex_state, atm_xform)
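The text above describes a Transform as a function pointer plus frequency and validity attributes. A hypothetical rendering in modern Fortran, purely to fix ideas: none of these names are ESMF API, and the deck predates Fortran procedure pointers.

! Hypothetical sketch of the Transform described above; not ESMF API.
module xform_mod
  use ESMF
  implicit none
  abstract interface
    subroutine xform_op(state, rc)
      import :: ESMF_State
      type(ESMF_State)     :: state
      integer, intent(out) :: rc
    end subroutine
  end interface
  type :: Xform
    procedure(xform_op), pointer, nopass :: op => null()  ! what to do to the State
    integer :: frequency = 1     ! how often the transform applies
    logical :: valid     = .true.! validity criterion
  end type
end module

This is what lets EX 1 and EX 2 share the same component code: only the Transform passed in changes, from a send to a no-op.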
Design Principle:
Uniform Communication API

The same programming interface is used for shared memory, distributed memory, and combinations thereof.

Machine model = abstraction of machine architecture (num_nodes, num_pes_per_node, etc.)
DE = a decomposition element; may be virtual (a thread or an MPI process)
DELayout = an arrangement of DEs, in which dimensions requiring faster communication may be specified and resources arranged accordingly

The data in a Grid is decomposed according to the number and topology of DEs in the DELayout.

[Figure: a 4 x 3 DELayout, with ESMF_COMM_SHR in x and ESMF_COMM_MP in y]
[Plot: performance of ESMF AllGather vs. raw MPI]
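A sketch of setting up the layout in the figure above. The ESMF_COMM_SHR / ESMF_COMM_MP flags appear only in this slide's early API, so the create call below is illustrative, not the released DELayout interface.

! Illustrative: a 4 x 3 DELayout requesting shared-memory communication
! in x and message passing in y. Flag names come from this slide's
! early API; the released DELayout interface differs.
type(ESMF_DELayout) :: layout
integer :: rc

layout = ESMF_DELayoutCreate(counts=(/4, 3/), &
           commTypes=(/ESMF_COMM_SHR, ESMF_COMM_MP/), rc=rc)

! A Grid decomposed over this layout spreads its data across the 12 DEs;
! the same calls work whether a DE maps to a thread or an MPI process.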
Outline
• ESMF Overview
• ESMF Applications
• Related Projects
• ESMF Architecture
• Timeline
Time Line
May 2002
Draft Developer’s Guide and Requirements Document completed
1st Community Requirements Meeting and review held in D.C.
July 2002
ESMF VAlidation (EVA) suite assembled
August 2002
Architecture Document: major classes and their relationships
Implementation Report: language strategy and programming model
Software Build and Test Plan: sequencing and validation
May 2003
ESMF Version 1.0 release, 2nd Community Meeting at GFDL
November 2003
First 3 interoperability experiments completed
April 2004
Second API and production software release, 3rd Community Meeting
November 2004
All interoperability experiments complete; all testbed applications compliant
January 2005
Final delivery of source code and documentation
More Information
ESMF website: http://www.esmf.ucar.edu
Acknowledgements
The ESMF is sponsored by the NASA
Goddard Earth Science Technology Office.