
Coordination of Common Modeling Infrastructure
Cecelia DeLuca
WGCM/WMP Meeting, Exeter, UK
[email protected]
Oct 6, 2005
[Title graphic labels: Data Assimilation · Climate · Weather]
Outline
• What is ESMF?
• How Do ESMF and PRISM Differ?
• Why Do ESMF and PRISM Differ?
• Can ESMF and PRISM Be Usefully Combined?
• Model Metadata and Earth System Curator
• How Can WMP Help?
ESMF Background
ESMF grew out of the now-defunct Common Modeling Infrastructure Working Group, which involved many operational and research centers in the U.S. (Steve Zebiak and Robert Dickinson, chairs). Three linked proposals were funded by NASA ESTO in 2002:
1. Core framework (Killeen/NCAR)
2. Modeling applications (Marshall/MIT)
3. Data assimilation applications (da Silva/NASA GSFC)
Original ESMF applications:
NOAA GFDL atmospheres
NOAA GFDL MOM4 ocean
NOAA NCEP atmosphere, analyses
NASA GMAO models and GEOS-5
NASA/COLA Poseidon ocean
LANL POP ocean and CICE
NCAR WRF
NCAR CCSM
MITgcm atmosphere and ocean
New ESMF-Based Programs
Funding for Science, Adoption, and Core Development
Modeling, Analysis and Prediction Program for Climate Variability and Change
Sponsor: NASA
Partners: University of Colorado at Boulder, University of Maryland, Duke University, NASA Goddard Space Flight Center, NASA Langley, NASA Jet Propulsion Laboratory, Georgia Institute of Technology, Portland State University, University of North Dakota, Johns Hopkins University, Goddard Institute for Space Studies, University of Wisconsin, Harvard University, and more
The NASA Modeling, Analysis and Prediction Program will develop an ESMF-based modeling and analysis environment to study climate variability and change.
Integrated Dynamics through Earth’s Atmosphere and Space Weather Initiatives
Sponsors: NASA, NSF
Partners: University of Michigan/SWMF, Boston University/CISM, University of Maryland, NASA Goddard Space Flight Center, NOAA CIRES
ESMF developers are working with the University of Michigan and others to develop the capability to couple together Earth and space software components.
Battlespace Environments Institute
Sponsor: Department of Defense
Partners: DoD Naval Research Laboratory, DoD Fleet Numerical, DoD Army ERDC, DoD Air Force Weather Agency
The Battlespace Environments Institute is developing integrated Earth and space forecasting systems that use ESMF as a standard for component coupling.
Spanning the Gap Between Models and Datasets: Earth System Curator
Sponsor: NSF
Partners: Princeton University, Georgia Institute of Technology, Massachusetts Institute of Technology, PCMDI, NOAA GFDL, NOAA PMEL, DOE ESG
The ESMF team is working with data specialists to create an end-to-end knowledge environment that encompasses data services and models.
What is ESMF?
• ESMF provides tools for turning model codes into components with standard interfaces and standard drivers (a sketch of the standard interface follows below).
• ESMF provides data structures and common utilities that components use for routine services such as data communications, regridding, time management, configuration, and message logging.
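To make the standard interface concrete, here is a minimal sketch of a user component. It assumes the current public ESMF Fortran API, which differs in detail from the 2005-era release; MyModel, MyInit, and MyRun are hypothetical names.

```fortran
! Minimal sketch of a user component with the standard ESMF interface.
! Assumes the modern ESMF Fortran API; the 2005-era API differed in detail.
! MyModel, MyInit, and MyRun are hypothetical names.
module MyModel
  use ESMF
  implicit none
contains
  ! Registration: tell ESMF which routines implement each standard method
  subroutine SetServices(gcomp, rc)
    type(ESMF_GridComp)  :: gcomp
    integer, intent(out) :: rc
    call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_INITIALIZE, &
                                    userRoutine=MyInit, rc=rc)
    call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_RUN, &
                                    userRoutine=MyRun, rc=rc)
  end subroutine SetServices

  ! Every method has the same signature: the component, import and export
  ! states, a clock, and a return code -- this uniformity is what makes
  ! components technically swappable.
  subroutine MyInit(gcomp, importState, exportState, clock, rc)
    type(ESMF_GridComp)  :: gcomp
    type(ESMF_State)     :: importState, exportState
    type(ESMF_Clock)     :: clock
    integer, intent(out) :: rc
    rc = ESMF_SUCCESS   ! allocate model state, fill export fields, etc.
  end subroutine MyInit

  subroutine MyRun(gcomp, importState, exportState, clock, rc)
    type(ESMF_GridComp)  :: gcomp
    type(ESMF_State)     :: importState, exportState
    type(ESMF_Clock)     :: clock
    integer, intent(out) :: rc
    rc = ESMF_SUCCESS   ! read importState, step the model, fill exportState
  end subroutine MyRun
end module MyModel
```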
ESMF Superstructure
  AppDriver
  Component Classes: GridComp, CplComp, State
User Code
ESMF Infrastructure
  Data Classes: Bundle, Field, Grid, Array
  Utility Classes: Clock, LogErr, DELayout, Machine
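A hedged sketch of this layering from the driver's point of view: superstructure calls (component create/initialize/run/finalize) wrap the user code, which draws on infrastructure classes such as State and Clock. Again these are modern API names; MyModel is the hypothetical module sketched above, and clock setup is deferred to a later sketch.

```fortran
! Driver-level sketch of the superstructure layer. Modern ESMF API
! names; MyModel is the hypothetical component module sketched above.
program AppDriver
  use ESMF
  use MyModel, only: SetServices
  implicit none
  type(ESMF_GridComp) :: atm
  type(ESMF_State)    :: impState, expState
  type(ESMF_Clock)    :: clock
  integer :: rc

  call ESMF_Initialize(rc=rc)
  atm = ESMF_GridCompCreate(name="atm", rc=rc)
  call ESMF_GridCompSetServices(atm, userRoutine=SetServices, rc=rc)
  impState = ESMF_StateCreate(name="atm import", rc=rc)
  expState = ESMF_StateCreate(name="atm export", rc=rc)
  ! Clock creation omitted here; see the time-manager sketch later.
  call ESMF_GridCompInitialize(atm, importState=impState, &
                               exportState=expState, clock=clock, rc=rc)
  call ESMF_GridCompRun(atm, importState=impState, &
                        exportState=expState, clock=clock, rc=rc)
  call ESMF_GridCompFinalize(atm, importState=impState, &
                             exportState=expState, clock=clock, rc=rc)
  call ESMF_Finalize(rc=rc)
end program AppDriver
```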
Outputs and outcomes …
• Open-source, collaboratively developed software utilities and coupling interfaces, an exhaustive test suite, documentation, support, and training.
• A federation of geophysical components that can be assembled in multiple ways, using different drivers and different couplers.
• An Earth science organization that has focused interactions at many levels: software engineer and support scientist, technical and scientific manager, scientist, director, sponsor.
• An extended community with strong connections and many diverse science options.
ESMF Components and Couplers
Application Example: GEOS-5 AGCM
• Each box is a user-written ESMF component
• Every component has a standard interface so that it is (technically) swappable
• Data in and out of components are packaged as state types with user-defined fields
• New components can easily be added to the hierarchical system (see the sketch below)
• Many different structures can be assembled by switching the tree around
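A hedged sketch of how such a hierarchy is built: a parent component creates its children (and their coupler) inside its own Initialize method, so each level of the tree can own a coupler. Component names mirror the GEOS-5 tree; child registration and initialization calls are omitted.

```fortran
! Sketch of hierarchical composition: a parent creates its children
! inside its own Initialize method. Names mirror the GEOS-5 example;
! child registration and initialization calls are omitted for brevity.
subroutine AtmLand_Init(gcomp, importState, exportState, clock, rc)
  use ESMF
  implicit none
  type(ESMF_GridComp)  :: gcomp
  type(ESMF_State)     :: importState, exportState
  type(ESMF_Clock)     :: clock
  integer, intent(out) :: rc
  type(ESMF_GridComp)  :: atm, land
  type(ESMF_CplComp)   :: cpl

  ! Children are ordinary gridded components...
  atm  = ESMF_GridCompCreate(name="atm",  rc=rc)
  land = ESMF_GridCompCreate(name="land", rc=rc)
  ! ...joined by a coupler component owned by this parent, which is
  ! how the tree ends up with multiple couplers at different levels.
  cpl  = ESMF_CplCompCreate(name="atm-land coupler", rc=rc)
  rc = ESMF_SUCCESS
end subroutine AtmLand_Init
```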
But!
• It is possible to “wrap” an existing model with ESMF, without needing to change internal data structures, by creating just one Component box (sketched below)
• This is generally lightweight in terms of performance
• Users can choose to use all of ESMF or just some of it
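A sketch of the one-box wrap, under the assumption that the legacy model exposes its own driver routine; legacy_model_step is a hypothetical placeholder, not a real API.

```fortran
! Sketch of a one-box "wrap": the standard ESMF Run method simply
! delegates to the existing model's own driver, leaving its internal
! data structures untouched. legacy_model_step is a hypothetical
! placeholder for whatever entry point the existing code provides.
subroutine Wrap_Run(gcomp, importState, exportState, clock, rc)
  use ESMF
  implicit none
  type(ESMF_GridComp)  :: gcomp
  type(ESMF_State)     :: importState, exportState
  type(ESMF_Clock)     :: clock
  integer, intent(out) :: rc
  ! Copy any required inputs out of importState, then call the
  ! unmodified legacy driver; export only what downstream needs.
  call legacy_model_step()
  rc = ESMF_SUCCESS
end subroutine Wrap_Run
```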
• Measured overhead of the ESMF superstructure in the NCEP Spectral Statistical Interpolation (SSI) analysis: ~1% overall
• Run on an NCAR IBM
• Runs done by JPL staff, confirmed by NCEP developers
ESMF Development Status
• Concurrent or sequential execution, single or multiple executables
• Support for configuring ensembles
• Logically rectangular grids with regular and arbitrary distributions can be represented, and regular distributions can be regridded
• On-line parallel regridding (bilinear, 1st-order conservative) implemented and optimized
• Other parallel methods (e.g. halo, redistribution, low-level communications) implemented
• Utilities such as the time manager, logging, and the configuration manager are usable and gaining features (see the clock sketch below)
• Fortran interfaces and complete documentation; some C++ interfaces

ESMF software is not yet a hardened, out-of-the-box solution.
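Since the time manager is among the utilities described as usable, a minimal clock sketch follows. It uses current ESMF API names (the 2005-era interfaces may differ) and assumes a default calendar was established at ESMF_Initialize.

```fortran
! Minimal time-manager sketch: build a clock and step through a run.
! Current ESMF API names; assumes a default calendar was established
! at ESMF_Initialize (e.g. via defaultCalKind=ESMF_CALKIND_GREGORIAN).
subroutine demo_clock()
  use ESMF
  implicit none
  type(ESMF_Time)         :: startTime, stopTime
  type(ESMF_TimeInterval) :: timeStep
  type(ESMF_Clock)        :: clock
  integer :: rc

  call ESMF_TimeSet(startTime, yy=2005, mm=10, dd=6, rc=rc)
  call ESMF_TimeSet(stopTime,  yy=2005, mm=10, dd=7, rc=rc)
  call ESMF_TimeIntervalSet(timeStep, h=1, rc=rc)   ! one-hour step
  clock = ESMF_ClockCreate(timeStep=timeStep, startTime=startTime, &
                           stopTime=stopTime, rc=rc)
  do while (.not. ESMF_ClockIsStopTime(clock, rc=rc))
    ! ... run each component for one step, then advance the clock
    call ESMF_ClockAdvance(clock, rc=rc)
  end do
end subroutine demo_clock
```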
ESMF Platform Support
• IBM AIX (32 and 64 bit addressing)
• SGI IRIX64 (32 and 64 bit addressing)
• SGI Altix (64 bit addressing)
• Cray X1 (64 bit addressing)
• Compaq OSF1 (64 bit addressing)
• Linux Intel (32 and 64 bit addressing, with mpich and lam)
• Linux PGI (32 and 64 bit addressing, with mpich)
• Linux NAG (32 bit addressing, with mpich)
• Linux Absoft (32 bit addressing, with mpich)
• Linux Lahey (32 bit addressing, with mpich)
• Mac OS X with xlf (32 bit addressing, with lam)
• Mac OS X with absoft (32 bit addressing, with lam)
• Mac OS X with NAG (32 bit addressing, with lam)
• User-contributed g95 support
Current Challenges
Refocus core development team
• Base infrastructure is complete – now need support for unstructured grids, multi-block grids with complex boundary behavior (e.g. tripole, cubed sphere), more regridding options, and constructs for data assimilation
• Team composition must change correspondingly
• Better, smarter testing – the suite of 1600 unit tests, 15 system tests, and 30+ examples still needs supplements
• Major increase in demand for customer support and training

Many new requirements
• Commercial tool for tracking requirements (DOORS)
• New representative body for prioritizing development tasks (Change Review Board)

Organizationally and technically, ESMF infrastructure will take another 3-5 years to mature.
ESMF v PRISM

Both frameworks layer user code between a coupling superstructure and a utility infrastructure:

PRISM stack:
  Run-time environment
  Coupling Superstructure
  User Code
  Utility Infrastructure

ESMF stack:
  ESMF Superstructure (AppDriver; Component Classes: GridComp, CplComp, State)
  User Code
  ESMF Infrastructure (Data Classes: Bundle, Field, Grid, Array; Utility Classes: Clock, LogErr, DELayout, Machine)
Other Differences …

ESMF (example tree: a Seasonal Forecast component containing a coupler, ocean, sea ice, and assim_atm; assim_atm contains assim and atmland; atmland contains atm and land):
• Components are generally in the same executable
• Components are often nested
• Multiple couplers
• Data is passed through states at the beginning and end of method execution

PRISM (example: several Comps connected through a single Coupler):
• Components are generally in separate executables
• Components are generally not nested
• Single coupler
• Data is transferred through put/get
• Data can go from anywhere to anywhere in another component
Motivation for Common Modeling Infrastructure
Shared by both PRISM and ESMF:
• Support for modeling workflows (e.g. job submission, version control, annotation and archival of experiments, integration with visualization and analysis tools)
• Model intercomparison and interchange of model components
• Better utilization of compute resources and performance optimization
• Cost effectiveness: shared, fully featured common utilities (e.g. logging, timing, regridding, calendars, I/O, parallelization tools)
• Systematic internal architecture of multi-component models, support for many different drivers and configurations
Why Do ESMF and PRISM Differ?
For both ESMF and PRISM, the overall design was decided by a large group of experienced modelers… so how did the two efforts wind up with such different solutions?
• The PRISM single-driver approach leads to greater effective interoperability for a constrained (climate) domain
• The ESMF approach leads to limited interoperability for a broader set of domains: climate, weather, space weather, data assimilation – support for seamless prediction

Both ESMF and PRISM face similar requirements – but have taken different paths to fulfill them.
Can ESMF and PRISM be Usefully Combined?
• ESMF can use PRISM run-time elements
• PRISM can use the ESMF utility layer
• ESMF can offer a put/get paradigm for greater flexibility (see the sketch below)
• ESMF components can be described using PRISM PMIOD files (XML descriptions of model inputs/outputs and content), and ESMF data transfers can be expressed as PRISM put/gets, so that the same component can run in both systems (done with MOM4)
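To illustrate the put/get paradigm, a hedged sketch in the spirit of the OASIS3 PSMILe library that PRISM used. The routine names follow that library's documented style, but the exact argument lists are approximate, and the field ids are assumed to come from the coupler's variable-definition step at initialization.

```fortran
! Hedged sketch of the put/get paradigm, in the spirit of the OASIS3
! PSMILe interface used by PRISM. Routine names follow that library's
! style but the exact argument lists are approximate; the ids are
! assumed to come from the variable-definition calls at initialization.
subroutine exchange_sst(sst_out_id, sst_in_id, step_seconds, sst, ierror)
  implicit none
  integer, intent(in)    :: sst_out_id, sst_in_id  ! coupling field ids
  integer, intent(in)    :: step_seconds           ! model time in seconds
  real,    intent(inout) :: sst(:,:)
  integer, intent(out)   :: ierror

  ! A component "puts" a field at a model time; the single coupler
  ! decides where (and whether) it is delivered...
  call prism_put_proto(sst_out_id, step_seconds, sst, ierror)
  ! ...and "gets" whatever another component has put for it, so data
  ! can go from anywhere to anywhere.
  call prism_get_proto(sst_in_id, step_seconds, sst, ierror)
end subroutine exchange_sst
```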
Model Metadata and Earth System Curator
Earth System Curator takes the interaction of ESMF/PRISM a step further:
• Recognize that models and datasets are described by similar metadata
• Develop standards for model metadata, especially in the area of grids
• Work with umbrella groups developing metadata standards (e.g. GO-ESSP) to integrate model and data metadata
• Work with groups developing ontologies (LEAD, ESML) to invest metadata standards with structure and flexibility
• Work with GFDL, CCSM and PCMDI to link databases that store models, experiments, and data to serve MIPs and IPCC

Anticipated results:
• Coordinated growth of ESMF and PRISM
• Opportunities to develop smarter tools (e.g. compatibility checking, assembly) based on metadata information
How Can WMP Help?
• Support and promote common modeling infrastructure
◦ Maintain a science-driven methodology
◦ Emphasize long-term investment and continuity
◦ Communicate expectations – the “plug and play” myth
• Support and promote efforts to generate metadata standards and ontologies
◦ For the interaction of ESMF and PRISM
◦ For the development of a more comprehensive and useful modeling environment
• Help determine how to utilize infrastructure as an entry point into the broader (international) modeling community