Earth System Modeling Framework Workshop on “Coupling Technologies for Earth System Modelling: Today and Tomorrow”, CERFACS, Toulouse (France), Dec 15th to 17th 2010


Earth System Modeling Framework

Workshop on “Coupling Technologies for Earth System Modelling: Today and Tomorrow”, CERFACS, Toulouse (France), Dec 15th to 17th 2010
Ryan O’Kuinghttons, Robert Oehmke, Cecelia DeLuca

Motivation

• In climate research and numerical weather prediction … increased emphasis on detailed representation of individual physical processes; requires many teams of specialists to contribute components to an overall modeling system
• In computing technology … increase in hardware and software complexity in high-performance computing, as we shift toward the use of scalable computing architectures
• In software … emergence of frameworks to promote code reuse and interoperability

The ESMF is a focused community effort to tame the complexity of models and the computing environment. It leverages, unifies, and extends existing software frameworks, creating new opportunities for scientific contribution and collaboration.

Evolution

Phase 1: 2002-2005

NASA’s Earth Science Technology Office ran a solicitation to develop an Earth System Modeling Framework (ESMF).

A multi-agency collaboration (NASA/NSF/DOE/NOAA) won the award. The core development team was located at NCAR.

A prototype ESMF software package (version 2r) demonstrated feasibility.

Phase 2: 2005-2010

New sponsors included the Department of Defense and NOAA.

Many new applications and requirements were brought into the project, motivating a complete redesign of framework data structures (version 3r).

Phase 3: 2010-2015

The core development team moved to NOAA/CIRES for closer alignment with federal models.

Basic framework development will be complete with version 5r (ports, bugs, feature requests, interoperability templates, user support etc. still require resources).

The focus is on increasing adoption and creating a community of interoperable codes.

Components

• ESMF is based on the idea of components – functionally distinct sections of code that are wrapped in standard interfaces
• Components may represent either a physical domain or a function
• Components can be arranged hierarchically, helping to organize the structure of complex models

[Figure: ESMF components in the GEOS-5 atmospheric GCM]

Architecture

• ESMF provides a superstructure for assembling geophysical components into applications.
• ESMF provides an infrastructure that modelers use to:
  – Generate and apply interpolation weights
  – Handle metadata, time management, data I/O and communications, and other functions
  – Access third-party libraries

[Figure: the ESMF “sandwich” architecture]
  – ESMF Superstructure – Components Layer: Gridded Components, Coupler Components
  – User Code – Model Layer
  – ESMF Infrastructure – Fields and Grids Layer, Low Level Utilities
  – External Libraries – MPI, NetCDF, …

Standard Interfaces

• All ESMF components have the same three standard methods:
  – Initialize
  – Run
  – Finalize
• Each standard method has the same simple interface:

  call ESMF_GridCompRun(myComp, importState, exportState, clock, …)

  where:
  – myComp points to the component
  – importState is a structure containing input fields
  – exportState is a structure containing output fields
  – clock contains timestepping information
• Interfaces are wrappers and can often be set up in a non-intrusive way (see the sketch below)

Steps to adopting ESMF:
• Divide the application into components (without ESMF)
• Copy or reference component input and output data into ESMF data structures
• Register components with ESMF
• Set up ESMF couplers for data exchange
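To make the registration step concrete, here is a minimal, hedged Fortran sketch of a gridded component that registers its three standard methods through a SetServices routine. The module and routine names (myModelMod, my_init, my_run, my_final) are hypothetical, and the exact module name and method-flag spellings (e.g. ESMF_SETRUN vs. ESMF_METHOD_RUN) vary between ESMF releases.

  module myModelMod                      ! hypothetical user component module
    use ESMF                             ! some older releases: use ESMF_Mod
    implicit none
    private
    public :: SetServices

  contains

    ! Register the three standard methods with the framework.
    subroutine SetServices(gcomp, rc)
      type(ESMF_GridComp)  :: gcomp
      integer, intent(out) :: rc
      ! Method flag names differ slightly between ESMF releases.
      call ESMF_GridCompSetEntryPoint(gcomp, ESMF_SETINIT,  my_init,  rc=rc)
      call ESMF_GridCompSetEntryPoint(gcomp, ESMF_SETRUN,   my_run,   rc=rc)
      call ESMF_GridCompSetEntryPoint(gcomp, ESMF_SETFINAL, my_final, rc=rc)
    end subroutine SetServices

    ! All user methods share the same simple interface.
    subroutine my_init(gcomp, importState, exportState, clock, rc)
      type(ESMF_GridComp)  :: gcomp
      type(ESMF_State)     :: importState, exportState
      type(ESMF_Clock)     :: clock
      integer, intent(out) :: rc
      rc = ESMF_SUCCESS
      ! ... copy or reference model data into ESMF data structures here ...
    end subroutine my_init

    subroutine my_run(gcomp, importState, exportState, clock, rc)
      type(ESMF_GridComp)  :: gcomp
      type(ESMF_State)     :: importState, exportState
      type(ESMF_Clock)     :: clock
      integer, intent(out) :: rc
      rc = ESMF_SUCCESS
      ! ... advance the wrapped model one coupling interval, reading from
      !     importState and filling exportState ...
    end subroutine my_run

    subroutine my_final(gcomp, importState, exportState, clock, rc)
      type(ESMF_GridComp)  :: gcomp
      type(ESMF_State)     :: importState, exportState
      type(ESMF_Clock)     :: clock
      integer, intent(out) :: rc
      rc = ESMF_SUCCESS
      ! ... release resources held by the wrapped model ...
    end subroutine my_final

  end module myModelMod

Because the three entry points only wrap existing model routines, a modular code can often be registered this way without changes to its scientific modules.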

Data Representation Options

1. Representation in index space (Arrays)
  • One or more tiles store indices and topology
  • Sparse matrix multiply for remapping with user-supplied interpolation weights
  • Highly scalable – no global information held locally; uses a distributed directory approach (Devine 2002) for access to randomly distributed objects in an efficient, scalable way
  • [Figure: supported Array distributions]

2. Representation in physical space (Fields)
  • Built on Arrays + some form of Grid
  • Grids may be logically rectangular, unstructured mesh, or observational data
  • Remapping using parallel interpolation weight generation – bilinear, higher order, or first-order conservative

A minimal Field-construction sketch follows.
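The sketch below illustrates the Field option: a small logically rectangular Grid is built and a physical quantity is wrapped in a Field. It is a minimal sketch assuming ESMF 5.x-era interfaces; the grid size and field name are illustrative, and the grid-creation call spelling differs between releases (older versions use ESMF_GridCreateShapeTile rather than ESMF_GridCreateNoPeriDim).

  program field_sketch
    use ESMF                            ! some older releases: use ESMF_Mod
    implicit none
    type(ESMF_Grid)  :: grid
    type(ESMF_Field) :: pressure
    integer          :: rc

    call ESMF_Initialize(rc=rc)

    ! A single-tile, logically rectangular 180x90 grid
    ! (older releases use ESMF_GridCreateShapeTile for the same purpose).
    grid = ESMF_GridCreateNoPeriDim(maxIndex=(/180, 90/), rc=rc)

    ! A Field is an Array plus the Grid that gives it physical meaning.
    pressure = ESMF_FieldCreate(grid, typekind=ESMF_TYPEKIND_R8, &
                                name="surface_pressure", rc=rc)

    call ESMF_Finalize(rc=rc)
  end program field_sketch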

Coupling Options

Lots of flexibility in coupling approaches:
• Single executable
  – Coupling communications can be called either from within a coupler or directly from a gridded component (useful when it is inconvenient to return from a component in order to perform a coupling operation)
  – Recursive components for nesting higher-resolution regions
  – Ensemble management with either concurrent or sequential execution of ensemble members
• Multiple executable options
  – Array send/recv with the InterComm package (PVM), contributed by U Maryland
  – Web service option
  – Other options

[Figure: single-executable coupling (Comp A – Coupler – Comp B) and multiple-executable coupling (Comp A ↔ Comp B via Array send/recv)]

A hedged sketch of a regridding operation inside a coupler follows.
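As one concrete example of a coupling operation, here is a minimal sketch of a coupler run routine that regrids a field from one component's grid to another. It assumes ESMF 5.x-era interfaces; the state item name "SST" and routine name are illustrative, the regrid-method flag spelling varies across releases, and in a real coupler the RouteHandle would live in the component's internal state rather than in saved locals.

  subroutine coupler_run(cplcomp, importState, exportState, clock, rc)
    use ESMF                                   ! some older releases: use ESMF_Mod
    implicit none
    type(ESMF_CplComp)   :: cplcomp
    type(ESMF_State)     :: importState, exportState
    type(ESMF_Clock)     :: clock
    integer, intent(out) :: rc

    type(ESMF_Field)             :: srcField, dstField
    type(ESMF_RouteHandle), save :: routehandle
    logical, save                :: firstCall = .true.

    rc = ESMF_SUCCESS

    ! Pull the fields to be coupled out of the import/export States.
    call ESMF_StateGet(importState, "SST", srcField, rc=rc)
    call ESMF_StateGet(exportState, "SST", dstField, rc=rc)

    ! Compute interpolation weights once and keep them in a RouteHandle.
    if (firstCall) then
      call ESMF_FieldRegridStore(srcField=srcField, dstField=dstField, &
                                 regridmethod=ESMF_REGRIDMETHOD_BILINEAR, &
                                 routehandle=routehandle, rc=rc)
      firstCall = .false.
    end if

    ! Apply the stored weights (sparse matrix multiply) every coupling step.
    call ESMF_FieldRegrid(srcField, dstField, routehandle=routehandle, rc=rc)
  end subroutine coupler_run

The same ESMF_FieldRegrid call could equally be made directly from a gridded component, which is the "coupling from within a component" option described above.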

Metadata Handling and Usage

• Metadata is broken down into name/value pairs by the Attribute class
  – Can be attached at any level of the ESMF object hierarchy
  – Document data provenance to encourage self-describing models
  – Automate some aspects of model execution and coupling – actively exploring in this direction with workflows and web services
• Standard metadata is organized by Attribute packages
  – Used to aggregate, store, and output model metadata
  – Can be nested, distributed, and expanded to suit specific needs
  – Designed around accepted metadata standards
• Emerging conventions
  – Climate and Forecast (CF)
  – ISO standards
  – METAFOR Common Information Model (CIM)

A small Attribute sketch follows this list.
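To show the name/value idea, here is a hedged sketch of attaching Attributes to a Field. The attribute names and the CF convention/purpose strings are illustrative placeholders, and the exact ESMF_AttributeAdd/ESMF_AttributeSet keyword spellings may differ by ESMF release.

  subroutine annotate_field(field, rc)
    use ESMF                             ! some older releases: use ESMF_Mod
    implicit none
    type(ESMF_Field)     :: field        ! an existing Field, e.g. from the earlier sketch
    integer, intent(out) :: rc

    ! Free-form name/value pairs attached directly to the object:
    call ESMF_AttributeSet(field, name="units",     value="Pa",               rc=rc)
    call ESMF_AttributeSet(field, name="long_name", value="surface pressure", rc=rc)

    ! An Attribute package groups standard metadata (here a CF-style package);
    ! the convention/purpose keywords are assumptions about the 5.x interface.
    call ESMF_AttributeAdd(field, convention="CF", purpose="general", rc=rc)
    call ESMF_AttributeSet(field, name="standard_name", value="surface_air_pressure", &
                           convention="CF", purpose="general", rc=rc)
  end subroutine annotate_field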

Summary of Features

• Fast parallel remapping: unstructured or logically rectangular grids, 2D and 3D, using bilinear, higher order, or conservative methods, integrated (during runtime) or offline (from files)
• Multiple strategies for support of nested grids
• Core methods are scalable to tens of thousands of processors
• Supports hybrid (threaded/distributed) programming for optimal performance on many computer architectures
• Multiple coupling and execution modes for flexibility
• Time management utility with many calendars (Gregorian, 360-day, no-leap, Julian day, etc.), forward/reverse time operations, alarms, and other features (see the Clock sketch below)
• Metadata utility supports emerging standards in a flexible and useful way
• Runs on 25+ platform/compiler combinations, with an exhaustive test suite and documentation
• Couples between Fortran and/or C-based model components
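As an example of the time management utility, here is a hedged sketch that builds a Gregorian Clock with a 6-hour step and advances it to a stop time. The dates and step size are arbitrary, and argument spellings (e.g. calkindflag, the ESMF_ClockCreate keyword order) vary a little between ESMF releases.

  program clock_sketch
    use ESMF                              ! some older releases: use ESMF_Mod
    implicit none
    type(ESMF_Time)         :: startTime, stopTime
    type(ESMF_TimeInterval) :: timeStep
    type(ESMF_Clock)        :: clock
    integer                 :: rc

    call ESMF_Initialize(rc=rc)

    ! Calendar-aware times; the calendar keyword spelling differs by release.
    call ESMF_TimeSet(startTime, yy=2010, mm=12, dd=15, &
                      calkindflag=ESMF_CALKIND_GREGORIAN, rc=rc)
    call ESMF_TimeSet(stopTime,  yy=2010, mm=12, dd=17, &
                      calkindflag=ESMF_CALKIND_GREGORIAN, rc=rc)
    call ESMF_TimeIntervalSet(timeStep, h=6, rc=rc)

    clock = ESMF_ClockCreate(timeStep=timeStep, startTime=startTime, &
                             stopTime=stopTime, name="appClock", rc=rc)

    ! Advance the clock one coupling interval at a time.
    do while (.not. ESMF_ClockIsStopTime(clock, rc=rc))
      call ESMF_ClockAdvance(clock, rc=rc)
    end do

    call ESMF_Finalize(rc=rc)
  end program clock_sketch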

Class Structure

[Figure: ESMF class structure]

Superstructure:
  – GridComp – Land, ocean, atm, … model
  – CplComp – Xfers between GridComps
  – State – Data imported or exported

Infrastructure, data classes (hybrid F90/C++):
  – FieldBundle – Collection of fields
  – Field – Physical field, e.g. pressure
  – Grid, LocStream, Mesh (C++) – LogRect, Unstruct, etc.
  – Xgrid – Exchange grid
  – Array – Hybrid F90/C++ arrays
  – DistGrid – Grid decomposition
  – DELayout – Communications
  – Regrid – Computes interp weights
  – Route – Stores comm paths

Infrastructure, utilities:
  – Machine, TimeMgr, LogErr, I/O, Config, Attributes, etc.

Component Overhead

• Representation of the overhead for an ESMF-wrapped native CCSM4 component
• For this example, ESMF wrapping required NO code changes to scientific modules
• No significant performance overhead (< 3% is typical)
• Few code changes for codes that are modular

• Platform: IBM Power 575, bluefire, at NCAR
• Model: Community Climate System Model (CCSM)
• Versions: CCSM_4_0_0_beta42 and ESMF_5_0_0_beta_snapshot_01
• Resolution: 1.25 degree x 0.9 degree global grid with 17 vertical levels for both the atmospheric and land model, i.e. a 288x192x17 grid. The data resolution for the ocean model is 320x384x60.

Remapping Performance

• All ESMF interpolation weights are generated with an unstructured mesh
• Increases flexibility with 2D and 3D grids
• Adds overhead to bilinear interpolation
• Greatly improves performance over existing conservative methods
• ESMF parallel conservative remapping is scalable and accurate
• Bilinear could use additional optimization

• Platform: Cray XT4, jaguar, at ORNL
• Versions: ESMF_5_2_0_beta_snapshot_07
• Resolution:
  – fv0.47x0.63: CAM Finite Volume grid, 576x384
  – ne60np4: 0.5 degree cubed sphere grid, 180x180x6

A Common Model Architecture

• The US Earth system modeling community is converging on a common modeling architecture
• Atmosphere, ocean, sea ice, land, wave, and other models are ESMF or ESMF-like components called by a top-level driver or coupler
• Many models are componentizing further

ESMF-enabled systems include:

• Navy Coupled Ocean Atmosphere Mesoscale Prediction System / Wavewatch III
• Navy Operational Global Atmospheric Prediction System
• Hybrid Coordinate Ocean Model – CICE Sea Ice
• NASA GEOS-5 Atmospheric General Circulation Model
• NOAA National Environmental Modeling System
• NOAA GFDL MOM4 Ocean
• NCAR Community Earth System Model
• Weather Research and Forecast Model
• HAF Kinematic Solar Wind – GAIM Ionosphere
• pWASH123 Watershed – ADCIRC Storm Surge Model

Features and Benefits:

• Interoperability promotes code reuse and cross-agency collaboration
• Portable, fast, fully featured toolkits enhance capability
• Automatic compliance checking for ease of adoption

A Common Model Architecture

Legend: ovals show ESMF components and models that are at the working prototype level or beyond.

[Figure: ESMF component hierarchies in CCSM4/CESM (CAM Atm, POP Ocean, CICE Ice, CLM Land, Param Chem, I/O), NOAA NEMS (GFS, FIM, NMM-B Atm Dynamics, NMM-B Atm Phys, NMM History), NASA GEOS-5 (Atm Dynamics, FV Dycore, FV Cub Sph Dycore, Atm Physics, Atm Chem, Aeros Chem, GOCART, LW Rad, Solar Rad, GWD, Surface, Topology, Land, Veg Dyn, Catchment, Lake, History), and Navy/DoD and university systems (NOGAPS, Strat Chem, HYCOM, COAMPS, WWIII, NCOM, SWAN, pWASH123, ADCIRC, FMS), grouped by agency (NOAA, NASA, Department of Defense, University), with ESMF coupling complete for many of them]

• Increasingly, models in the U.S. follow a common architecture
• Atmosphere, ocean, sea ice, land, and/or wave models are called by a top-level driver/coupler
• Components use ESMF or ESMF-like interfaces
• Many major U.S. weather and climate models either follow this architecture or are adopting it for future coupled modeling
• Even non-ESMF codes now look like ESMF – CESM (non-ESMF version): atm_run_mct(clock, gridcomp, importState, exportState) (argument names changed to show equivalence)

NUOPC Layer: Goals

• National Unified Operational Prediction Capability (NUOPC) is a consortium of operational weather prediction centers
• Standardize implementation of ESMF across NASA, NOAA, Navy, and other model applications
• Demonstrate improved level of interoperability
  – Specific goals described in the NUOPC Common Model Architecture Committee report
  – CMA report: http://www.weather.gov/nuopc/CMA_Final_Report_1%20Oct%2009_baseline.pdf
  – NUOPC website: http://www.weather.gov/nuopc

NUOPC Layer: Products

• Software templates to guide development of a common architecture for components and couplers
• A software layer to narrow the scope of ESMF interfaces
• NUOPC Compliance Checker software (initial implementation available with ESMF_5_1_0)
• Comprehensive tutorial materials
• Websites, repositories, trackers, and other collaborative tools
• NUOPC Layer Guidance Documents (posted on the ESMF website)
• ESMF: www.earthsystemmodeling.org

Other ESMF-related projects …

• Earth System Curator (sponsors NSF/NASA/NOAA)
  – Implementation of the METAFOR Common Information Model in the Earth System Grid (ESG) portal for the 5th Coupled Model Intercomparison Project
  – Using ESMF Attributes to generate the METAFOR CIM schema directly from models
  – Atmosphere/hydrological model coupling using OpenMI and ESMF web services
• Earth Science Gateway on the TeraGrid (sponsor NSF)
  – End-to-end self-documented workflows from web-based model configuration to data archival, with Purdue and NCAR
• Global Interoperability Program (sponsor NOAA)
  – Support for projects involved with interoperability, infrastructure development, and global modeling education