The National Climate Predictions and Projections Platform Workshop on Quantitative Evaluation of Downscaled Climate Projections (August 12-16, 2013)



The National Climate Predictions and Projections Platform
Workshop on Quantitative Evaluation of Downscaled Climate Projections
(August 12-16, 2013)
Motivation: Practitioner's Dilemma
• The practitioner's dilemma – how to choose among the many available sources of climate information for a given place and application?
Needs
• Objective evaluation of datasets prepared for use in planning for climate change
• Provision of application-specific guidance to improve the usability of climate-change projections in planning
• Initiation of a standards-based community to build and sustain a practice of evaluation and informed use of climate-change projections
When, Where, Who
• August 12-16, 2013
• Boulder, Colorado
• Participants
  – Use cases from sectoral working groups
    • Agricultural impacts
    • Ecological impacts
    • Water resources impacts
    • Human health impacts
  – Datasets from downscaling working groups
  – NCPP Community, agency partners, program sponsors, international observers, interested parties
Week at a Glance
• Monday 12 August – Tuesday 13 August: Days 1 and 2 – Evaluation Focus
• Wednesday 14 August: Day 3 – Transition
• Thursday 15 August – Friday 16 August: Days 4 and 5 – Guidance Focus
Expected Outcomes
DATA
– Database for access to high-resolution datasets with standardized metadata of downscaling methods
– Demonstration of a flexible, transparent climate index calculation service (Climate Translator v.0)
EVALUATION
– First version of a standardized evaluation capability and infrastructure for high-resolution climate datasets, incl. application-oriented evaluations
– Description of a sustainable infrastructure for evaluation services
COMMUNITIES OF PRACTICE
– Sector- and problem-specific case studies within the NCPP environment
– First version of a comparative evaluation environment to develop translational and guidance information
– Identify value added, remaining gaps, and needs for further development of the evaluation framework and metadata, incl. templates
Evaluation: Downscaling working groups
• Statistical downscaling datasets: BCSD, BCCA, ARRM, MACA, Delta method
• Dynamical downscaling datasets: Hostetler data (RegCM2), NARCCAP data
• Baseline: gridded observational data sets
Guidance: Applications
• Application use cases: water resources, ecological impacts, agriculture, health impacts
  – Identification of a network of application specialists
  – Definition of representative questions to focus the evaluations
  – Representation of application needs: scales, indices, etc.
  – Feedback on guidance and translational-information needs
  – Feedback on the design/requirements of the software environment for the workshop
  – Contribution to reports from the workshop
About 75 participants
• Downscaling working groups
  – BCCA, BCSD, ARRM, NARCCAP, MACA, etc. teams – approx. 20 people
• Sectoral working groups
  – Agricultural impacts, Ecological impacts, Water resources impacts, Human health impacts – approx. 30 people
• NCPP Community – Executive Board, Climate Science Applications Team, Core & Tech Teams – approx. 18 people
• Program managers, reporters, international guests – about 5 people
Week in More Detail
Days 1 and 2 (Monday-Tuesday) – EVALUATION focus
• Intercomparison of downscaling methods
• Fine-tuning the evaluation framework – what worked and what did not work?
• Interpretation of results and development of guidance for user groups
• Identification of gaps and needs for downscaled data for the participating applications
Day 3 (Wednesday) – TRANSITION: EVALUATION and GUIDANCE
• Morning – summary of the downscaling methods' attributes and evaluation results, by sector and protocol
• Afternoon – start of the sectoral applications groups' work
Days 4 and 5 (Thursday-Friday) – GUIDANCE focus
• Interpretation of results and guidance for user groups
• Presentation of metadata descriptions and their usage
• Presentation of indices provision – OCGIS
• Identification of gaps and needs for downscaled data for application needs
• Identification of future steps
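In the NCPP environment the indices provision mentioned above is served by OCGIS. As a minimal illustration of what such a climate-index calculation involves, here is a sketch in plain NumPy (not the OCGIS API; the index, threshold, and synthetic data are all illustrative) counting frost days from a daily minimum temperature series:

```python
import numpy as np

def frost_days(tasmin_celsius):
    """Count days with minimum temperature below 0 degrees C.

    tasmin_celsius: 1-D array of daily minimum temperatures for a
    single location -- an illustrative stand-in for one grid cell of a
    downscaled dataset.
    """
    tasmin = np.asarray(tasmin_celsius, dtype=float)
    return int(np.sum(tasmin < 0.0))

# Synthetic year of daily minimum temperatures: an annual cycle
# (coldest near day 0) plus day-to-day noise
rng = np.random.default_rng(0)
days = np.arange(365)
seasonal = 10.0 - 15.0 * np.cos(2 * np.pi * days / 365)
tasmin = seasonal + rng.normal(0.0, 3.0, size=365)
print(frost_days(tasmin))
```

A service like the Climate Translator would evaluate such predefined or user-defined indices over user-selected geographies rather than a single synthetic series.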
Below are categories of supplemental information.
Days in More Detail
Proposed structure: Monday
8:30-9:00: Breakfast and Coffee/Tea
9:00-9:30: Logistics, welcome and introductions, technicalities
– (Catchy intro: RAL director? Head of NOAA over video?)
– Brief introductions of workshop members
– Technical logistics: internet access, ESG node and CoG environment?
– Overview of the workshop and key objectives
9:30-10:30: Keynote – Practitioner's Dilemma: A call from the desperate world for help
Break
11:00-12:30: Evaluation approach of NCPP
– Framework presentation of evaluation of downscaled projections data, protocols, standards, …
– Introduction of version 0: how the evaluations were done, tools, images, metadata/CIM, potential plans forward (DIBBs structure), working groups and workshops, … community of practice
Lunch 12:30-2:00pm
2:00-3:30pm: High-resolution data providers: observed and projected gridded information
– What distinguishes your method, and what were you trying to accomplish with it? (getting at the value-added question)
– Presentations from developers of downscaling methods and datasets
Break
4:00-5:00pm: Key discussion: Directions of Downscaling
Proposed structure: Tuesday
8:30-9:00: Breakfast and Coffee/Tea
9:00-10:30: Results from Evaluations: Data Perspective
– Evaluation and characteristics of the baseline data: observed gridded data compared to station data, and inter-comparisons – short presentations
– Evaluation of the characteristics of the downscaled projections data: downscaled projections evaluation – presentations and discussion
Break
11:00-12:30: continued
Lunch 12:30-2:00pm
2:00-3:30pm: Results from Evaluation: User Perspective
– Short introduction of application needs
– Case studies: presentation and critique of evaluations
Break
4:00-5:00pm: Key discussion: Issues related to the framework
– Next steps in fine-tuning the evaluation framework – what worked and what did not? What else needs to be added? What needs to be changed? What needs to be done by the developers of downscaled data – what gaps remain in relation to applications?
Proposed structure: Wednesday
8:30-9:00: Breakfast and Coffee/Tea
9:00-10:00: Keynote: Downscaling for the World of Water (Maurer?)
10:00-10:30: Summary of the first two days and future evaluation potential using Protocols 2 and 3
– Summary of the first two days
– Perfect Model experiments and evaluations
  • Presentation and discussions
– Process-based metrics and evaluations
  • Presentation and discussions
Break
11:00-12:30pm: User Communities
Lunch 12:30-2:00pm
Break
4:00-5:00pm: Key discussion
Days 4 and 5
More Detail on Participants and Partnerships
Partnership through downscaling working groups
• GFDL – Perfect model experiments
  – Keith Dixon, V. Balaji, A. Radhakrishnan
• Texas Tech University, SC CSC
  – Katharine Hayhoe – ARRM
• DOI USGS, Bureau of Reclamation, Santa Clara University, Scripps Institute, Climate Central, NOAA/NWS
  – E. Maurer, H. Hidalgo, D. Cayan, A. Wood – BCSD, BCCA
• University of Idaho
  – J. Abatzoglou – MACA
• DOI USGS, Oregon State University
  – S. Hostetler – RegCM2 – dynamically downscaled data
• NCAR
  – Linda Mearns – NARCCAP – dynamically downscaled data
Partnerships through sectoral working groups
• Health impacts
  – NOAA/NWS – David Green
  – NYC Dept of Health – Dr. Shao Lin
  – NCAR – Olga Wilhelmi
  – Columbia University – Patrick Kinney
  – University of Florida – Chris Uejio
• Agricultural impacts
  – AGMIP
  – USDA
  – NIDIS
  – SE RISA
• Ecological impacts
  – DOI USGS NC CSC
• Water resources impacts
  – Bureau of Reclamation
  – California ……
Partnership through infrastructure, metadata and standards development
• ES-DOC
  – IS-ENES, METAFOR project (CIM and CVs)
• NESII
  – CoG, OCGIS
• EU CHARMe project (metadata archive and search)
• EU CORDEX (dynamical downscaling CV), NA CORDEX (archive and metadata standardization)
• ESGF (data and images archiving)
• DOI-USGS (data access)
• GLISA (translational information archiving)
More Details on Protocols and Metrics
Downscaling working groups
• Statistical downscaling datasets: BCSD, BCCA, ARRM, MACA, Delta method
• Dynamical downscaling datasets: Hostetler data (RegCM2), NARCCAP data
• Baseline: gridded observational data sets
Evaluation framework: Protocols and Metrics
Types of protocols:
• Observational – validation by comparison to observed data
• Perfect model – comparison to a high-resolution GCM; allows evaluation of nonstationarity
• Idealized scenarios – comparison to synthetic data with known properties
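The perfect-model idea above can be sketched in a few lines: treat a high-resolution field as truth, coarsen it to GCM-like resolution, apply a deliberately simple downscaling step (linear interpolation here, purely a stand-in for a real statistical method), and score the result against the withheld truth. Everything in this sketch (grid, field, coarsening factor) is illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic high-resolution "truth" on a 1-D grid of 256 points
x_hi = np.linspace(0.0, 1.0, 256)
truth = np.sin(4 * np.pi * x_hi) + 0.1 * rng.normal(size=256)

# Coarsen by block-averaging to mimic a GCM-resolution field
factor = 8
x_lo = x_hi[::factor]  # block left edges, kept simple for the sketch
coarse = truth.reshape(-1, factor).mean(axis=1)

# "Downscale" back to the fine grid with linear interpolation
downscaled = np.interp(x_hi, x_lo, coarse)

# Because the truth is known everywhere, the error of the downscaling
# step can be evaluated directly -- the point of the perfect-model setup
rmse = np.sqrt(np.mean((downscaled - truth) ** 2))
print(f"toy downscaler RMSE against withheld truth: {rmse:.3f}")
```

In the real protocol the "truth" is a high-resolution GCM run, which also allows testing whether a method trained on one climate holds up under a changed one (nonstationarity).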
Groups of metrics:
• Group 1 – a standard set of metrics calculated for all methods, describing the statistical distribution and temporal characteristics of the downscaled data (central tendency, tails of distribution, variability, temporal characteristics)
• Group 2 – sets of metrics useful for specific sectoral and impacts applications (water resources, human health, ecological impacts, agricultural impacts)
• Group 3 – sets of metrics used to evaluate climate system processes and phenomena (Southwest monsoon, extreme precipitation processes, atmospheric rivers, other extreme-event-related processes)
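The Group 1 categories can be made concrete with a small sketch: for a daily series at one grid cell, compute a central-tendency measure, tail quantiles, a variability measure, and one simple temporal characteristic (lag-1 autocorrelation). The specific metric choices here are illustrative, not the NCPP standard set:

```python
import numpy as np

def group1_metrics(series):
    """Illustrative Group 1-style metrics for one daily time series:
    central tendency (mean), tails (5th/95th percentiles), variability
    (standard deviation), and a temporal characteristic (lag-1
    autocorrelation of anomalies)."""
    s = np.asarray(series, dtype=float)
    anomaly = s - s.mean()
    lag1 = np.corrcoef(anomaly[:-1], anomaly[1:])[0, 1]
    return {
        "mean": float(s.mean()),
        "p05": float(np.percentile(s, 5)),
        "p95": float(np.percentile(s, 95)),
        "std": float(s.std(ddof=1)),
        "lag1_autocorr": float(lag1),
    }

# Synthetic daily temperatures with day-to-day persistence (AR(1) noise)
rng = np.random.default_rng(1)
noise = np.zeros(365)
for t in range(1, 365):
    noise[t] = 0.7 * noise[t - 1] + rng.normal(0.0, 1.0)
series = 15.0 + noise
m = group1_metrics(series)
print(m)
```

Comparing such a metric dictionary between a downscaled dataset and the gridded observational baseline is the kind of standardized, method-agnostic check Group 1 is meant to provide.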
More detailed architectural diagrams
• Original Vision of NCPP Architecture
• Commodity Governance (CoG) / Earth System Grid Federation (ESGF) Infrastructure to Support the 2013 Workshop
• OpenClimateGIS Systems Figure
Original Vision of NCPP Architecture: Summer 2011
(Not complete or final!)
[Architecture figure, rendered here as a list]
• Information interpretation – NCPP website, project workspaces for communities of practice (CoG for community connections)
• Interface layer
  – Support for inter-comparison projects and workflows representing solution patterns (Curator display, CoG)
  – Composition and display of guidance documents and other text related to the use of climate data (climate.gov approaches)
• Service layer
  – Downscaling and data formatting services, visualization, faceted data search, bookmarking (OpenClimateGIS, LAS, ESGF search, USGS tools, ENSEMBLES)
  – Search and semantic services associated with web content and other sources (Consiliate, Drupal database tools)
• Resource layer
  – Federated data archival and access (ESGF, THREDDS, data.gov platforms; data at USGS, PCMDI, NASA, NOAA, …)
  – Federated metadata collection and display (Curator tools, METAFOR, Kepler and other workflow solutions)
CoG ESGF Infrastructure to Support the 2013 Workshop
OpenClimateGIS Systems Figure
Design Considerations: Climate Translator V.0
• Indices – predefined; defined by users
• Multiple basic data archives – USGS GeoDataPortal, Earth System Grid, …
• Geography – define locality; GIS; web mapping
• Evaluation – protocols, metrics, observations
• Analysis & synthesis of information – definitions, sources, metadata, fact sheets, narratives, guidance
NCPP Architecture: What Data Goes Where
["What data goes where" figure, rendered here as a list; locations of objects are color-coded in the original]
• Primary data – existing downscaled datasets; validation datasets (observations or hi-res model output). Locations: ESGF (local); other OPeNDAP servers (e.g. Geodata Portal); local disk (may be at NOAA, NCAR, or at a scientist's institution)
• CIM documents – downscaled datasets; downscaling model components; downscaling simulations and ensembles; experiments
• Quantitative evaluation – computation of indices, if not already available; computation of metrics according to NCPP protocols
• Products of QED – new indices datasets (?); evaluation data bundles; image bundles (ESGF? local database?)
• Run the evaluation code – NCL; Python (?); code repository linked to the CoG environment (ideally)
• Orange = CoG or other NCPP database – index/metric code; processor component; evaluation protocols; experiment (e.g. NCPP Protocol 1); groups of metrics
• Gray = "don't know yet" – ? experiment ?
• Products of workshop/working groups – evaluation data bundles; image bundles (ESGF? local database?); further visualization; other images (unstructured); expert analysis; text (structured case studies; other text); search and compare; CIM
• CoG wiki and linked tools – or a GLISA-like CMS/database ???; integrate with other NCPP translational information; translational/interpretive CIM document?
Design Considerations
• These plots were meant to help define the computational environment supporting Workshop 2013. (Read the note sections of the slides.)
  – Focus on evaluation of existing data products
  – Link to protocols and metrics; develop the capability to compare and describe gridded data systems
  – Separate the output interface into types, to facilitate development of services versus the internal NCPP environment
Two Classes of Evaluation
Evaluation of Methodology
• Important for data set developers
• Informs uncertainty description and translation
• "Perfect Model" strategically central
Evaluation of Data Products
• Important for end users
• Informs data set developers
• Definable problem with our resources
• Fundamental descriptions are of value and support NCPP's mission
2013 Workshop Focus – Evaluation of Data Products
• Important for end users
• Informs data set developers
• Definable problem with our resources
• Fundamental descriptions are of value and support NCPP's mission
• Quantified Description Environment (QDE)
• Focus on T and P; quantify differences in standard data sets
  – Data set choice criteria
  – Meaningful contribution
• Standard treatment across datasets
  – Gridded
• What is in the literature?
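Quantifying differences in standard data sets, as envisioned here for T and P, reduces to a small set of standardized comparisons applied uniformly across datasets. The following sketch (plain NumPy with synthetic fields; not NCPP's actual QDE code) computes mean bias, RMSE, and spatial pattern correlation between two gridded fields on a common grid:

```python
import numpy as np

def compare_fields(candidate, reference):
    """Illustrative standardized comparison of two gridded fields
    (e.g. annual-mean temperature from two datasets on the same grid):
    mean bias, RMSE, and spatial pattern correlation."""
    c = np.asarray(candidate, dtype=float).ravel()
    r = np.asarray(reference, dtype=float).ravel()
    bias = float((c - r).mean())
    rmse = float(np.sqrt(((c - r) ** 2).mean()))
    pattern_corr = float(np.corrcoef(c, r)[0, 1])
    return {"bias": bias, "rmse": rmse, "pattern_corr": pattern_corr}

# Two synthetic 20x30 temperature fields: the candidate is a
# warm-biased, slightly noisier copy of the reference
rng = np.random.default_rng(7)
lat = np.linspace(-5, 5, 20)[:, None]
reference = 15.0 - 0.6 * lat + rng.normal(0, 0.5, (20, 30))
candidate = reference + 0.8 + rng.normal(0, 0.3, (20, 30))
print(compare_fields(candidate, reference))
```

Applying the same treatment to every dataset, gridded and with fixed metric definitions, is what makes the resulting descriptions comparable across data-set choices.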
Quantified Description Environment (QDE)
[Schematic: inputs feed a calculation that produces outputs]
QDE: Input
• Station data (observations)
• Gridded data – observations; models
QDE: Output
• Uses – research environment; support of services; end-user ("us & not us")
• Digital data – primary data; derived data
• Non-digital data
• Software
• Descriptions – structured; unstructured
• Environments ("us & not us") – analysis; collaborative; end-user
2013 Workshop and NCPP
NCPP Strategy and Projects
• Workshop 2013 is
  – a focal point and an integration of all NCPP projects
  – the start of a progression of workshops that focus the overall evaluation activity and strategy of NCPP
• Projects feeding into Workshop 2013: Climate Indices; Integration; NCPP Software Environment; Downscaling Evaluation; NC CSC; Downscaling Metadata; Interagency Community Workshop
Goals
• Quantitative evaluation
• Infrastructure support
• Description, guidance and interpretation
• Informed decision-making
Principles and values
• Standardization
• Reproducibility and transparency
• Comparability of methods
• Extensibility
• Co-development
Contributions to NCPP development goals
I. Evaluation standards
– Develop a suite of evaluation metrics for downscaled data
– Design a common suite of tests to evaluate downscaling methods
II. Guidance documents
– Produce guidance on the advantages and limitations of various downscaling techniques for specific user applications, based on the quantitative evaluations
– Inform development of standard, vetted downscaled climate prediction and projection products
III. Translation for users and development of metadata
– Educate users in the evaluation and application of downscaled climate prediction and projection products
– Develop searchable structured metadata to describe downscaling methods as well as their evaluations
– Develop an initial platform for data discovery, exploration, and analysis that serves and makes use of the translational information
IV. Cyberinfrastructure
– Develop information technology infrastructure to support community analysis and provision of climate information