Forecast Verification Research
Beth Ebert and Laurie Wilson, JWGFVR co-chairs
WWRP-JSC meeting, Geneva, 21-24 Feb 2011
Aims
Verification component of WWRP, in collaboration with WGNE, WCRP and CBS
• Develop and promote new verification methods
• Provide training on verification methodologies
• Ensure forecast verification is relevant to users
• Encourage sharing of observational data
• Promote the importance of verification as a vital part of experiments
• Promote collaboration among verification scientists, model developers and forecast providers
Working group members
• Beth Ebert (BOM, Australia) – co-chair
• Laurie Wilson (CMC, Canada) – co-chair
• Barb Brown (NCAR, USA)
• Barbara Casati (Ouranos, Canada)
• Caio Coelho (CPTEC, Brazil)
• Anna Ghelli (ECMWF, UK)
• Martin Göber (DWD, Germany)
• Simon Mason (IRI, USA)
• Marion Mittermaier (Met Office, UK)
• Pertti Nurmi (FMI, Finland)
• Joel Stein (Météo-France)
• Yuejian Zhu (NCEP, USA)
FDPs and RDPs (Forecast Demonstration Projects and Research and Development Projects)
Sydney 2000 FDP
Beijing 2008 FDP/RDP
MAP D-PHASE
SNOW-V10 RDP
Typhoon Landfall FDP
Sochi 2014
Severe Weather FDP
Beijing 2008 FDP
Real Time Forecast Verification (RTFV) system
• Fast qualitative and quantitative feedback on forecast system performance in real time
– Verification products generated whenever new observations arrive
• Ability to inter-compare forecast systems
• 3 levels of complexity
– Visual (quick look)
– Statistics (quantitative)
– Diagnostic (more information)
Training
In person
Online
B08FDP lessons for real time verification
• Real time verification considered very useful
• Forecasters preferred scatterplots and quantile-quantile plots (see the sketch below)
• Format and standardization of nowcast products was critical to making a robust verification system
• Difficult to compare "like" products created with different aims (e.g., QPF for warning vs hydrological applications)
• Verification system improvements
– User-friendly web display
– More user options for exploring results
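As an illustration of the quantile-quantile comparison forecasters preferred, here is a minimal Python sketch; it assumes NumPy and Matplotlib, and the array names are placeholders rather than B08FDP variables:

```python
import numpy as np
import matplotlib.pyplot as plt

def qq_plot(forecasts, observations):
    """Compare forecast and observed value distributions quantile by quantile."""
    q = np.linspace(0.01, 0.99, 99)            # percentiles to compare
    fq = np.quantile(forecasts, q)
    oq = np.quantile(observations, q)
    plt.plot(oq, fq, "o", markersize=3)
    lo, hi = min(oq[0], fq[0]), max(oq[-1], fq[-1])
    plt.plot([lo, hi], [lo, hi], "k--", linewidth=1)  # 1:1 line = matching distributions
    plt.xlabel("Observed quantiles")
    plt.ylabel("Forecast quantiles")
    plt.show()
```

Points falling on the 1:1 line indicate that the forecast and observed distributions agree, which is what makes this display quick for forecasters to read.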
SNOW-V10
• Verification strategy
– User-oriented verification of all forecasts for the Olympic period, tuned to VANOC decision points
– Verification of parallel model forecasts for January to August 2010
– Nowcast and regional model verification
• Rich dataset
Suggested categories for SNOW-V10 verification
Table 5 (2nd revised suggestion for SNOW-V10 verification); categories are listed in order, from Cat 1 upward
Temperature (°C): T < -25 | -25 ≤ T < -20 | -20 ≤ T < -4 | -4 ≤ T < -2 | -2 ≤ T < 0 | 0 ≤ T < +2 | +2 ≤ T < +4 | T ≥ +4
RH (%): RH < 30 | 30 ≤ RH < 65 | 65 ≤ RH < 90 | 90 ≤ RH < 94 | 94 ≤ RH < 98 | RH ≥ 98
Winds (m/s): w < 3 | 3 ≤ w < 4 | 4 ≤ w < 5 | 5 ≤ w < 7 | 7 ≤ w < 11 | 11 ≤ w < 13 | 13 ≤ w < 15 | 15 ≤ w < 17 | w ≥ 17
Wind Gust (m/s): same categories as wind speed
Wind Direction (º): 339 ≤ d or d < 24 (N) | 24 ≤ d < 69 (NE) | 69 ≤ d < 114 (E) | 114 ≤ d < 159 (SE) | 159 ≤ d < 204 (S) | 204 ≤ d < 249 (SW) | 249 ≤ d < 294 (W) | 294 ≤ d < 339 (NW)
Visibility (m): v < 30 | 30 ≤ v < 50 | 50 ≤ v < 200 | 200 ≤ v < 300 | 300 ≤ v < 500 | v ≥ 500
Ceiling (m): c < 50 | 50 ≤ c < 120 | 120 ≤ c < 300 | 300 ≤ c < 750 | 750 ≤ c < 3000 | c ≥ 3000
Precip Rate (mm/hr): r = 0 (None) | 0 < r ≤ 0.2 (Trace) | 0.2 < r ≤ 2.5 (Light) | 2.5 < r ≤ 7.5 (Moderate) | r > 7.5 (Heavy)
Precip Type: No Precip | Liquid | Freezing | Frozen | Mixed (w/Liquid) | Unknown
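Verification against Table 5 begins by binning continuous observations and forecasts into these categories. A minimal sketch, assuming NumPy; the edge lists transcribe the temperature and visibility rows above, and all names are illustrative:

```python
import numpy as np

# Right-open bin edges from Table 5: values below the first edge fall in
# category 1, values at or above the last edge in the final category.
TEMP_EDGES = [-25, -20, -4, -2, 0, 2, 4]   # degC -> 8 categories
VIS_EDGES  = [30, 50, 200, 300, 500]       # m    -> 6 categories

def categorize(values, edges):
    """Map continuous values to 1-based Table 5 category numbers."""
    return np.digitize(values, edges) + 1

print(categorize(np.array([-30.0, -10.0, 1.5, 6.0]), TEMP_EDGES))  # [1 3 6 8]
```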
Example: Visibility verification
lam1k Min. Visibility (m) at VOL, HSS = 0.095
Contingency table (rows: forecast category; columns: observed category; bins in metres, right-open as in Table 5):

Forecast \ Observed    <30   30–50   50–200   200–300   300–500    >500   Total
<30                      0       0        0         0         0       0       0
30–50                    0       0        0         0         0       0       0
50–200                   0       0       52        20        22      43     137
200–300                  0       0       76        18        19     103     216
300–500                  0       1       26        15        12      60     114
>500                     0       9      831       246       170    3743    4999
Total                    0      10      985       299       223    3949    5466
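The HSS (Heidke skill score) in the caption can be recovered directly from these counts. Below is a minimal Python sketch of the standard multi-category Heidke skill score (proportion correct relative to chance agreement); the function name is ours, and it reproduces the 0.095 quoted above:

```python
import numpy as np

def heidke_skill_score(table):
    """Multi-category Heidke skill score from a contingency table
    (rows = forecast category, columns = observed category)."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    accuracy = np.trace(table) / n                                 # proportion correct
    chance = (table.sum(axis=1) * table.sum(axis=0)).sum() / n**2  # agreement expected by chance
    return (accuracy - chance) / (1.0 - chance)

# Visibility table from the slide, without the marginal totals
vis = [[0, 0,   0,   0,   0,    0],
       [0, 0,   0,   0,   0,    0],
       [0, 0,  52,  20,  22,   43],
       [0, 0,  76,  18,  19,  103],
       [0, 1,  26,  15,  12,   60],
       [0, 9, 831, 246, 170, 3743]]
print(round(heidke_skill_score(vis), 3))   # 0.095
```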
Sochi 2014
Standard verification
Possible verification innovations:
• Road weather forecasts
• Real-time verification
• Timing of events – onset, duration, cessation
• Verification in the presence of observation uncertainty
• Neighborhood verification of high-resolution NWP, including in the time-height plane
• Spatial verification of ensembles
• User-oriented probability forecast verification
Collaboration
• WWRP working groups
• THORPEX
– GIFS-TIGGE
– Subseasonal prediction
– Polar prediction
• CBS
– Severe Wx FDPs
– Coordination Group on Forecast Verification
• SRNWP
• COST 731
• ECMWF TAC subgroup on verification measures
Spatial Verification Method Intercomparison Project
• International comparison of many new spatial verification methods
• Methods applied by researchers to same datasets (precipitation; perturbed cases; idealized cases)
• Subjective forecast evaluations
• Workshops: 2007, 2008, 2009
• Weather and Forecasting special collection
http://www.rap.ucar.edu/projects/icp
Spatial Verification Method Intercomparison Project
• Future variables
– "Messy" precipitation
– Wind
– Cloud
• Future datasets
– MAP D-PHASE / COPS
– SRNWP / European data
– Nowcast dataset(s)
• Verification test bed
Publications
• Recommendations for verifying deterministic and probabilistic quantitative precipitation forecasts
• Recommendations for verifying cloud forecasts (this year)
• Recommendations for verifying tropical cyclone forecasts (next year)
• January 2008 special issue of Meteorological Applications on forecast verification
• 2009-2010 special collection of Weather & Forecasting on spatial verification
• DVD from the 2009 Helsinki Verification Tutorial
Outreach
• Verification workshops and tutorials – on-site, travelling
http://www.cawcr.gov.au/projects/verification/
• EUMETCAL training modules
• Verification web page
• Sharing of tools
International Verification Methods Workshops
4th Workshop – Helsinki 2009
Tutorial
• 26 students from 24 countries
• 3 days
• Lectures, hands-on (took tools home)
• Group projects - presented at workshop
Workshop
• ~100 participants
• Topics:
– User-oriented verification
– Verification tools & systems
– Coping with obs uncertainty
– Weather warning verification
– Spatial & scale-sensitive methods
– Ensembles
– Evaluation of seasonal and climate predictions
5th International Verification Methods Workshop
• Melbourne, December 2011
• 3-day tutorial + 3-day scientific workshop
• Additional tutorial foci
– Verifying seasonal predictions
– Brief intro to operational verification systems
• Capacity building for FDPs/RDPs, SWFDP, etc.
New focus areas for JWGFVR research
"Seamless verification" - consistent across space/time scales
[Schematic: forecast types from nowcasts and very short range through NWP, subseasonal, seasonal and decadal prediction to climate change, arranged by spatial scale (point, local, regional, global) against forecast lead time (minutes, hours, days, weeks, months, years, decades)]
Approaches:
• deterministic / categorical
• probabilistic
• distributional
• other?
New focus areas for JWGFVR research
Spatial methods for verifying ensemble predictions
• Neighborhood, scale-separation, feature-based, deformation (see the sketch below)
[Figure: feature-based comparison of forecast and observed rain objects in terms of rain area, average rain, maximum rain and rain volume]
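Of the four families listed above, the neighborhood approach is the easiest to sketch. Below is a minimal Python example of the fractions skill score, one common neighborhood method, applied to a single gridded field; it assumes NumPy and SciPy, the names are illustrative, and extending such scores across ensemble members is precisely the open research question:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def fss(forecast, observed, threshold, window):
    """Fractions skill score: compare event-area fractions within
    square neighborhoods of `window` x `window` grid points."""
    f = uniform_filter((forecast >= threshold).astype(float), size=window)
    o = uniform_filter((observed >= threshold).astype(float), size=window)
    mse = np.mean((f - o) ** 2)                   # error in neighborhood fractions
    mse_ref = np.mean(f ** 2) + np.mean(o ** 2)   # largest attainable error
    return 1.0 - mse / mse_ref if mse_ref > 0 else np.nan
```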
New focus areas for JWGFVR research
Extreme events
New focus areas for JWGFVR research
Warnings, including timing
[Figure: warning verification results plotted as hit rate against success ratio (1-FAR)]
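Both axes of such a diagram come from the 2x2 warning/event contingency table. A minimal sketch with illustrative counts (the numbers are made up, not SNOW-V10 or B08FDP results):

```python
def hit_rate(hits, misses):
    """Fraction of observed events that were warned for (POD)."""
    return hits / (hits + misses)

def success_ratio(hits, false_alarms):
    """Fraction of issued warnings that verified, i.e. 1 - FAR."""
    return hits / (hits + false_alarms)

# Illustrative counts: 42 warned events, 18 missed events, 25 false alarms
print(hit_rate(42, 18))        # 0.7
print(success_ratio(42, 25))   # ~0.627
```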
Thank you