Transcript Document

Joint ECMWF-University meeting on interpreting
data from spaceborne radar and lidar: AGENDA
09:30 Introduction
University of Reading activities
• 09:35 Robin Hogan - Overview of CloudSat/CALIPSO/EarthCARE work at University
• 09:50 Julien Delanoe - Ice cloud retrievals from CloudSat, CALIPSO & MODIS
• 10:05 Lee Smith - Retrieval of liquid water content from CloudSat and CALIPSO
10:20-10:35 Coffee
ECMWF Activities
• 10:35 Marta Janiskova - Overview of CloudSat/CALIPSO activities at ECMWF
• 10:50 Olaf Stiller - Estimating representativity errors
• 11:05 Richard Forbes - ECMWF model cloud verification
• 11:20 Maike Ahlgrimm - Lidar derived cloud fraction for model comparison
11:35-12:30 Discussion
• Retrievals, forward models and error characteristics
• Verification of models
• Possibilities for collaboration
12:30 Lunch in the canteen
Recent CloudSat/CALIPSO/EarthCARE-related work at University of Reading
• Forward models and model evaluation
– Lidar forward modelling to evaluate the ECMWF model against IceSAT
– Multiple scattering model for spaceborne radar and lidar (Hogan)
• Retrievals and model evaluation
– LITE lidar estimates of supercooled water occurrence
– Radar retrievals of liquid clouds (Lee Smith, Anthony Illingworth)
– Variational radar-lidar-radiometer retrieval of ice clouds (Delanoe)
• ESA “CASPER” project (Clouds and Aerosol Synergy Products
from EarthCARE Retrievals)
– Defined the required cloud, aerosol and precipitation products
– Developed variational ice cloud retrieval for EarthCARE that uses the
cloud radar, the “High Spectral Resolution Lidar” (HSRL; the same
technology as ADM) and the infrared channels of the multispectral
imager
Ongoing/future work
• Forward models and model evaluation
– Use the CloudSat simulator to evaluate the 90-km resolution HiGEM
version of the Met Office climate model (Margaret Woodage)
– Use the CloudSat simulator to evaluate 1-km large-domain simulations of
tropical clouds in “CASCADE” (Thorwald Stein)
• Retrievals and model evaluation
– Ongoing comparisons with MO and ECMWF models (Smith & Delanoe)
– Use of retrievals to evaluate the CASCADE model (Thorwald Stein)
• CloudSat, CALIPSO and EarthCARE algorithm development
– Develop a “unified” retrieval algorithm for clouds, precipitation and
aerosols simultaneously using radar, lidar, infrared radiances and
possibly microwave radiances (Nicola Pounder, Hogan, Delanoe)
• Science questions
– What is the radiative impact of errors in model clouds? Use retrievals,
CERES observations and radiative transfer calcs. (Nicky Chalmers)
– What is the distribution of supercooled water in the atmosphere and
why is it so difficult to model? (Andrew Barrett)
ECMWF clouds vs IceSAT using a lidar forward model
Wilkinson, Hogan, Illingworth and Benedetti (Monthly Weather Review 2008)
• Cloud observations from IceSAT 0.5-micron
lidar (first data Feb 2004)
• Global coverage but lidar attenuated by thick
clouds: direct model comparison difficult
[Figure: lidar apparent backscatter coefficient (m⁻¹ sr⁻¹) vs latitude; an optically thick liquid cloud obscures the view of any clouds beneath]
• Solution: forward-model the measurements (including attenuation) using the
ECMWF variables
• Simulate lidar backscatter (see the sketch below):
– Create subcolumns with max-rand overlap
– Forward-model lidar backscatter from ECMWF water content & particle size
– Remove signals below lidar sensitivity
[Figure panels: ECMWF raw cloud fraction; ECMWF cloud fraction after processing; IceSAT cloud fraction]
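The three processing steps above can be sketched in a few lines. The following is a minimal illustration only, assuming idealised inputs (cloud fraction, in-cloud water content and effective radius on model levels, ordered top-down) and a crude extinction-to-backscatter conversion; it is not the forward model of Wilkinson et al. (2008), and all names and constants are assumptions.

```python
import numpy as np

def subcolumns_max_rand(cf, n_sub, rng):
    """Binary cloud occurrence for n_sub subcolumns using maximum-random
    overlap: adjacent cloudy layers maximally overlapped, layers separated
    by clear air randomly overlapped."""
    nlev = cf.size
    occ = np.zeros((n_sub, nlev), dtype=bool)
    x = rng.random(n_sub)                 # rank of each subcolumn, redrawn at clear breaks
    for k in range(nlev):
        if k > 0 and cf[k - 1] == 0.0:    # clear layer above: decorrelate (random overlap)
            x = rng.random(n_sub)
        occ[:, k] = x < cf[k]             # maximum overlap within contiguous cloud
    return occ

def attenuated_backscatter(wc, re, dz, lidar_ratio=18.0):
    """Crude attenuated backscatter: extinction from water content and
    effective radius, backscatter = extinction / lidar ratio, attenuated
    by the two-way transmission accumulated from the top."""
    rho = 1000.0                                           # liquid water density (kg m-3)
    alpha = np.where(re > 0, 1.5 * wc / (rho * re), 0.0)   # extinction coefficient (m-1)
    beta = alpha / lidar_ratio                             # backscatter (m-1 sr-1)
    tau = np.cumsum(alpha * dz) - 0.5 * alpha * dz         # optical depth to layer centre
    return beta * np.exp(-2.0 * tau)

# Toy column: fake cloud fraction, in-cloud water content and particle size
rng = np.random.default_rng(0)
nlev, n_sub, dz = 60, 50, 250.0
cf = np.clip(np.sin(np.linspace(0.0, np.pi, nlev)), 0.0, 1.0) * 0.4
wc_in = np.where(cf > 0.0, 2e-5, 0.0)          # in-cloud water content (kg m-3)
re = np.full(nlev, 10e-6)                      # 10 micron effective radius

occ = subcolumns_max_rand(cf, n_sub, rng)
beta_min = 1e-7                                # assumed lidar sensitivity (m-1 sr-1)
detected = np.zeros_like(occ)
for i in range(n_sub):
    beta_att = attenuated_backscatter(wc_in * occ[i], re, dz)
    detected[i] = beta_att > beta_min          # remove signals below lidar sensitivity
cf_processed = detected.mean(axis=0)           # "cloud fraction after processing"
```

Averaging the detected mask over the subcolumns gives the processed model cloud fraction that is then compared directly with the IceSAT cloud fraction.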
Global cloud fraction comparison
[Figure panels: ECMWF raw cloud fraction; ECMWF processed cloud fraction; IceSAT cloud fraction]
• Results for October 2003
– Tropical convection peaks too high
– Too much polar cloud
– Elsewhere agreement is good
• Results can be ambiguous
– An apparent low cloud underestimate could be a real error, or could be due to high cloud above being too thick
Examples of multiple scattering
• LITE lidar (λ < r, footprint ~1 km)
[Figure: LITE examples of stratocumulus and an intense thunderstorm, showing the surface echo and an apparent echo from below the surface]
• CloudSat radar (λ > r)
Fast multiple scattering forward model
Hogan and Battaglia (J. Atmos. Sci. 2008)
• New method uses the time-dependent two-stream approximation
• Agrees with Monte Carlo but ~10⁷ times faster (~3 ms)
• Added to CloudSat simulator
[Figure panels: CloudSat-like example; CALIPSO-like example]
Combining radar and lidar…
[Figure: global-mean cloud fraction from the CloudSat radar and the CALIPSO lidar; the radar misses a significant amount of ice]
Preliminary target classification
[Figure legend: radar and lidar; radar only; lidar only; insects; aerosol; rain; supercooled liquid cloud; warm liquid cloud; ice and supercooled liquid; ice; clear; no ice/rain but possibly liquid; ground]
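For illustration, a toy version of how such a classification might combine the two instruments with temperature is sketched below; the rules, thresholds and category names are assumptions made for this sketch, not the preliminary classification shown in the figure.

```python
def classify_gate(radar_detects: bool, lidar_detects: bool,
                  temperature_c: float, height_km: float) -> str:
    """Toy radar-lidar target classification for a single gate.
    All rules and thresholds are illustrative assumptions."""
    if not radar_detects and not lidar_detects:
        return "clear"
    if radar_detects and lidar_detects:
        # Seen by both instruments: cloud or precipitation
        if temperature_c > 0.0:
            return "warm liquid cloud" if height_km > 1.0 else "rain"
        return "ice and supercooled liquid" if temperature_c > -38.0 else "ice"
    if lidar_detects:
        # Lidar only: optically thin layers or aerosol below radar sensitivity
        if temperature_c > 0.0:
            return "warm liquid cloud or aerosol"
        return "supercooled liquid cloud" if temperature_c > -38.0 else "ice"
    # Radar only: lidar extinguished above, or non-cloud scatterers near the surface
    if temperature_c > 0.0:
        return "rain or insects" if height_km < 2.0 else "rain"
    return "ice"
```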
“Unified” retrieval framework
New ray of data: define state vector
Use classification to specify variables describing each species at each gate:
• Ice: extinction coefficient and N0*
• Liquid: liquid water content and number concentration
• Rain: rain rate and mean drop diameter
• Aerosol: extinction coefficient and particle size
Forward model:
• Radar model, including surface return and multiple scattering
• Lidar model, including HSRL channels and multiple scattering
• Radiance model, solar and IR channels
Compare to observations and check for convergence:
• Not converged: Gauss-Newton iteration derives a new state vector; repeat the forward model
• Converged: proceed to next ray of data
Legend: black = ingredients already developed (Delanoe and Hogan JGR 2008); red = ingredients remaining to be developed
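The loop above is a standard Gauss-Newton variational retrieval. A generic sketch is given below with a toy linear forward model; the function names, covariances and forward model are placeholders for illustration, not the unified algorithm itself (which forward-models radar, HSRL lidar and radiances gate by gate).

```python
import numpy as np

def gauss_newton_retrieval(y, x_a, S_y, S_a, forward_model, jacobian,
                           max_iter=20, tol=1e-3):
    """Generic Gauss-Newton solver minimising
    (y - H(x))^T S_y^-1 (y - H(x)) + (x - x_a)^T S_a^-1 (x - x_a)."""
    S_y_inv = np.linalg.inv(S_y)
    S_a_inv = np.linalg.inv(S_a)
    x = x_a.copy()
    for _ in range(max_iter):
        H_x = forward_model(x)            # simulated observations
        J = jacobian(x)                   # dH/dx, shape (n_obs, n_state)
        # Gauss-Newton step: (J^T Sy^-1 J + Sa^-1) dx = J^T Sy^-1 (y - Hx) + Sa^-1 (xa - x)
        A = J.T @ S_y_inv @ J + S_a_inv
        b = J.T @ S_y_inv @ (y - H_x) + S_a_inv @ (x_a - x)
        dx = np.linalg.solve(A, b)
        x = x + dx                        # derive a new state vector
        if np.max(np.abs(dx)) < tol:      # check for convergence
            break
    S_x = np.linalg.inv(A)                # retrieval error covariance
    return x, S_x

# Toy usage with a linear "forward model" (purely illustrative)
H = np.array([[1.0, 0.5], [0.2, 1.0], [0.3, 0.3]])
x_true = np.array([2.0, -1.0])
y = H @ x_true + 0.01 * np.random.default_rng(1).standard_normal(3)
x, S_x = gauss_newton_retrieval(y, np.zeros(2), 0.01**2 * np.eye(3), 4.0 * np.eye(2),
                                forward_model=lambda x: H @ x,
                                jacobian=lambda x: H)
```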
Mixed-phase clouds
• Supercooled water layers have a large radiative impact but are poorly modelled
• LITE lidar showed more supercooled water in the SH than the NH; two independent methods from MODIS show the same thing (Hogan et al., GRL 2004)
• What does CALIPSO show? What is the explanation? How can we model mixed-phase clouds?
Discussion points
• Is the intention to assimilate cloud radar and lidar directly?
– If so, are fast radar and lidar forward models of interest?
• If retrievals are to be assimilated, what variables are needed?
• Do you need error covariances, averaging kernels and information content?
Straightforward to calculate (see the sketch after this list), but:
– Complicated to store (state vector is a different size for each profile)
– Increases the data volume by an order of magnitude
• What are best diagnostics for assessing model performance?
– Means, PDFs, skill scores…
• ECMWF model variables are required by retrievals
– What is the error of model temperature, pressure and humidity?
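On the "straightforward to calculate" point: for a linearised retrieval these diagnostics follow directly from the Jacobian and the error covariances, as in the hypothetical helper below (standard optimal-estimation formulae; the symbols are assumptions, not the meeting's notation).

```python
import numpy as np

def retrieval_diagnostics(J, S_y, S_a):
    """Error covariance, averaging kernel and information content for a
    linear(ised) retrieval with Jacobian J, observation error covariance
    S_y and prior error covariance S_a."""
    S_y_inv = np.linalg.inv(S_y)
    S_a_inv = np.linalg.inv(S_a)
    S_x = np.linalg.inv(J.T @ S_y_inv @ J + S_a_inv)   # retrieval error covariance
    A = S_x @ J.T @ S_y_inv @ J                        # averaging kernel matrix
    dofs = np.trace(A)                                 # degrees of freedom for signal
    # Shannon information content in bits: 0.5 * log2( det(S_a) / det(S_x) )
    info = 0.5 * np.log2(np.linalg.det(S_a) / np.linalg.det(S_x))
    return S_x, A, dofs, info
```

The storage complication noted above arises because the state vector, and hence S_x and A, changes size from profile to profile.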
CloudSat simulator (Bodas et al.)
• Simulated radar reflectivity from sub-grid model (see the sketch below)
• Simulated radar reflectivity averaged to model grid
– How would this look with a high-res model?
• Observed CloudSat radar reflectivity
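A minimal sketch of the first two simulated quantities, assuming subcolumn ice water contents are already available from the model's sub-grid cloud generator; the Z-IWC relation and all names are illustrative stand-ins rather than the actual CloudSat simulator code.

```python
import numpy as np

def reflectivity_dbz(iwc_gm3, a=0.097, b=1.5):
    """Toy Z-IWC power law, Z = (IWC/a)^(1/b), Z in mm^6 m^-3, IWC in g m^-3.
    Coefficients are illustrative, not those used in the CloudSat simulator."""
    z_linear = np.where(iwc_gm3 > 0.0, (iwc_gm3 / a) ** (1.0 / b), 1e-10)
    return 10.0 * np.log10(z_linear)

# Hypothetical subcolumn ice water contents for one grid box: (n_subcolumns, n_levels)
rng = np.random.default_rng(0)
iwc_sub = np.where(rng.random((50, 40)) < 0.3, rng.gamma(2.0, 0.05, (50, 40)), 0.0)

# 1. Simulated reflectivity from the sub-grid (subcolumn) fields
dbz_sub = reflectivity_dbz(iwc_sub)

# 2. Averaged to the model grid: average Z in linear units over subcolumns,
#    then convert back to dBZ (not an average of dBZ values)
z_linear_mean = np.mean(10.0 ** (dbz_sub / 10.0), axis=0)
dbz_grid = 10.0 * np.log10(z_linear_mean)
```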
Example of mid-Pacific convection
[Figure: MODIS 11-micron channel plus CloudSat radar and CALIPSO lidar cross-sections with retrieved extinction (m⁻¹); height (km) vs time since start of orbit (s). Deep convection penetrated only by radar; cirrus detected only by lidar; mid-level liquid clouds]
Supercooled water in models
• A year of data from the Met Office and ECMWF
– Easy to calculate occurrence of supercooled water with optical depth τ > 0.7
[Figure panels: one model with prognostic ice and liquid+vapour variables; one model with prognostic cloud water, with ice/liquid diagnosed from temperature]
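As an illustration of the "easy to calculate" diagnostic above, a toy occurrence calculation is sketched below, assuming model profiles of liquid water content, temperature and per-layer optical depth; the variable names, random test data and LWC-to-optical-depth scaling are assumptions for this sketch only.

```python
import numpy as np

def supercooled_occurrence(lwc, temperature_c, tau_layer, tau_threshold=0.7):
    """Fraction of profiles containing at least one supercooled liquid layer
    (liquid water at sub-zero temperature) whose optical depth exceeds the
    threshold. Arrays are (n_profiles, n_levels); illustrative only."""
    supercooled = (lwc > 0.0) & (temperature_c < 0.0) & (tau_layer > tau_threshold)
    return np.mean(np.any(supercooled, axis=1))

# Toy example with random profiles (stand-ins for a year of model data)
rng = np.random.default_rng(2)
n_prof, n_lev = 1000, 50
temperature_c = np.linspace(15.0, -60.0, n_lev)[None, :] + rng.normal(0.0, 5.0, (n_prof, n_lev))
lwc = np.where(rng.random((n_prof, n_lev)) < 0.1, rng.gamma(2.0, 0.02, (n_prof, n_lev)), 0.0)
tau_layer = 50.0 * lwc          # crude per-layer optical depth from LWC (assumed scaling)
print(supercooled_occurrence(lwc, temperature_c, tau_layer))
```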