ATMOSPHERIC PROCESSES
in SPACE-ATMOSPHERE-SEA/LAND system
Submodels
WMO WEATHER FORECASTING RANGES
Nowcasting: a description of current weather parameters and 0-2 hours' description of forecasted weather parameters.
Very short-range: up to 12 hours' description of weather parameters.
Short-range: beyond 12 hours and up to 72 hours' description of weather parameters.
Medium-range: beyond 72 hours and up to 240 hours' description of weather parameters.
Extended-range: beyond 10 days and up to 30 days' description of weather parameters, usually averaged and expressed as a departure from climate values for that period.
Long-range (from 30 days up to two years):
  Monthly outlook: description of averaged weather parameters expressed as a departure (deviation, variation, anomaly) from climate values for that month (not necessarily the coming month).
  Three-month or 90-day outlook: description of averaged weather parameters expressed as a departure from climate values for that 90-day period (not necessarily the coming 90-day period).
  Seasonal outlook: description of averaged weather parameters expressed as a departure from climate values for that season.
Climate forecasting (beyond two years):
  Climate variability prediction: description of the expected climate parameters associated with the variation of inter-annual, decadal and multi-decadal climate anomalies.
  Climate prediction: description of expected future climate including the effects of both natural and human influences.
[Diagram: observational data sources feeding the models - SYNOP stations, aircraft reports, aerological (upper-air) data, radars (range r = 100 km).]
The system of equations
(conservation laws applied to individual parcels of air)
(from E.Kalnay)
V. Bjerknes (1904) pointed out for the first time that there is a complete set of 7
equations with 7 unknowns that governs the evolution of the atmosphere:
• conservation of the 3-dimensional momentum (equations of motion),
• conservation of dry air mass (continuity equation),
• the equation of state for perfect gases,
• conservation of energy (first law of thermodynamics),
• equations for the conservation of moisture in all its phases.
They include in their solution fast gravity and sound waves, and therefore in
their space and time discretization they require the use of smaller time
steps, or alternative techniques that slow them down. For models with a
horizontal grid size larger than 10 km, it is customary to replace the vertical
component of the equation of motion with its hydrostatic approximation, in
which the vertical acceleration is neglected compared with gravitational
acceleration (buoyancy). With this approximation, it is convenient to use
atmospheric pressure, instead of height, as a vertical coordinate.
Written out in spherical coordinates (\lambda longitude, \phi latitude, r radius):

\frac{d\mathbf{v}_a}{dt} = \sum \mathbf{F}/m

\frac{du}{dt} = -\frac{1}{\rho\, r\cos\phi}\frac{\partial p}{\partial \lambda} + F_\lambda + \left(2\Omega + \frac{u}{r\cos\phi}\right)(v\sin\phi - w\cos\phi)

\frac{dv}{dt} = -\frac{1}{\rho\, r}\frac{\partial p}{\partial \phi} + F_\phi - \left(2\Omega + \frac{u}{r\cos\phi}\right)u\sin\phi - \frac{vw}{r}

\frac{dw}{dt} = -\frac{1}{\rho}\frac{\partial p}{\partial r} - g + F_r + \left(2\Omega + \frac{u}{r\cos\phi}\right)u\cos\phi + \frac{v^2}{r}

\frac{\partial \rho}{\partial t} = -\nabla\cdot(\rho\mathbf{v})

p = \rho R T, \qquad \theta = T\left(\frac{p_0}{p}\right)^{R/C_p}

Q = C_p\frac{dT}{dt} - \frac{1}{\rho}\frac{dp}{dt}, \qquad \frac{ds}{dt} = C_p\,\frac{1}{\theta}\frac{d\theta}{dt} = \frac{Q}{T}

\frac{\partial(\rho q)}{\partial t} = -\nabla\cdot(\rho\mathbf{v}q) + \rho(E - C)

where E and C are the rates of evaporation and condensation.
Operational model resolutions, December 2003:
• ECMWF: T511L60 - 40 km; EPS: T255L60 - 80 km
• DWD: GME (L41) - 40 km; LM (L35) - 7 (2.8) km
• France: ARPEGE (L41) - 23-133 km; ALADIN (L41) - 9 km
• HIRLAM: (L16-31) - 5-55 km
• UK: UM (L30) - 60 km; (L38) - 12 km
• USA: AVN (T254L64) - 60 km; ETA (L60) - 12 km
• Japan: GSM (L40) - 60 km; MSM (L40) - 10 km
• Russia: T85L31 - 150 km; (L31) - 75 km; Moscow region (300 km x 300 km) - 10 km
FEATURES OF INFORMATION AND COMPUTATIONAL TECHNOLOGIES
IN ATMOSPHERIC SCIENCES
• Coordinate systems: p, sigma, z, eta, hybrid
• Models of the atmosphere, grid steps: global 40-60 km, local 7-12 km
• Methods: splitting, semi-Lagrangian schemes, ensembles, nonhydrostatic models, grids
• Data assimilation: 3(4)D-Var, Kalman filter (cost function sketched after this list)
• Reanalyses:
  - NCEP/NCAR (USA): 50 years (1948-...; T62L28, ~210 km); Reanalysis-2 (ETA RR, 32 km, 45 layers)
  - ECMWF: ERA-15 (TL106L31, ~150 km, 1979-1993); ERA-40 (TL159L60, ~120 km, 3D-Var, mid-1957-2001)
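For reference, the cost function that variational ("3(4)D-Var") assimilation minimizes has the standard form (x_b background state, B and R background- and observation-error covariances, H observation operator, y observations):

J(x) = \frac{1}{2}(x - x_b)^T B^{-1}(x - x_b) + \frac{1}{2}\,(H(x) - y)^T R^{-1}(H(x) - y) ;

in 4D-Var the observation term is summed over the assimilation window, with the forecast model carrying x to the observation times.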
Modern and possible further development of computational technologies:
ensemble simulation
One method (used by the ECMWF forecast system) is based on finding random perturbations \delta g with the help of a subset of the eigenvectors of the linear operator L, obtained by linearizing the operator N of the finite-difference scheme of the governing thermo-hydrodynamic equations,

\varphi_h^{j+1} = \varphi_h^{j} + N(\varphi_h^{j}),

where \varphi_h^{j} is the grid vector function \varphi_h = (u_h, v_h, w_h, p_h, T_h, \ldots); the other notation is standard.
The advantage of this method is its clear physical meaning; its drawbacks are, first, the need to compute the eigenvectors of the linearization L and, second, the need to run a sufficiently large number of additional forecasts.
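A minimal illustration of the ensemble idea (a sketch only, not the ECMWF singular-vector machinery: the perturbations below are plain random noise, and the Lorenz (1963) system stands in for the operator N; all parameters are illustrative):

! Toy ensemble forecast: perturb one initial state with random noise
! and advance every member with phi^(j+1) = phi^j + N(phi^j); the
! Lorenz (1963) tendencies stand in for the model operator N.
program ensemble_sketch
  implicit none
  integer, parameter :: nmem = 10, nstep = 1000
  double precision, parameter :: dt = 0.01d0, eps = 1.0d-3
  double precision :: x(3, nmem), r(3), mean(3)
  integer :: m, j

  call random_seed()
  do m = 1, nmem                        ! randomly perturbed initial states
     call random_number(r)
     x(:, m) = (/ 1.0d0, 1.0d0, 1.0d0 /) + eps*(2.0d0*r - 1.0d0)
  end do

  do j = 1, nstep                       ! advance all members
     do m = 1, nmem
        x(:, m) = x(:, m) + dt*tendency(x(:, m))
     end do
  end do

  mean = sum(x, dim=2)/nmem             ! ensemble mean forecast
  print '(a,3f10.4)', 'ensemble mean: ', mean

contains

  function tendency(s) result(f)        ! N(phi): Lorenz (1963) system
    double precision, intent(in) :: s(3)
    double precision :: f(3)
    f(1) = 10.0d0*(s(2) - s(1))
    f(2) = s(1)*(28.0d0 - s(3)) - s(2)
    f(3) = s(1)*s(2) - (8.0d0/3.0d0)*s(3)
  end function tendency

end program ensemble_sketch

The spread among the members then gives a flow-dependent estimate of forecast uncertainty, which is the point of the additional forecasts mentioned above.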
ECMWF:
FORECASTING SYSTEM - DECEMBER 2003
Model:
Smallest half-wavelength resolved:
40 km (triangular spectral truncation 511)
Vertical grid:
60 hybrid levels (top pressure: 10 Pa)
Time-step:
15 minutes
Numerical scheme: Semi-Lagrangian, semi-implicit time-stepping formulation (a one-dimensional sketch of semi-Lagrangian advection follows this slide).
Number of grid points in model:
20,911,680 upper-air, 1,394,112 in land surface and sub-surface layers. The grid for
computation of physical processes is a reduced, linear Gaussian grid, on which single-level
parameters are available. The grid spacing is close to 40 km.
Variables at each grid point (recalculated at each time-step):
Wind, temperature, humidity, cloud fraction and water/ice content, ozone content (also pressure
at surface grid-points)
Physics:
orography (terrain height and sub-grid-scale), drainage, precipitation, temperature, ground
humidity, snow-fall, snow-cover & snow melt, radiation (incoming short-wave and out-going longwave), friction (at surface and in free atmosphere), sub-grid-scale orographic drag - gravity
waves and blocking effects, evaporation, sensible & latent heat flux, oceanic waves.
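The semi-Lagrangian scheme named above is what allows time steps beyond the advective CFL limit. A minimal one-dimensional, constant-wind sketch (the operational scheme is three-dimensional with higher-order interpolation; all parameters here are illustrative):

! 1-D semi-Lagrangian advection on a periodic grid: each point takes
! its new value from the departure point x_i - u*dt, located here by
! linear interpolation; the Courant number (2.5) deliberately exceeds 1.
program sl_advect
  implicit none
  integer, parameter :: nx = 100, nstep = 200
  double precision, parameter :: dx = 1.0d0, dt = 2.5d0, u = 1.0d0
  double precision :: phi(nx), phinew(nx), xd, w
  integer :: i, j, i0, i1

  do i = 1, nx                                     ! bell-shaped initial profile
     phi(i) = exp(-((i - 30)/5.0d0)**2)
  end do

  do j = 1, nstep
     do i = 1, nx
        xd = i - u*dt/dx                           ! departure point (grid units)
        xd = modulo(xd - 1.0d0, dble(nx)) + 1.0d0  ! periodic wrap into [1,nx+1)
        i0 = int(xd)                               ! lower neighbour
        i1 = modulo(i0, nx) + 1                    ! upper neighbour (wraps)
        w  = xd - i0                               ! linear-interpolation weight
        phinew(i) = (1.0d0 - w)*phi(i0) + w*phi(i1)
     end do
     phi = phinew
  end do
  print '(a,f8.4)', 'max after advection: ', maxval(phi)
end program sl_advect

Stability is not tied to u*dt/dx < 1, which is what permits the 15-minute step at 40 km grid spacing.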
ECMWF:
FORECASTING SYSTEM - DECEMBER 2003
Data Assimilation:
Analysis:
Mass & wind (four-dimensional variational multivariate analysis on 60 model levels)
Humidity (four-dimensional variational analysis on model levels up to 250 hPa)
Surface parameters (sea surface temperature from NCEP Washington analysis, sea
ice from SSM/I satellite data), soil water content, snow depth, and screen level
temperature and humidity
Data used:
Global satellite data (SATOB/AMV, (A)TOVS, Quikscat, SSM/I, SBUV, GOME,
Meteosat7 WV radiance),
Global free-atmosphere data (AIREP, AMDAR, TEMP, PILOT, TEMP/DROP,
PROFILERS),
Oceanic data (SYNOP/SHIP, PILOT/SHIP, TEMP/SHIP, DRIBU),
Land data (SYNOP). Data checking and validation is applied to each parameter
used. Thinning procedures are applied when observations are redundant at the model
scale.
Other regional (mesoscale) models:
 the Penn State/NCAR Mesoscale Model (e.g., Dudhia, 1993),
 the CAPS Advanced Regional Prediction System (Xue et al., 1995),
 NCEP's Regional Spectral Model (Juang et al., 1997),
 the Mesoscale Compressible Community (MC2) model (Laprise et al., 1997),
 the CSU RAMS (Tripoli and Cotton, 1980),
 the US Navy COAMPS (Hodur, 1997).
WRF Development Teams (courtesy NCAR)
Oversight: Numerics and Software (J. Klemp).
Working Groups (numbered WG1-WG16 in the original chart):
• Dynamic Model Numerics (W. Skamarock)
• Software Architecture, Standards, and Implementation (J. Michalakes)
• Data Assimilation (T. Schlatter)
• Model Physics (J. Brown)
• Standard Initialization (J. McGinley)
• Analysis and Validation (K. Droegemeier)
• Workshops, Distribution, and Support (J. Dudhia)
• Analysis and Visualization (L. Wicker)
• Community Involvement (W. Kuo)
• 3-D Var (J. Derber)
• Advanced Techniques (D. Barker)
• Model Testing and Verification (C. Davis)
• Atmospheric Chemistry (P. Hess)
• Land Surface Models (J. Wegiel)
• Ensemble Forecasting (D. Stensrud)
• Regional Climate Modeling (proposed)
• Operational Implementation (G. DiMego)
• Data Handling and Archive (G. DiMego)
• Operational Requirements (G. DiMego)
• Operational Forecaster Training (T. Spangler)
Model Physics in High Resolution NWP
[Diagram: physics vs. horizontal grid size. Below ~1 km: resolved convection, 3-D radiation, LES. Above ~10 km: cumulus parameterization, two-stream radiation, PBL parameterization. The 1-10 km range is a physics "No Man's Land".]
From Joe Klemp, NCAR (Bad Orb, 23-27.10.2003)
Weather Research and Forecasting Model
Goals: develop an advanced mesoscale forecast and assimilation system, and accelerate research advances into operations.
• Collaborative partnership, principally among NCAR, NOAA, DoD, OU/CAPS, FAA, and the university community
• Multi-agency WRF governance; development conducted by 15 WRF Working Groups
• Software framework provides portable, scalable code with plug-compatible modules
• Ongoing active testing and rapidly growing community use
  - Over 1,400 registered community users, annual workshops and tutorials for the research community
  - Daily experimental real-time forecasting at NCAR, NCEP, NSSL, FSL, AFWA, U. of Illinois
• Operational implementation at NCEP and AFWA in FY04
[Figures: 36 h WRF precipitation forecast vs. analyzed precipitation, 27 Sept. 2002.]
From Joe Klemp, NCAR (Bad Orb, 23-27.10.2003)
Hurricane Isabel
[Satellite image: NOAA-17 AVHRR, 13 Sep 03, 14:48 GMT]
From Joe Klemp, NCAR (Bad Orb, 23-27.10.2003)
Hurricane Isabel Track
[Figure: observed track to 18/1700Z; 4 km WRF initialized 17/0000Z; 10 km WRF initialized 15/1200Z]
From Joe Klemp, NCAR (Bad Orb, 23-27.10.2003)
Hurricane Isabel 3 h Precip Forecast
WRF Model, 10 km grid, 5-day forecast, initialized 12 UTC 15 Sep 03
From Joe Klemp, NCAR (Bad Orb, 23-27.10.2003)
48 h Hurricane Isabel Reflectivity Forecast
Initialized 00 UTC 17 Sep 03 [panels: radar composite vs. 4 km WRF forecast]
From Joe Klemp, NCAR (Bad Orb, 23-27.10.2003)
Hurricane Isabel Reflectivity at Landfall
18 Sep 2003 1700 Z [panels: radar composite vs. 41 h forecast from 4 km WRF]
From Joe Klemp, NCAR (Bad Orb, 23-27.10.2003)
Hurricane Isabel Surface-Wind Forecast
WRF Model, 4 km grid, 2-day forecast, initialized 00 UTC 17 Sep 03
From Joe Klemp, NCAR (Bad Orb, 23-27.10.2003)
WRF Mass Coordinate Core
• Terrain-following hydrostatic pressure vertical coordinate
• Arakawa C-grid, two-way interacting nested grids (soon)
• 3rd order Runge-Kutta split-explicit time differencing (sketched below)
• Conserves mass, momentum, dry entropy, and scalars using flux-form prognostic equations
• 5th order upwind or 6th order centered differencing for advection
• Physics for CR applications: Lin microphysics, YSU PBL, OSU/MM5 LSM, Dudhia shortwave/RRTM longwave radiation, no cumulus parameterization
From Joe Klemp, NCAR (Bad Orb, 23-27.10.2003)
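The 3rd-order Runge-Kutta step above follows Wicker and Skamarock (2002). A scalar sketch of the three stages (the acoustic sub-stepping that makes the scheme "split-explicit" is omitted, and the decay tendency is only a stand-in):

! WRF-style RK3: advance dt/3, then dt/2 from the start, then the full
! dt, each stage re-evaluating the tendency r(phi).
program rk3_sketch
  implicit none
  double precision :: phi, phi1, phi2
  double precision, parameter :: dt = 0.1d0
  integer :: j

  phi = 1.0d0
  do j = 1, 100
     phi1 = phi + dt/3.0d0*r(phi)      ! stage 1
     phi2 = phi + dt/2.0d0*r(phi1)     ! stage 2
     phi  = phi + dt*r(phi2)           ! stage 3 (full step)
  end do
  print '(a,2es14.6)', 'numeric vs exact: ', phi, exp(-10.0d0)

contains

  double precision function r(p)       ! stand-in tendency: simple decay
    double precision, intent(in) :: p
    r = -p
  end function r

end program rk3_sketch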
Model Configuration for 4 km Grid
• Domain
– 2000 x 2000 km, 501 x 501 grid
– 50 mb top, 35 levels
– 24 s time step
• Initialization
– Interpolated from gridded analyses
– BAMEX: 40 km Eta CONUS analysis
– Isabel: 1° GFS global analysis (~110 km)
• Computing requirements
– 128 Processors on IBM SP Power 4 Regatta
– Run time: 106 min/24h of forecast
From Joe Klemp, NCAR (Bad Orb, 23-27.10.2003)
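A quick consistency check on these numbers (mine, not from the slide): with \Delta x = 4 km and \Delta t = 24 s, the advective Courant number for a 100 m/s jet is U\Delta t/\Delta x = 100 \times 24 / 4000 = 0.6, comfortably inside the stability range of the RK3 scheme sketched earlier.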
North American Early Guidance System

Prediction Model                      | Data Assimilation                                            | Date
12 km Meso Eta                        | 12 km 3DVAR, radial velocity                                 | 9/30/2002
10 km Meso Eta, improved physics      | 10 km hourly update & improved background error covariance   | 2/28/2004
9 km NMM, top @ 2 mb, hourly output   | 9 km, AIRS, GOES imagery & move top to 2 mb                  | 5/31/2005
8 km WRF                              | 8 km WRF 4DDA                                                | 5/31/2006
7 km WRF, improved physics            | 7 km, absorption/scattering in radiative transfer            | 5/31/2008
6 km WRF                              | 6 km, aerosols in radiative transfer & reflectivity          | 5/31/2009
5 km WRF L100                         | 5 km, NPP, advanced 4DDA, NPOESS, IASI & air quality         | 5/31/2010
Global Forecast System (GFS)

Prediction Model                      | Data Assimilation                                            | Date
T-254 / L64                           | 3D-VAR, AMSU-B, Quikscat                                     | 9/30/2002
T-254 / L64, add 2 passive tracers    | Grid-point version, AIRS, GOES imagery                       | 2/28/2004
45 km / L64                           | 3-D background error covariance, cloud analysis, minimization| 5/31/2005
45 km / L64 + improved microphysics   | Absorption / scattering in radiative transfer                | 5/31/2006
40 km / L80                           | Aerosols in radiative transfer, GIFTS                        | 5/31/2008
40 km / L80                           | NPP, integrated SST analysis                                 | 5/31/2009
35 km / L100                          | Advanced 4DDA, NPOESS, IASI + air quality                    | 5/31/2010
Timeline for WRF at NCEP
• North American WRF: operational in FY05
• Hurricane WRF: operational in FY06
• Rapid Refresh (RUC) WRF (hourly): operational in FY07
• WRF SREF: operational in FY07
• Others? (Fire Wx, Homeland Security, etc.) using the best WRF deterministic model
http://www.metoffice.com/research/nwp/numerical/unified_model/index.html
The Unified Model
The Unified Model is the name given to the suite of atmospheric and oceanic numerical modelling
software developed and used at the Met Office.
The formulation of the model supports global and regional domains and is applicable to a
wide range of temporal and spatial scales that allow it to be used for both numerical
weather prediction and climate modelling as well as a variety of related research activities.
The Unified Model was introduced into operational service in 1992.
Since then, both its formulation and capabilities have been substantially enhanced.
New Dynamics
A major upgrade to the Met Office Global Numerical Weather Prediction model was implemented
on 7th August 2002.
Submodels
The Unified Model is made up of a number of numerical submodels representing different aspects of
the earth's environment that influence the weather and climate.
Like all coupled models the Unified Model can be split up in a number of different ways,
with various submodel components switched on or off for a specific modelling application.
The Portable Unified Model (PUM)
A portable version of the Unified Model has also been developed suitable for running
on workstations and other computer systems.
http://www.metoffice.com/research/nwp/numerical/unified_model/new_dynamics.html
A major upgrade to the Met Office Global Numerical Weather Prediction model
was implemented on 7th August 2002.
The package of changes was under trial for over a year and is known as "New Dynamics".
This document details some of the key changes that are part of the New Dynamics package.
 Non-hydrostatic model with height as the vertical co-ordinate.
 Charney-Phillips grid-staggering in the vertical.
 Arakawa C-grid staggering in the horizontal.
 Two time-level, semi-Lagrangian advection and semi-implicit time stepping.
 Edwards-Slingo radiation scheme with non-spherical ice spectral files.
 Large-scale precipitation includes prognostic ice microphysics.
 Vertical gradient area large-scale cloud scheme.
 Convection with convective available potential energy (CAPE) closure, momentum transports and convective anvils.
 A boundary-layer scheme which is non-local in unstable regimes.
 Gravity-wave drag scheme which includes flow blocking.
 GLOBE orography dataset.
 The MOSES (Met Office Surface Exchange Scheme) surface hydrology and soil model scheme.
 Predictor-corrector technique with no extraction of a basic-state profile.
 Three-dimensional Helmholtz-type equation solved using the GCR technique.
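In generic form (the actual New Dynamics discretization differs in detail), a semi-implicit predictor-corrector step leads to a three-dimensional Helmholtz-type problem for the pressure variable,

(\nabla^2 - \lambda)\,\Pi' = R ,

with R assembled from known fields; the GCR (generalized conjugate residual) iteration named above is a Krylov method suited to exactly this kind of non-symmetric elliptic solve.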
The operational forecast system at Météo-France is based
on two different numerical applications of the same code:
1. ARPEGE-IFS,
2. additional code to build the limited-area model ALADIN.
ARPEGE-IFS has been developed jointly by Météo-France
and ECMWF (ARPEGE is the usual name in Toulouse, IFS in Reading):
- the ECMWF model for medium-range forecasts (4-7 days),
- a Toulouse variable-mesh version for short-range predictions (1-4 days).
The ALADIN library has been developed jointly by Météo-France
and the national meteorological or hydrometeorological services of 14 countries:
Austria, Belgium, Bulgaria, Croatia, Czech Republic, Hungary, Moldova,
Morocco, Poland, Portugal, Romania, Slovakia, Slovenia, Tunisia.
GME (global model):
 hydrostatic model,
 41 (31) layers and horizontal resolution ~40 (60) km,
 prognostic equations: horizontal wind components, temperature, specific humidity, specific cloud water content and surface pressure,
 physical processes: a comprehensive representation of the precipitation process, a mass-flux convection parameterisation, a radiation model with cloud-radiation interaction, turbulent exchange in the free atmosphere based on a level-2 scheme, surface-layer fluxes based on a bulk approach, a two-layer soil model including energy and mass budget equations for snow cover, and the representation of sub-grid scale orographic effects,
 the topography of the earth's surface.
LM (local model):
 nonhydrostatic model,
 resolution ~2.8 (7) km,
 GME + 3 additional prognostic equations: vertical wind speed, pressure deviation, turbulent kinetic energy (TKE),
 vertical turbulent diffusion (level-2.5 scheme), a laminar sublayer at the earth's surface.
Trajectory model (data supply from DWD's LM or GME forecast models):
- Forecast variables: f(\lambda, \phi, p/z, t): u, v, (w), ps
- Numerical scheme: Euler-Cauchy with iteration (sketched below)
- Interpolation: 1st order in time, 2nd or 3rd order in space.
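A minimal sketch of the Euler-Cauchy step with iteration (a Petterssen-type predictor-corrector; in the operational system the wind v would be interpolated from LM or GME fields at the orders listed above, while here an analytic rotating wind and all parameters are purely illustrative):

! One forward trajectory: predictor x1 = x + dt*v(x,t), then the
! corrector x1 = x + dt/2*(v(x,t) + v(x1,t+dt)) is iterated.
program traj_sketch
  implicit none
  double precision :: x(2), x1(2), t
  double precision, parameter :: dt = 600.0d0    ! 10-min step [s]
  integer :: n, it

  x = (/ 1.0d5, 0.0d0 /)                         ! start point [m]
  t = 0.0d0
  do n = 1, 36                                   ! 6 h forward
     x1 = x + dt*v(x, t)                         ! Euler predictor
     do it = 1, 3                                ! iterate the corrector
        x1 = x + 0.5d0*dt*(v(x, t) + v(x1, t + dt))
     end do
     x = x1
     t = t + dt
  end do
  print '(a,2es12.4)', 'end point [m]: ', x

contains

  function v(p, tt) result(w)                    ! stand-in wind field [m/s]
    double precision, intent(in) :: p(2), tt     ! (tt unused in this stand-in)
    double precision :: w(2)
    w(1) = -1.0d-4*p(2)                          ! solid-body rotation
    w(2) =  1.0d-4*p(1)
  end function v

end program traj_sketch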
Daily routine (ca. 1500 trajectories)
1. LM trajectories (7 km, Central and Western Europe):
 48 h forward trajectories for 36 nuclear and chemical installations.
2. GME trajectories (60 km resolution, worldwide):
 120 h forward trajectories for 60 European nuclear sites,
 120 h backward trajectories for 37 German radioactivity measuring sites,
 backward trajectories for the international GAW stations,
 backward trajectories for 5 African cities in the METEOSAT-MDD programme, disseminated daily via satellite from Bracknell,
 backward trajectories for the German polar research stations Neumayer (Antarctica) and Koldewey (Spitzbergen) and the research ships 'Polarstern' and 'Meteor'.
Operational emergency trajectory system
(trajectory system for scientific investigations)
1. LM or GME trajectory models
2. Data supply from LM or GME forecasts, or analyses from the current database or archives
3. Forward and backward trajectories for a choice of offered or freely eligible stations at optional heights and times within the current time period of 7-12 days
4. Interactive menu executed by forecasters, operational 24 h.
Further Development
of the Local Systems LME and LMK 2003 to 2006
LME: Local model LM for whole of Europe; mesh size 7 km and 40
layers; 78-h forecasts from 00, 12 and 18 UTC.
LMK: LM-”Kürzestfrist”; mesh size < 3 km and 50 layers; 18-h forecasts
from 00, 03, 06, 09, 12, 15, 18, 21 UTC for Germany with explicit
prediction of deep convection.
1. Data assimilation
• 2 Q 2005
Use satellite (GPS) and radar data (reflectivity, VAD winds)
• 1 Q 2006
Use European wind profiler and satellite data
Further Development of the Local Systems LME and
LMK 2003 to 2006
2. Local modelling
• 2 Q 2004
Increase model domain (7 km mesh) from 325x325 up to
753x641 gridpoints (covering whole of Europe), 40 layers
• 3 Q 2005
New convection scheme (Kain-Fritsch ?)
LMK: LM-Kürzestfrist ("very-short-range LM")
Model-based system for nowcasting
and very short range forecasting
Goals:
Prediction of severe weather on the mesoscale.
Explicit simulation of deep convection.
Method:
18-h predictions of LM initialised every three hours,
mesh size < 3 km
Usage of new observations:
SYNOP: every 60 min; METAR: every 30 min,
GPS: every 30 min; VAD winds: every 15 min,
Reflectivity: every 15 min; Wind profiler: every 10 min,
Aircraft data.
[Diagram: a new 18-h LMK forecast starts every three hours, at 00, 03, 06, 09, 12, 15, 18 and 21 UTC.]
High-resolution Regional Model HRM
• Operational NWP model at 13 services worldwide
• Hydrostatic, (rotated) latitude/longitude grid
• Operators of second-order accuracy
• 7 to 28 km mesh size, various domain sizes
• 20 to 35 layers (hybrid, sigma/pressure)
• Prognostic variables: ps, u, v, T, qv, qc, qi
• Same physics package as GME
• Programming: Fortran90, OpenMP/MPI for parallelization
• From 00 and 12 UTC: forecasts up to 78 hours
• Lateral boundary conditions from GME at 3-hourly intervals
General structure of a regional NWP system
[Diagram: topographical data, initial data (analysis) and lateral boundary data feed the regional NWP model; its direct model output (DMO) feeds graphics/visualization, MOS and Kalman-filter post-processing, applications (wave model, trajectories), verification and diagnostics.]
Short Description of the High-Resolution Regional Model (HRM)
Hydrostatic limited-area meso-α- and meso-β-scale numerical weather prediction model.

Prognostic variables:                 Diagnostic variables:
Surface pressure       ps             Vertical velocity        ω
Temperature            T              Geopotential             φ
Water vapour           qv             Cloud cover              clc
Cloud water            qc             Diffusion coefficients   tkvm/h
Cloud ice              qi
Horizontal wind        u, v
Several surface/soil parameters
Current operational users of the HRM
• Brazil, Directorate of Hydrography & Navigation
• Brazil, Instituto Nacional de Meteorologia
• Bulgaria, National Meteorological & Hydrological Service
• China, Guangzhou Regional Meteorological Centre
• India, Space Physics Lab.
• Israel, Israel Meteorological Service
• Italy, Italian Meteorological Service
• Kenya, National Meteorological Service
• Oman, National Meteorological Service (DGCAM)
• Romania, National Meteorological & Hydrological Service
• Spain, National Met. Institute
• United Arab Emirates, National Met. Institute
• Vietnam, National Meteorological & Hydrological Service; Hanoi University
Numerics of the HRM
• Regular or rotated latitude/longitude grid
• Mesh sizes between 0.25° and 0.05° (~28 to 6 km)
• Arakawa C-grid, second-order centered differencing
• Hybrid vertical coordinate, 20 to 35 layers
• Split semi-implicit time stepping; Δt = 150 s at Δλ = 0.25°
• Lateral boundary formulation due to Davies
• Radiative upper boundary condition as an option
• Fourth-order horizontal diffusion, slope correction
• Adiabatic implicit nonlinear normal-mode initialization
Physical parameterizations of the HRM
• δ-two-stream radiation scheme (Ritter and Geleyn, 1992) including long- and shortwave fluxes in the atmosphere and at the surface; full cloud-radiation feedback; diagnostic derivation of partial cloud cover (rel. hum. and convection)
• Grid-scale precipitation scheme including parameterized cloud microphysics (Doms and Schättler, 1997)
• Mass-flux convection scheme (Tiedtke, 1989) differentiating between deep, shallow and mid-level convection
• Level-2 scheme of vertical diffusion in the atmosphere, similarity theory (Louis, 1979) at the surface
• Two-layer soil model including snow and interception storage; three-layer version for soil moisture as an option
Computational aspects of the HRM
• Fortran 90 and C (only for input/output: GRIB code)
• Multi-tasking for shared-memory computers based on standard OpenMP
• Efficient dynamic memory allocation
• NAMELIST variables for control of the model
• Computational cost: ~3100 Flop per grid point, layer and time step
• Interface to data of the global model GME available, providing initial and/or lateral boundary data
• Built-in diagnostics of physical processes
• Detailed print-out of meteographs
[Figure: total wallclock time (min per 24 h forecast) for HRM - Africa (361×321 grid points, 31 layers, 28 km) on an IBM RS/6000 SP versus number of processors (nproc up to 16): 242.57, 128.00, 97.51, 73.19, 62.93, 53.48, 47.63 and 42.87 min.]
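These timings can be cross-checked against the ~3100 Flop per grid point, layer and time step quoted above (assuming the Δt = 150 s that goes with the 0.25°/28 km mesh): 361 × 321 × 31 ≈ 3.6·10^6 cells give ≈ 1.1·10^10 Flop per step, and 576 steps per 24 h give ≈ 6.4·10^12 Flop in total. At 242.57 min that is ≈ 0.4 GFlop/s sustained on the smallest configuration shown, rising to ≈ 2.5 GFlop/s at 42.87 min, a parallel speedup of about 5.7.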
[Figure: pie chart of the time distribution (%) of the main processes of HRM on an IBM RS/6000 SP - start-up of HRM, l.b.c. update, diabatic processes, explicit forecast, SI scheme, Asselin filtering, condensation/evaporation, diagnostics/meteographs, post-processing of GRIB files; shares are 39.5, 25.2, 9.8, 7.9, 4.3, 4.1, 3.6, 3.2 and 2.4%.]
Further Development of the HRM 2003 to 2006
• An MPI version of HRM for Linux PC clusters, developed by Vietnam, has been available to all HRM users since July 2003.
• A 3D-Var data assimilation scheme developed by Italy will be available to experienced HRM users in early 2004.
• The physics packages in GME and HRM will remain exactly the same.
• The interaction between the different HRM groups should be intensified.
• The first HRM User's Meeting will take place in Rio de Janeiro (Brazil) in October 2004.
Project partners: WL|Delft, RIZA, DMI, SMHI, DWD, GRDC, Univ. Lancaster, Univ. Bristol, ECMWF, JRC Ispra, Univ. Bologna.
1) Run the complete assimilation-forecast system for GME and LM for the three historical flood events, for a period of roughly 2 weeks for each flood event.
2) Perform, for the three flood events, high-resolution analyses of 24 h precipitation amounts on the basis of surface observations.
3) Develop a prototype scheme for near-real-time 24 h precipitation analysis on the basis of radar data and synoptic precipitation observations.
In addition to these tasks, the operational model results according to task 1) for the period of the Central European flood were retrieved from the archives and supplied to the project ftp server.
Deutscher Wetterdienst (DWD)
meteorological data set
for the development of a flood forecasting system
DWD prepared data sets which include all meteorological fields necessary as
input fields to hydrological models. Four flood cases in different European
river basins for different seasons (autumn, winter and summer) were
investigated:
a) Po           - November 1994 (autumn),
b) Rhine, Meuse - January 1995 (winter),
c) Odra         - July 1997 (summer),
d) Elbe         - August 2002 (summer).
The fields are based on the analysis of observed precipitation and on model
forecasts:
 48 h forecasts by DWD's limited-area model LM (ca. 7 km resolution, model area is Central Europe, data provided at hourly intervals);
 156 h forecasts by DWD's global model GME (model resolution ca. 60 km, data provided at 6-hourly intervals on a 0.75° × 0.75° grid with NW corner at 75°N, 35°W and SE corner at 30°N, 45°E);
 analyses of 24 h precipitation observations
for the LM area in ca. 7 km resolution.
Maps of the constant fields for GME and LM.
PRECIPITATION DISTRIBUTION (kg/m²)
for 05 Nov 1994, 06 UTC to 06 Nov 1994, 06 UTC:
a) analysis based on synoptic (631) stations;
b) analysis based on synoptic (631) and MAP (5173) stations;
c) LM model prediction (18 to 42 hours forecast).
Precipitation stations used:
• Austria           263
• Czech Republic    800
• Germany          4238
• Poland           1356
• Switzerland       435
• Altogether       7092
Computer power and model resolution of NWP centres, 2002/2003, with plans to 2006 and beyond:

ECMWF:      2002/03: 0.96 Tf / 10 Tf; TL511 (40 km) L60.  2004: 20 Tf; TL511 (40 km) L60.  2005/06: 18-28 Tf; TL799 (25 km) L91.
DWD:        2002/03: 1.92 Tf / 2.88 Tf; 60 km L31, 7 km L35.  2004-06: 40 km L40 and 30 km L45 global; 7 / 2 km L35 local.
NCEP:       2002/03: 7.3 Tf; T170 (80 km) L42, 12 km L60.  2004-06: 15.6 Tf; T254 (50 km) L64, TL611 (40 km) L42, 8 km.  2007: 28 Tf; G 30 km, L 5 km.
JMA Japan:  2002/03: 0.768 Tf; T106 (120 km) L40, 20 / 10 km L40.  2004-06: 6 Tf; TL319 (60 km) L42.  2007: 20 Tf; TL959 (20 km) L60, 5 km L50.
CMA China:  2002/03: 0.384 Tf; T213 (60 km) L31, 25 km L20.  Plans: 3.84 Tf (?).
HMC Russia: 2002/03: 35 Gf; T85 (150 km) L31, 75 km L30.  Plans: ? Tf; T169 (80 km) L31, 15 km; 2008: 5 km.
ECMWF:
EQUIPMENT IN USE (end of 2003)
Computer equipment being readied for operational use:

Machine               Processors   Memory     Storage   Tape
2 IBM Cluster 1600    1820         2500 GB    12 TB     73
5 x IBM p660 nodes    26           40 GB      20 TB
3 HP K580 machines    18           22 GB      0.4 TB
Central Computer System (CCS)

Phase      Date     Processors   Clock speed    Memory    Disk space   Tape storage
Current    2001     2432         375 MHz        1216 GB   30 TB        200 TB
Phase I    9/2002   1408         1.3 GHz        1408 GB   42 TB        1250 TB
Phase II   6/2004   2752         1.8+1.3 GHz    2752 GB   84 TB        2500 TB
But what do we do if we have no CCS?
Results of V. Galabov's (Bulgaria) experiments with different PCs
LINUX (Red Hat 7.3)
PGI Workstation 4.0 (Portland Group Fortran and C++)
HRM DWD (hydrostatic High Resolution Model)
93 × 73 grid points, 31 layers, 0.125° grid spacing (14 km), forecast for 48 hours

CPU                                    Memory                  Run time
AMD Duron 1300 MHz                     384 MB PC133 SDRAM      96 min
AMD Athlon XP 1800+                    256 MB DDR266 RAM       81 min
Pentium 4 2.4 GHz                      512 MB DDR333 SDRAM     70 min
Intel Xeon workstation, 1 x 2.4 GHz    2048 MB RDRAM PC800     60 min
Intel Xeon workstation, 2 x 2.4 GHz    2048 MB RDRAM PC800     33 min
program TestOMP
  ! OpenMP test kernel (cleaned up: duplicate and missing declarations
  ! fixed; timing via omp_get_wtime instead of the compiler-specific
  ! gettim/getdat calls of the original)
  use omp_lib
  implicit none
  integer :: k, n, tid, nthreads, max_threads, procs
  logical :: dynamic, nested
  double precision :: d(5000)
  double precision :: t1, t2

  t1 = omp_get_wtime()

  max_threads = OMP_GET_MAX_THREADS()
  procs       = OMP_GET_NUM_PROCS()
  dynamic     = OMP_GET_DYNAMIC()
  nested      = OMP_GET_NESTED()

!$OMP PARALLEL PRIVATE (nthreads, tid, n, k, d)
  ! d is private here: in the original it was shared, so the threads
  ! raced on it (harmless for a pure load test, but a bug nonetheless)
  tid      = OMP_GET_THREAD_NUM()
  nthreads = OMP_GET_NUM_THREADS()
!$OMP DO SCHEDULE (STATIC, 5000)
  do n = 1, 10000
     do k = 1, 5000
        d(k) = sin(dble(k+n))**2 + cos(dble(k+n))**2
     end do
  end do
!$OMP END DO
!$OMP END PARALLEL

  t2 = omp_get_wtime()
  print '(a,f8.2,a)', 'elapsed: ', t2 - t1, ' s'
end program TestOMP
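For reference (era-appropriate commands, assuming default installations): with Intel Fortran 8.0 the test builds with OpenMP enabled via ifort -openmp testomp.f90, with PGI via pgf90 -mp testomp.f90, and the thread count is set through the OMP_NUM_THREADS environment variable.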
Test machine: Pentium 4 3.06 GHz; 2 GB DDR DIMM PC3200; 120 GB Seagate disk

OS                     BIOS (Hyper-Threading)   Compiler             OpenMP   Time
Windows XP             disabled                 Visual Fortran 6.5   -        3.59 s
Windows XP             enabled                  Visual Fortran 6.5   -        3.63 s
Linux (Mandrake 9.2)   disabled                 Intel Fortran 8.0    -        3.59 s
Linux (Mandrake 9.2)   enabled                  Intel Fortran 8.0    +        2.38 s
The future (from E. Kalnay)
There has been an amazing improvement in the quality of forecasts based on NWP guidance.
From the active research currently taking place, one can envision that the next decade will continue to
bring improvements, especially in the following areas:
• Detailed short-range forecasts, using storm-scale models able to provide skillful
predictions of severe weather.
• More sophisticated methods of data assimilation able to extract the maximum possible
information from observing systems, especially remote sensors such as satellites
and radars.
• Development of adaptive observing systems, where additional observations are placed
where ensembles indicate that there is rapid error growth (low predictability).
• Improvement in the usefulness of medium-range forecasts, especially through the use
of ensemble forecasting.
• Fully coupled atmospheric-hydrological systems, where the atmospheric model
precipitation is appropriately downscaled and used to extend the length of river flow
prediction.
• More use of detailed atmosphere-ocean-land coupled models, where the effect of long
lasting coupled anomalies such as SST and soil moisture anomalies leads to more
skillful predictions of anomalies in weather patterns beyond the limit of weather
predictability (about two weeks).
• More guidance to government and the public on areas such as air pollution, UV
radiation and transport of contaminants, which affect health.
• An explosive growth of systems with emphasis on commercial applications of NWP,
from guidance on the state of highways to air pollution, flood prediction, guidance to
agriculture, construction, etc.
1. Observing system
2. Telecommunication system
3. Computer system
4. Data assimilation
5. Model
6. Postprocessing