
Sensing the Visibility Range
at Low Cost in the SAFESPOT
Road Side Unit
Nicolas Hautière1, Jérémie Bossu1, Erwan Bigorgne1,
Nicolas Hiblot2, Adberrahmane Boubezoul1, Benoit Lusetti2,
Didier Aubert2
1. LEPSiS, INRETS/LCPC, Univ. Paris-Est, France
2. LIVIC, INRETS/LCPC, France
ITS World Congress, Stockholm, Sweden
Overview of the system

The proposed system is a data chain that produces environmental information in the SAFESPOT Local Dynamic Map, based on the detection of meteorological events (rain, fog, black ice, wet road) by one or several sensors of the SAFESPOT Road Side Unit.

It refines these events, or may create a new event, by combining the outputs of the different sensors, in particular CCTV cameras.

By querying the status of vehicle actuators with respect to their past locations, the component can also extend or reduce the detection area of the environmental event.

The information is intended to be used in 'Hazard & Incident Warning' and 'Speed Alert' applications.
The SAFESPOT Road Side Unit

[Architecture diagram: data sources (roadside sensing systems, SF vehicles via the VANET router, legacy systems, GPS) feed the SP2 data fusion framework (data reception, object refinement, situation refinement, message generation) over TCP/IP and UDP; results are stored in the LDM server and exposed through the LDM API to applications (traffic event/accident, weather/road status, vehicle manoeuvres, etc.); gateways link the roadside unit to the SP3, SP4 and SP5 subprojects.]
The Local Dynamic Map

Real-time map of the vehicle surroundings with static and dynamic safety information.

[Illustration: a map from a provider with landmarks for referencing, overlaid with vehicles, the ego vehicle, the road side unit, and dynamic items such as congestion, an accident, a tree on the road, and temporary regional information.]
Road visibility

[Photos: daytime fog.]

Based on the French standard NF-P-99-320:
- The SF system shall detect visibilities below 400 m.
- The SF system should assign low visibilities to one of the four categories:

Visibility range | Visibility distance [m]
1 | 200 to 400
2 | 100 to 200
3 | 50 to 100
4 | < 50

- The system should detect the origin of the visibility reduction: fog or hydrometeors.
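The category table above translates directly into a lookup. A minimal sketch (the function name is ours, not from the SAFESPOT specification):

```python
def visibility_category(distance_m):
    """Map a measured visibility distance (metres) to the NF-P-99-320
    low-visibility category; returns None when visibility is >= 400 m,
    i.e. no low-visibility event has to be reported."""
    if distance_m >= 400:
        return None   # visibility is not considered reduced
    if distance_m >= 200:
        return 1      # 200 to 400 m
    if distance_m >= 100:
        return 2      # 100 to 200 m
    if distance_m >= 50:
        return 3      # 50 to 100 m
    return 4          # below 50 m
```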
Data sources: CCTV for Visibility (1/3) – Overview

Technology
- The sensing system aims to detect and classify critical weather conditions (dense fog, hard rain showers) and to estimate the visibility range using classical CCTV cameras.
- Camera used: DALSA Genie M-1400, resolution 1392 x 1040, 1/2" sensor, 4.65 µm x 4.65 µm pixels, 15 frames/s.
- Compatible with existing video surveillance solutions.

Detection software
- A background modelling approach, based on a mixture of Gaussians, is used to separate the foreground from the background.
- Since fog is a steady weather phenomenon, the background image is used to detect and quantify it; since rain is a dynamic phenomenon, the foreground is used to detect it.

Functionality | Operation range | Accuracy
Fog presence | Day and night | 100% by day
Fog intensity | Day | >90% by day
Visibility range | Day and night | >90% by day
Rain presence | Day | 95%
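The background/foreground split driving the fog and rain detectors can be illustrated with a per-pixel Gaussian background model. The deck's software uses a mixture of Gaussians; the sketch below keeps a single Gaussian per pixel for brevity, and all names and parameter values are ours:

```python
import numpy as np

class GaussianBackground:
    """Simplified per-pixel background model: one running Gaussian per pixel.
    (The SAFESPOT software uses a mixture of Gaussians; this single-mode
    version only illustrates the background/foreground separation.)"""

    def __init__(self, first_frame, alpha=0.05, k=2.5):
        self.mean = first_frame.astype(np.float64)
        self.var = np.full_like(self.mean, 100.0)  # initial variance guess
        self.alpha = alpha                          # learning rate
        self.k = k                                  # foreground threshold in sigmas

    def update(self, frame):
        frame = frame.astype(np.float64)
        diff = frame - self.mean
        # foreground mask: pixels deviating from the background Gaussian
        foreground = diff ** 2 > (self.k ** 2) * self.var
        # exponential running update of the background mean and variance
        self.mean += self.alpha * diff
        self.var = (1 - self.alpha) * self.var + self.alpha * diff ** 2
        return foreground
```

Fog, being steady, ends up absorbed into `self.mean` (the background image, used for fog quantification), while dynamic rain streaks appear in the foreground mask.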
Data sources: CCTV for Visibility (2/3) – Fog detection

[Pipeline: original sequence → driving space area extraction → fog detection + meteorological visibility estimation (Vmet); in parallel, estimation of the mobilized visibility distance (Vmob).]
Data sources: CCTV for Visibility (3/3) – Hydrometeors detection

[Pipeline: original sequence → segmentation → detection → classification.]
Situation refinement of visibility range

Data fusion at RSU level:
- Fog presence identified by the CCTV camera.
- Confirmation by the weather station.
- Combination of the different sensor outputs to compute a single visibility range descriptor.

Possible other data sources:
- Mobile fog sensor
- Visibility meter
- Fog lamps status

At road network level:
- The visibility range is the spatial barycenter of the different sensors' outputs.
- The corresponding uncertainty is the sum of:
  - the uncertainty of the sensors themselves,
  - the uncertainty coming from the distance to the data sources,
  - the uncertainty coming from the status of fog lamps on the road section.
Situation refinement of visibility range

Results on the LCPC test track

[Maps: meteorological visibility map and uncertainty map, showing the SAFESPOT camera, an in-vehicle camera, and road sections with fog lights on and fog lights off.]
Conclusion and perspectives

- The performance of the detection modules is good, despite a lack of ground truth data; a more systematic evaluation should be carried out.
- A general framework to fuse different visibility-range-related data sources has been proposed.
- Fusion with low-cost active sensors is planned.
- Integration and test of the 'Hazard & Incident Warning' and 'Speed Alert' applications on the CG22 test site.
Annex 1: Data fusion – At the RSU level

At the RSU level, fog presence is determined by the CCTV camera and may be confirmed or not by the weather station using physical constraints due to fog formation: humidity $\geq 90\%$ and wind $\leq 7\,\mathrm{m\,s^{-1}}$.

Assuming Gaussian variables, Vmet and Vmob are fused to obtain a single descriptor and determine the visibility range $V$:

$$V_k = \sigma_k^2 \left( \frac{V_{met,k}}{\sigma_{V_{met}}^2} + \frac{V_{mob,k}}{\sigma_{V_{mob}}^2} \right), \qquad \frac{1}{\sigma_k^2} = \frac{1}{\sigma_{V_{met}}^2} + \frac{1}{\sigma_{V_{mob}}^2}$$

A simple linear Kalman filter is then used to compute a weighted iterative least-squares regression:

Prediction:
$$\hat{V}_k^- = \hat{V}_{k-1}, \qquad P_k^- = P_{k-1} + Q$$

Correction:
$$K_k = P_k^- \left( P_k^- + \sigma_k^2 \right)^{-1}, \qquad \hat{V}_k = \hat{V}_{k-1} + K_k \left( V_k - \hat{V}_{k-1} \right), \qquad P_k = \left( 1 - K_k \right) P_k^-$$
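The inverse-variance fusion and the scalar Kalman filter above can be sketched in a few lines (function names are ours; a toy scalar implementation, not the SAFESPOT code):

```python
def fuse_visibility(v_met, var_met, v_mob, var_mob):
    """Inverse-variance fusion of the meteorological visibility (Vmet) and
    the mobilized visibility distance (Vmob) into one descriptor V_k."""
    var_k = 1.0 / (1.0 / var_met + 1.0 / var_mob)
    v_k = var_k * (v_met / var_met + v_mob / var_mob)
    return v_k, var_k


def kalman_step(v_prev, p_prev, v_k, var_k, q=1.0):
    """One step of the scalar Kalman filter (constant-state model) that
    smooths successive fused measurements V_k."""
    # prediction: the state is assumed constant, uncertainty grows by Q
    p_pred = p_prev + q
    # correction: blend prediction and measurement by their uncertainties
    gain = p_pred / (p_pred + var_k)
    v_new = v_prev + gain * (v_k - v_prev)
    p_new = (1.0 - gain) * p_pred
    return v_new, p_new
```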

Annex 1: Data fusion – At the road network level (1/2)

At a point of the road network, the visibility range depends on the surrounding data sources. Each data source has its own uncertainty due to its measurement principle.

[Plot: measurement error [%] versus range [m] (0 to 800 m) for the SAFESPOT camera and an in-vehicle camera.]

Since fog is a local phenomenon, the uncertainty also increases strongly with the distance $d$ to the data source:

$$\sigma_j(d) = \sigma_{j,s} + \sigma_{S,s}\, u(d), \qquad u(d) = 1 - e^{-d^2/(2s^2)}$$

where $\sigma_{j,s}$ is the $j$-th source uncertainty and the distance-related term $\sigma_{S,s}\, u(d)$ grows with $d$ through the Gaussian-shaped profile $u(d)$, which rises from 0 to 1 with scale $s$.
Annex 1: Data fusion – At the road network level (2/2)

The visibility distance is thus expressed as the spatial barycenter of the different sensors' outputs:

$$V_g = \frac{\displaystyle\sum_{j=1}^{p} \frac{V_j}{\sigma_j(d_j) + \sigma_\Phi}}{\displaystyle\sum_{j=1}^{p} \frac{1}{\sigma_j(d_j) + \sigma_\Phi}}$$

Corresponding uncertainty:

$$\sigma_g = \left( \sum_{j=1}^{p} \frac{1}{\sigma_j(d_j) + \sigma_\Phi} \right)^{-1}$$

where $\sigma_\Phi$ is the uncertainty term derived from the fog lamps status on the road section. A threshold $\sigma^*$ is used to filter out uncertain data.
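The barycenter with distance-dependent uncertainty can be sketched as follows (function names, the scale $s$ and the value of $\sigma_{S,s}$ are illustrative choices, not SAFESPOT's):

```python
import math

def source_uncertainty(sigma_js, d, s=200.0, sigma_ss=10.0):
    """Uncertainty of one source seen from distance d: its own measurement
    uncertainty sigma_js plus a distance term that rises from 0 to sigma_ss.
    (s and sigma_ss are illustrative values, not SAFESPOT's.)"""
    u = 1.0 - math.exp(-d * d / (2.0 * s * s))
    return sigma_js + sigma_ss * u

def barycenter_visibility(readings, sigma_phi=0.0):
    """Spatial barycenter of sensor visibility outputs.
    readings: list of (V_j, sigma_js, d_j) tuples;
    sigma_phi: uncertainty term from the fog lamps status."""
    weights = [1.0 / (source_uncertainty(sj, d) + sigma_phi)
               for (_, sj, d) in readings]
    v_g = sum(w * v for w, (v, _, _) in zip(weights, readings)) / sum(weights)
    sigma_g = 1.0 / sum(weights)
    return v_g, sigma_g
```

A distant source gets a larger total uncertainty and therefore a smaller weight in the barycenter, which matches the locality of fog.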
Annex 2: References

[1] M. Jokela, M. Kutila, J. Laitinen, F. Ahlers, N. Hautière, T. Schendzielorz. "Optical Road Monitoring of the Future Smart Roads – Preliminary Results", International Journal of Computer and Information Science and Engineering, 1(4):240-245, 2007.
[2] N. Hautière, E. Bigorgne, D. Aubert. "Daytime Visibility Range Monitoring through use of a Roadside Camera", IEEE Intelligent Vehicles Symposium (IV'08), Eindhoven, The Netherlands, June 4-6, 2008.
[3] N. Hautière, E. Bigorgne, J. Bossu, D. Aubert. "Meteorological conditions processing for vision-based traffic monitoring", IEEE International Workshop on Visual Surveillance (VS2008), in conjunction with ECCV, Marseille, France, October 2008.
[4] N. Hautière, J. Bossu, E. Bigorgne, A. Boubezoul, N. Hiblot, B. Lusetti, D. Aubert. "Sensing the visibility range at low cost in the SAFESPOT Road Side Unit", accepted in ITS World Congress (ITS'09), Stockholm, Sweden, September 2009.
[5] N. Hautière, A. Boubezoul. "Extensive Monitoring of Visibility Range through Roadside and In-Vehicle Sensors Combination", submitted to IEEE International Conference on Advanced Video and Signal-based Surveillance (AVSS'09), Genoa, Italy, October 2009.
[6] J. Bossu, N. Hautière, J.-P. Tarel. "Use of a probabilistic model of segment orientation to detect hydrometeors in video sequences" (in French), XXIIème colloque GRETSI (GRETSI'09), Dijon, France, September 2009.