Sensor-Based Mapping and Sensor Fusion


By Rashmi Patel
Overview

Bayes Rule
Occupancy Grids [Alberto Elfes, 1987]
Vector Field Histograms [J. Borenstein, 1991]
Sensor Fusion [David Conner, 2000]
Bayes Rule

Posterior (conditional) probability: the probability assigned to an event given some evidence.

Conditional probability example, flipping coins:

P(H) = 0.5
P(H | H) = 1
P(HH) = 0.25
P(HH | first flip H) = 0.5
Bayes Rule, continued

Bayes Rule:

P(A|B) = P(A) P(B|A) / P(B)

The useful thing about Bayes Rule is that it allows you to "turn around" conditional probabilities.

Example: P(Cancer) = 0.1, P(Smoker) = 0.5, P(S|C) = 0.8
P(C|S) = ?
P(C|S) = P(S|C) * P(C) / P(S) = 0.8 * 0.1 / 0.5 = 0.16
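As a quick check of the arithmetic, here is a minimal Python sketch of the same calculation (the function name and structure are illustrative):

```python
# Bayes Rule: P(A|B) = P(B|A) * P(A) / P(B), using the slide's numbers.
def bayes(p_a, p_b, p_b_given_a):
    """Turn P(B|A) around into P(A|B)."""
    return p_b_given_a * p_a / p_b

# P(Cancer) = 0.1, P(Smoker) = 0.5, P(Smoker|Cancer) = 0.8
print(bayes(p_a=0.1, p_b=0.5, p_b_given_a=0.8))  # 0.16
```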
Occupancy Grids [Elfes]

In the mid 80's, Elfes started implementing cheap ultrasonic transducers on an autonomous robot.
Because of the intrinsic limitations of any sonar, it is important to compose a coherent world model using information gained from multiple readings.
Occupancy Grids

The grid stores the probability that cell Ci = cell(x, y) is occupied:

O(Ci) = P[s(Ci) = OCC]

Phases of creating a grid:

Collect readings, generating O(Ci)
Update the occupancy grid, creating a map
Match and combine maps from multiple locations
Occupancy Grids

24 transducers, in a ring, spaced 15 degrees apart
Sonars can detect from 0.9 to 35 ft
Accuracy is ±0.1 ft
Main sensitivity is within a 30° cone

(Beam pattern: -3 dB sensitivity, i.e. half response, at 15° from the acoustic axis.)
Occupancy Grids: Sonar Model

Probability profile: a Gaussian p.d.f. is used, though this is variable:

p(r, ω | z) = 1/(2π σr σω) * exp[ -(r - z)² / (2σr²) - ω² / (2σω²) ]

where r is the sensor reading, z is the actual distance, and ω is the angle off the acoustic axis.

(Figure: the sonar beam, showing a "somewhere occupied" band around the range measurement R, a "probably empty" region between Rmin and R, and ranging error growing with distance and angle.)
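A minimal Python sketch of this profile follows; the standard deviations σr and σω are illustrative assumptions, not values from the slides:

```python
import math

def sonar_pdf(r, z, w, sigma_r=0.1, sigma_w=math.radians(7.5)):
    """p(r, w | z): likelihood of reading r at angle w off the acoustic
    axis when the true range is z. sigma_r and sigma_w are the range and
    angular standard deviations (illustrative values)."""
    norm = 1.0 / (2.0 * math.pi * sigma_r * sigma_w)
    return norm * math.exp(-(r - z) ** 2 / (2 * sigma_r ** 2)
                           - w ** 2 / (2 * sigma_w ** 2))

print(sonar_pdf(r=2.0, z=2.0, w=0.0))  # peak of the profile
print(sonar_pdf(r=2.5, z=2.0, w=0.0))  # likelihood falls off with range error
```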
Occupancy Grids: Discrete Sonar Model

Example: (table of per-cell probability values sampled from the continuous sonar model, ranging from near 0 in the probably-empty region up to about 0.5 in the somewhere-occupied band.)
Occupancy Grids: Definitions

Notation:

Ci is a cell in the occupancy grid
s(Ci) is the state of cell Ci (i.e., the value of that cell)
OCC means OCCUPIED, whose value is 1
Occupancy Grids: Bayes Rule

Applying Bayes' Rule to a single cell s(Ci) with sensor reading r:

P[s(Ci) = OCC | r] = P[r | s(Ci) = OCC] * P[s(Ci) = OCC] / Σ p[r | s(Ci)] * P[s(Ci)]

where p(r) = Σ p[r | s(Ci)] * P[s(Ci)], summed over the cells that intercept the sensor model.
Then apply this to all the cells, creating a local map for each sensor.
Occupancy Grids: Bayes Rule Implemented

posterior = likelihood * prior / normalizer:

P[s(Ci) = OCC | r] = P[r | s(Ci) = OCC] * P[s(Ci) = OCC] / Σ p[r | s(Ci) = OCC] * P[s(Ci) = OCC]

P[s(Ci) = OCC | r] is the probability that a cell is occupied given a sensor reading r.
P[r | s(Ci) = OCC] is the probability that the sensor reading is r given the state of cell Ci (this value is found by using the sensor model).
P[s(Ci) = OCC] is the probability that the value of cell Ci is 1, i.e. that s(Ci) = OCC (this value is taken from the occupancy grid).
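A minimal Python sketch of this single-reading update for one cell; the per-cell likelihoods and priors here are hypothetical values (in practice the likelihoods come from the sonar model and the priors from the grid):

```python
def p_of_reading(likelihoods, priors):
    """P(r): sum of p[r | s(Ci)] * P[s(Ci)] over the cells the beam intercepts."""
    return sum(l * p for l, p in zip(likelihoods, priors))

def posterior_occ(likelihood_occ, prior_occ, p_r):
    """P[s(Ci) = OCC | r] = P[r | s(Ci) = OCC] * P[s(Ci) = OCC] / P(r)."""
    return likelihood_occ * prior_occ / p_r

likelihoods = [0.1, 0.6, 0.3]   # p[r | s(Ci)] for three cells in the beam
priors      = [0.5, 0.5, 0.5]   # P[s(Ci) = OCC] read from the grid
p_r = p_of_reading(likelihoods, priors)
print(posterior_occ(likelihoods[1], priors[1], p_r))  # updated middle cell
```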
Occupancy Grids: Implementation Example

(Figure: a sonar sector overlaid on the x-y grid.)

Let the red oval be the somewhere-occupied region.
The yellow blocks are the cells in the sonar sector.
The black lines are the boundaries of that sonar sector.
P(r) is the sum over all of the yellow blocks, using the sonar model to compute each cell's contribution.
Occupancy Grids: Multiple Sonars

Combining readings from multiple sonars:

The grid is updated sequentially for t sensors, {r}t = {r1, ..., rt}.
To update for a new sensor reading rt+1:

P[s(Ci) = OCC | {r}t+1] = P[rt+1 | s(Ci) = OCC] * P[s(Ci) = OCC | {r}t] / Σ p[rt+1 | s(Ci)] * P[s(Ci) | {r}t]
Occupancy Grids: Equations

P[s(Ci) = OCC | {r}t+1] = P[rt+1 | s(Ci) = OCC] * P[s(Ci) = OCC | {r}t] / Σ P[rt+1 | s(Ci) = OCC] * P[s(Ci) = OCC | {r}t]

P[s(Ci) = OCC | {r}t+1] is the probability that a cell is occupied given all sensor readings through rt+1.
P[rt+1 | s(Ci) = OCC] is the probability that the sensor reading is rt+1 given the state of cell Ci (this value is found by using the sensor model).
P[s(Ci) = OCC | {r}t] is the probability that the value of cell Ci is 1, i.e. that s(Ci) = OCC (this value is taken from the occupancy grid).
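A minimal Python sketch of the sequential update; for simplicity it normalizes over the two states of a single cell (OCC and its complement) rather than over the beam's cells, and the likelihood values are illustrative:

```python
def sequential_update(posterior_occ, likelihood_occ, likelihood_emp):
    """Update P[s(Ci) = OCC | {r}t+1] from P[s(Ci) = OCC | {r}t],
    normalizing over the cell's two states (a common simplification)."""
    num = likelihood_occ * posterior_occ
    den = num + likelihood_emp * (1.0 - posterior_occ)
    return num / den

p = 0.5  # uninformed prior before any readings
for l_occ, l_emp in [(0.8, 0.2), (0.7, 0.3), (0.6, 0.4)]:
    p = sequential_update(p, l_occ, l_emp)
print(p)  # confidence grows as consistent readings accumulate (~0.93)
```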
Occupancy Grids: Multiple Maps

Matching multiple maps:

Each new map must be integrated with the existing maps from past sensor readings.
The maps are integrated by finding the rotation and translation transform which gives the maps the best correlation in their overlapping areas.
Occupancy Grids: Matching Maps Example

Example 1: a simple translation of maps, with the center of the robot at (2, 2).

(Figure: Map 1, Map 2 after translating, and the new map combining maps 1 and 2; occupied cells from the two input maps merge into higher-probability cells in the combined map.)

Overlapping cells are combined with:

P(cell3) = P(cell1) + P(cell2) - P(cell1) * P(cell2)
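A minimal NumPy sketch of this combination rule, applied cell-by-cell to two already-registered maps (the array values are illustrative):

```python
import numpy as np

def combine(map1, map2):
    """Independent-OR rule from the slide: P3 = P1 + P2 - P1*P2 per cell."""
    return map1 + map2 - map1 * map2

m1 = np.array([[0.0, 0.8], [0.9, 0.0]])
m2 = np.array([[0.5, 0.0], [0.9, 0.0]])
print(combine(m1, m2))
# [[0.5  0.8 ]
#  [0.99 0.  ]]  -- agreement between maps raises the occupancy probability
```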
Occupancy Grids vs. Certainty Grids

Occupancy grids and certainty grids use basically the same method:

Collect readings to generate the probability occupied (for certainty grids, the probability empty)
Create the grid from different sonars
Match maps to register readings from other locations

Differences arise from the fact that occupancy grids use conditional probability to determine the probability occupied, while certainty grids use simpler math models.
Occupancy Grids vs. Certainty Grids

Both have a p.d.f. for the sonar model.
However, the major difference is in finding the probability that a cell is occupied:

First, Pempty is computed for a cell
Then Poccupied is computed using Pocc = 1 - Pemp
Then Pocc is normalized over the sonar beam and combined with that cell's values from the other sonar readings
Vector Field Histograms [Borenstein]

The VFH allows fast, continuous control of a mobile vehicle.
Tested on CARMEL using 24 ultrasonic sensors placed in a ring around the robot.
The scan times range from 100 to 500 ms, depending on the level of safety wanted.
Vector Field Histograms: Notation

The VFH uses a two-dimensional Cartesian histogram grid similar to certainty grids [Elfes].

Definitions:

CVmax = 15
CVmin = 0
d is the distance returned by the sonar
Increment value is +3
Decrement value is -1
VCP is the vehicle center point
Obstacle vector: a vector pointing from a cell to the VCP
Vector Field Histograms: Histogram Grid

The histogram grid is incremented differently from the certainty grid.
The only cell incremented in the grid is the cell which lies at distance d on the acoustic axis of the sonar.
Similarly, only the cells that lie on the acoustic axis at less than distance d are decremented.
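A minimal Python sketch of this update rule, assuming a grid indexed in cell units and one sample per cell along the acoustic axis (the pose and grid size are illustrative):

```python
import math

CV_MAX, CV_MIN, INC, DEC = 15, 0, 3, 1  # constants from the VFH definitions

def update_along_axis(grid, x0, y0, heading, d_cells):
    """Increment only the cell at range d on the acoustic axis (+3, capped
    at CV_MAX); decrement the axis cells closer than d (-1, floored at CV_MIN)."""
    for step in range(1, int(d_cells) + 1):
        x = int(round(x0 + step * math.cos(heading)))
        y = int(round(y0 + step * math.sin(heading)))
        if step < int(d_cells):
            grid[y][x] = max(CV_MIN, grid[y][x] - DEC)  # presumably empty
        else:
            grid[y][x] = min(CV_MAX, grid[y][x] + INC)  # presumably occupied

grid = [[0] * 20 for _ in range(20)]
update_along_axis(grid, x0=10, y0=10, heading=0.0, d_cells=6)
```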
Vector Field Histograms: Polar Histogram

(Figure: the active grid C* around the robot, with certainty values in its cells and the target sector k_targ marked.)

Next, the 2-D histogram grid is converted into a 1-D grid called the polar histogram.
The polar histogram, H, has n angular sectors with width α.
Vector Field Histograms: H Mapping

In order to generate H, we must map every cell in the histogram grid into H:

β_ij = arctan[ (y_j - y_0) / (x_i - x_0) ]

m_ij = (c*_ij)² (a - b d_ij)

where
x_0, y_0 = the robot's position
x_i, y_j = the (i,j)th active cell's position
β_ij = the direction from (x_i, y_j) to the VCP
a, b = positive constants
d_ij = the distance between (x_i, y_j) and the VCP
m_ij = the magnitude of the obstacle vector at (x_i, y_j)
c*_ij = the certainty value of (x_i, y_j)
Vector Field Histograms: Discrete H Grid

Now that the obstacle vectors for every cell have been computed, we have to find the magnitude of each sector in H:

k = int(β_ij / α)

h_k = Σ_ij m_ij

where
α = the angular resolution of H
k = the sector of (x_i, y_j)
h_k = the polar obstacle density
Vector Field Histograms: Threshold

Once the polar obstacle densities have been computed, H can be thresholded to determine where the objects are located so that they can be avoided.
The choice of this threshold is important: too high a threshold and you may come too close to an object; too low and you may lose some valid paths.
Sensor Fusion [D. Conner]

David C. Conner, PhD student.
Presentation on his thesis and the following paper:
"Multiple camera, laser rangefinder, and encoder data fusion for navigation of a differentially steered 3-wheeled autonomous vehicle"
Sensor Fusion: Navigator

Navigator: a 3-wheeled, differentially driven vehicle.

The 2 front wheels are driven; the third, rear wheel is a caster.
2 separate computer systems: a PC handles sensor fusion and a PLC handles motor control.
180-degree laser rangefinder with 0.5° resolution.
2 color CCD cameras.
Cameras and Frame Grabbers

Because the camera is not parallel to the ground, the image must be transformed to correctly represent the ground.
The correction is done using the Intel Image Processing Library (IPL).
Cameras and Frame Grabbers

Since there are two cameras, the two images must be combined.
The images are transformed into the vehicle coordinates and combined using the IPL functions.
Image Conversion

Once a picture is captured:

It is converted to grayscale
It is blurred using a Gaussian convolution mask
Image Conversion, continued

Then the image is thresholded to limit the amount of data in the image.
The threshold value is chosen to be above the norm of the intensities from the grayscale histogram.
The resulting image is then pixelated for storage in a grid.
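A sketch of the whole conversion pipeline using OpenCV (an assumption; the original used the Intel IPL). The kernel size, threshold offset, and grid resolution are illustrative, and the input here is a synthetic stand-in frame:

```python
import cv2
import numpy as np

img = np.random.randint(0, 255, (480, 640, 3), dtype=np.uint8)  # stand-in capture
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)            # 1. convert to grayscale
blurred = cv2.GaussianBlur(gray, (5, 5), 0)             # 2. Gaussian convolution mask
t = float(blurred.mean() + blurred.std())               # above the intensity norm
_, binary = cv2.threshold(blurred, t, 255, cv2.THRESH_BINARY)       # 3. threshold
grid = cv2.resize(binary, (64, 64), interpolation=cv2.INTER_AREA)   # 4. pixelate into a grid
```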
Laser Range Finder

The SICK LMS-200 laser rangefinder returns 361 data points for a 180-degree arc with 0.5-degree resolution.
Values above a certain range are ignored.
Vector Field Histograms

VFHs are nice because they allow us to easily combine our camera data and our laser rangefinder data to determine the most accessible regions.
Several types of polar obstacle density (POD) functions can be used (linear, quadratic, exponential):

POD = KC * (a - b*d)
POD Values: Laser Rangefinder

The POD values for the laser are determined by:

Using the linear function shown above to transform the laser data into POD values
Then, for every two degrees, taking the max of the POD values in that arc as the final value
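A minimal NumPy sketch of this two-step computation; the constants KC, a, and b are illustrative:

```python
import numpy as np

K_C, A, B = 1.0, 10.0, 1.0  # illustrative constants for POD = KC*(a - b*d)

def laser_pod(ranges):
    """Map 361 range readings (0.5-degree resolution over 180 degrees)
    to 90 sector PODs, taking the max over each 2-degree arc."""
    pod = K_C * (A - B * np.asarray(ranges))   # linear POD function
    pod = np.clip(pod, 0.0, None)              # far returns contribute nothing
    # 361 points -> drop the last to get 90 arcs of 4 readings each
    return pod[:360].reshape(90, 4).max(axis=1)

sectors = laser_pod(np.linspace(0.5, 8.0, 361))
```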
POD Values: Images

POD values for the image are pre-calculated and stored in a grid at startup.
The pre-calculated values are multiplied by the pixelated image, also stored in a grid (the overlapping cells are multiplied).
For every 2-degree arc, the cell with the highest POD value is chosen as the value of that arc.
Combining VFH

The two VFHs are then combined by taking the max POD for each sector.
The max POD is chosen because it represents the closest object.
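A one-line NumPy sketch of the fusion step, with illustrative sector values:

```python
import numpy as np

# Elementwise max of the laser and camera polar histograms, so each
# sector reflects the highest-POD (closest) obstacle seen by either sensor.
laser_pod  = np.array([0.2, 0.9, 0.1, 0.0])
camera_pod = np.array([0.5, 0.3, 0.4, 0.0])
combined = np.maximum(laser_pod, camera_pod)   # [0.5, 0.9, 0.4, 0.0]
```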
Bibliography

Elfes, A. "Occupancy Grids: A Stochastic Spatial Representation for Active Robot Perception." July 1990.
Elfes, A. "Sonar-Based Real-World Mapping and Navigation." June 1987.
Borenstein, J. "The Vector Field Histogram: Fast Obstacle Avoidance for Mobile Robots." June 1991.
Conner, D. "Multiple camera, laser rangefinder, and encoder data fusion for navigation of a differentially steered 3-wheeled autonomous vehicle."