SLAM Using Single Laser Range Finder


SLAM USING SINGLE LASER RANGE FINDER
AliAkbar Aghamohammadi, Amir H. Tamjidi, Hamid D. Taghirad
Advanced Robotics and Automation Systems Lab (ARAS), Electrical and Computer Engineering Department, K. N. Toosi University of Technology, Iran
OUTLINE
1-Motivation & Contributions
2-Probabilistic Framework
3-Feature Extraction
4-Error Modeling For Individual Features
5-Motion Prediction
6-Data Association
7-Adding new features
8-Filtering (IEKF)
9-Results
10-Conclusion
11-References
[Figure: LSLAM overview, combining a laser scan with extracted features (axes in cm); the geometry of a selected edge feature P_i versus the real feature f_k in the environment (ranges r_i, r_{i+1}, angles β, α_{i+1}, offsets d, q, μ); and the prediction box, in which features extracted from the k'th and (k+1)'th scans, together with their covariances, are matched, an optimization process yields x_{k+1}, and the implicit function theorem propagates cov(x_k) to cov(x_{k+1}).]
MOTIVATION
Traditional encoder-based dynamic models are sensitive to:
a) slippage
b) surface type changes
c) imprecision in the parameters of the robot's hardware.
LSLAM is a significant step toward encoder-free SLAM, and it is robust with respect to slippage and the problems associated with encoder-based motion models.
MAIN CONTRIBUTIONS
The key contributions of LSLAM include:
1. A robust feature extraction method
2. Accurate error modeling for individual extracted features
3. Uncertainty estimation in feature-based range scan matching
4. A real-time, drift-free solution for SLAM in restricted structured environments using a single laser range finder as the only data source
PROBABILISTIC FRAMEWORK
The state vector of the system comprises the robot pose and the spatial features, represented in world coordinates:
$$x_k = \begin{bmatrix} x_{r_k} \\ x_{f_k} \end{bmatrix}, \qquad x_{f_k} = \begin{bmatrix} f_1 & f_2 & \cdots & f_n \end{bmatrix}^T$$
where $x_{r_k}$ is the robot pose and $x_{f_k}$ collects the map features.
At system start-up, the feature-based map is initialized; this map is updated dynamically by the Extended Kalman Filter until operation ends. The probabilistic state estimates of the robot and the features are updated during robot motion and feature observation. When new features are observed, the map is enlarged with new states:
$$\hat{x} = \begin{bmatrix} \hat{x}_r \\ \hat{f}_1 \\ \hat{f}_2 \\ \vdots \end{bmatrix}, \qquad P = \begin{bmatrix} P_{x_r x_r} & P_{x_r f_1} & P_{x_r f_2} & \cdots \\ P_{f_1 x_r} & P_{f_1 f_1} & P_{f_1 f_2} & \cdots \\ P_{f_2 x_r} & P_{f_2 f_1} & P_{f_2 f_2} & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{bmatrix}$$
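To make the bookkeeping concrete, here is a minimal Python/NumPy sketch of such a joint state vector and covariance matrix, including the augmentation step used when a new feature is added. It is an illustrative container only, not the authors' implementation; the class and method names are assumptions.

```python
import numpy as np

class EKFSlamState:
    """Joint EKF-SLAM state: robot pose (x, y, theta) followed by 2-D point features."""

    def __init__(self, pose, pose_cov):
        self.x = np.asarray(pose, dtype=float)       # [x_r; f_1; f_2; ...]
        self.P = np.asarray(pose_cov, dtype=float)   # full joint covariance

    def num_features(self):
        # assumes a 3-D pose followed by 2-D features
        return (self.x.size - 3) // 2

    def augment(self, feature, feature_cov, cross_cov=None):
        """Append a new feature (world frame) and grow the covariance matrix."""
        n = self.x.size
        self.x = np.concatenate([self.x, feature])
        P_new = np.zeros((n + 2, n + 2))
        P_new[:n, :n] = self.P
        P_new[n:, n:] = feature_cov
        if cross_cov is not None:                    # correlation with existing states
            P_new[n:, :n] = cross_cov
            P_new[:n, n:] = cross_cov.T
        self.P = P_new

# usage: start with the robot pose only, then add features as they are observed
state = EKFSlamState(pose=[0.0, 0.0, 0.0], pose_cov=np.eye(3) * 1e-6)
state.augment(feature=np.array([1.2, 3.4]), feature_cov=np.eye(2) * 0.01)
```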
OUTLINE
1-Motivation & Contributions
2-Probabilistic Framework
3-Feature Extraction
4-Reliability Measure Calculation
5-Motion Prediction
6-Data Association
7-Adding new features
8-Filtering (IEKF)
9-Results
10-Conclusion
11-References
FEATURE EXTRACTION
$$x_k = \begin{bmatrix} x_{r_k} \\ x_{f_k} \end{bmatrix}, \qquad \hat{x}_f = \begin{bmatrix} \hat{f}_1 \\ \hat{f}_2 \\ \vdots \end{bmatrix}$$
Point features, line features, more informative features.
Features have to be:
1. invariant with respect to displacement
2. robust with respect to data association
FEATURE EXTRACTION
Steps
Scan data → Segmentation → Detection of high-curvature points → Discarding variant features → Features (see the sketch below)
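The sketch below illustrates the first two steps of this pipeline (segmentation at jump edges and detection of high-curvature points) in Python/NumPy. The thresholds and helper names are assumptions for illustration; the paper's exact detector is not reproduced here.

```python
import numpy as np

def scan_to_points(ranges, angles):
    """Convert polar laser readings to Cartesian points (sensor at the origin)."""
    return np.column_stack([ranges * np.cos(angles), ranges * np.sin(angles)])

def segment_scan(points, jump_thresh=0.30):
    """Split the scan into segments at range discontinuities (jump edges)."""
    gaps = np.linalg.norm(np.diff(points, axis=0), axis=1)
    breaks = np.where(gaps > jump_thresh)[0] + 1
    return np.split(points, breaks)

def high_curvature_points(segment, k=3, angle_thresh=np.deg2rad(30)):
    """Flag points where the direction of the contour turns sharply."""
    idx = []
    for i in range(k, len(segment) - k):
        v1 = segment[i] - segment[i - k]
        v2 = segment[i + k] - segment[i]
        c = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-12)
        if np.arccos(np.clip(c, -1.0, 1.0)) > angle_thresh:
            idx.append(i)
    return idx
```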
OMITTING VARIANT FEATURES
There exist two kinds of variant features:
1) those that appear due to occlusion
2) those that appear due to a low incidence angle
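As a hedged sketch of how such variant features could be filtered, the code below rejects candidates that sit next to a jump edge (occlusion) or are observed at a grazing incidence angle. The specific tests and thresholds are assumptions, not the paper's exact criteria.

```python
import numpy as np

def is_near_jump_edge(ranges, i, jump_thresh=0.30):
    """Occlusion candidate: a large range discontinuity next to reading i."""
    left = abs(ranges[i] - ranges[i - 1]) if i > 0 else 0.0
    right = abs(ranges[i + 1] - ranges[i]) if i + 1 < len(ranges) else 0.0
    return max(left, right) > jump_thresh

def incidence_angle(points, i, k=2):
    """Angle between the laser beam and the local surface tangent at point i (sensor at origin)."""
    tangent = points[min(i + k, len(points) - 1)] - points[max(i - k, 0)]
    beam = points[i]
    cross = tangent[0] * beam[1] - tangent[1] * beam[0]      # 2-D cross product
    sin_a = abs(cross) / (np.linalg.norm(tangent) * np.linalg.norm(beam) + 1e-12)
    return np.arcsin(np.clip(sin_a, 0.0, 1.0))

def keep_feature(ranges, points, i, min_incidence=np.deg2rad(15)):
    """Keep only invariant features: not occluded and not seen at a low incidence angle."""
    return (not is_near_jump_edge(ranges, i)) and incidence_angle(points, i) >= min_incidence
```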
FEATURE EXTRACTION RESULTS
[Figure: feature extraction results on a real scan (axes in cm). Extracted features are labeled as jump edge, occlusion, high curvature, and low incidence angle; variant features are distinguished from invariant features.]
OUTLINE
1-Motivation & Contributions
2-Probabilistic Framework
3-Feature Extraction
4-Reliability Measure Calculation
5-Motion Prediction
6-Data Association
7-Adding new features
8-Filtering (IEKF)
9-Results
10-Conclusion
11-References
RELIABILITY MEASURE CALCULATION FOR INDIVIDUAL FEATURES
Feature uncertainty consists of observation noise and uncertainty due to quantization (Fig. 5):
$$f_i = p_i + e_{ob,i} + e_{q,i}$$
[Figure 5: geometry of a selected edge feature P_i and the real feature f_k in the environment, with ranges r_i, r_{i+1}, angles β, α_{i+1}, and the error components e_r and e_θ at p_i.]
MEASUREMENT NOISE
$$e_{r_i} = a\,r_i + b$$
$$p_i^{\,n} = (r_i + e_{r_i})\begin{bmatrix} \cos(\theta_i + e_\theta) \\ \sin(\theta_i + e_\theta) \end{bmatrix}, \qquad p_i = r_i\begin{bmatrix} \cos\theta_i \\ \sin\theta_i \end{bmatrix}$$
$$e_{ob} = p_i^{\,n} - p_i \approx r_i\,e_\theta\begin{bmatrix} -\sin\theta_i \\ \cos\theta_i \end{bmatrix} + (a\,r_i + b)\begin{bmatrix} \cos\theta_i \\ \sin\theta_i \end{bmatrix}$$
QUANTIZATION ERROR
Quantization causes the point p_i, which is selected as the feature point, to not necessarily coincide with the real physical feature in the environment:
$$e_{q_i} = \mu\,t$$
where $t$ is the unit direction along the surface between consecutive scan points and $\mu$ is the (unknown) offset magnitude.
[Figure: quantization geometry, showing the selected edge feature P_i, the real feature f_k in the environment, consecutive ranges r_{i-1}, r_i, r_{i+1}, angles β, α_{i+1}, the offset μ, and the inter-point distance d.]
FEATURE COVARIANCE
The measurement and quantization errors are independent of each other:
$$\mathrm{Cov}(f_k) = E\big[(f_k - \hat{f}_k)(f_k - \hat{f}_k)^T\big] = \mathrm{Cov}(e_{ob_i}) + \mathrm{Cov}(e_{q_i})$$
$$\mathrm{Cov}(e_{q_i}) = \frac{q_{d_i}^2}{3}\begin{bmatrix} \cos^2\gamma_i & \cos\gamma_i\sin\gamma_i \\ \cos\gamma_i\sin\gamma_i & \sin^2\gamma_i \end{bmatrix}$$
where $\gamma_i$ is the direction angle of $t$ and $q_{d_i}$ bounds the quantization offset, and
$$\mathrm{Cov}(e_{ob_i}) = r_i^2\,\sigma_\theta^2\begin{bmatrix} \sin^2\theta_i & -\sin\theta_i\cos\theta_i \\ -\sin\theta_i\cos\theta_i & \cos^2\theta_i \end{bmatrix} + (\sigma_a^2 r_i^2 + \sigma_b^2)\begin{bmatrix} \cos^2\theta_i & \sin\theta_i\cos\theta_i \\ \sin\theta_i\cos\theta_i & \sin^2\theta_i \end{bmatrix}$$
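The following Python/NumPy sketch evaluates these two covariance terms numerically and sums them into the feature covariance. The noise parameters (sigma_theta, sigma_a, sigma_b), the half-width q_d, and the tangent direction gamma follow the reconstruction above and are assumptions where the slide is ambiguous.

```python
import numpy as np

def outer_dir(angle):
    """Rank-1 outer product of the unit vector [cos(angle), sin(angle)]."""
    v = np.array([np.cos(angle), np.sin(angle)])
    return np.outer(v, v)

def cov_observation(r_i, theta_i, sigma_theta, sigma_a, sigma_b):
    """Measurement-noise covariance: angular term plus range-dependent radial term."""
    perp = np.array([-np.sin(theta_i), np.cos(theta_i)])   # perpendicular to the beam
    ang_term = (r_i * sigma_theta) ** 2 * np.outer(perp, perp)
    rng_term = (sigma_a ** 2 * r_i ** 2 + sigma_b ** 2) * outer_dir(theta_i)
    return ang_term + rng_term

def cov_quantization(q_d, gamma):
    """Quantization covariance: error of half-width q_d, uniform along surface direction gamma."""
    return (q_d ** 2 / 3.0) * outer_dir(gamma)

def feature_covariance(r_i, theta_i, gamma, q_d, sigma_theta, sigma_a, sigma_b):
    """Cov(f) = Cov(e_ob) + Cov(e_q) for a single extracted feature."""
    return (cov_observation(r_i, theta_i, sigma_theta, sigma_a, sigma_b)
            + cov_quantization(q_d, gamma))

# illustrative parameter values only (not taken from the paper)
C = feature_covariance(r_i=2.0, theta_i=np.deg2rad(30), gamma=np.deg2rad(120),
                       q_d=0.01, sigma_theta=np.deg2rad(0.25), sigma_a=0.01, sigma_b=0.005)
```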
[Figure: extracted features plotted over the scan (axes in cm), with a zoomed detail panel; labels OUE and WUE mark the two compared estimates.]
OUTLINE
1-Motivation & Contributions
2-Probabilistic Framework
3-Feature Extraction
4-Reliability Measure Calculation
5-Motion Prediction
6-Data Association
7-Adding new features
8-Filtering (IEKF)
9-Results
10-Conclusion
11-References
MOTION PREDICTION
[Diagram: traditional prediction box. Inputs: x_k, cov(x_k), the control u_k, and noise w_k with cov(w_k); outputs: x_{k+1} and cov(x_{k+1}).]
$$x_{k+1} = f(x_k, u_k, w_k)$$
$$\mathrm{cov}(x_{k+1}) = F_x\,\mathrm{cov}(x_k)\,F_x^T + F_w\,\mathrm{cov}(w_k)\,F_w^T$$
Traditional models, based on encoders' data, suffer from several problems in motion modeling, such as wheel slippage, unequal wheel diameters, unequal encoder scale factors, inaccuracy about the effective size of the wheelbase, surface irregularities, and other predominantly environmental effects.
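For comparison, here is a minimal sketch of that traditional encoder-based prediction: a generic unicycle model with Jacobians F_x and F_w propagating the covariance as above. The motion model and noise structure are generic assumptions, not part of LSLAM.

```python
import numpy as np

def predict_encoder(x, P, u, Q, dt):
    """Traditional EKF prediction with an odometry input u = (v, w); x = [x, y, theta]."""
    v, w = u
    th = x[2]
    x_new = x + np.array([v * np.cos(th) * dt, v * np.sin(th) * dt, w * dt])
    Fx = np.array([[1, 0, -v * np.sin(th) * dt],    # Jacobian w.r.t. the state
                   [0, 1,  v * np.cos(th) * dt],
                   [0, 0, 1]])
    Fw = np.array([[np.cos(th) * dt, 0],            # Jacobian w.r.t. the noise/input
                   [np.sin(th) * dt, 0],
                   [0, dt]])
    P_new = Fx @ P @ Fx.T + Fw @ Q @ Fw.T
    return x_new, P_new
```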
MOTION PREDICTION
[Diagram: LSLAM prediction box. Features extracted from the k'th scan and the (k+1)'th scan, together with the covariances associated with the features, are matched; an optimization process yields x_{k+1}, and the implicit function theorem propagates cov(x_k) to cov(x_{k+1}).]
We use a prediction model that does not rely merely on the robot itself but also uses environmental information. Thus the method is robust with respect to wheel slippage, surface changes, other unsystematic effects, and inaccurate information about the robot's hardware.
MOTION PREDICTION
Matching: corresponding features of consecutive scans are connected (see figure).
[Figure: two consecutive scans with matched features joined by connecting lines and the robot position marked (axes in cm).]
Pose shift calculation (cost function based on weighted feature-based range scan matching):
$$E = \sum_{j=1}^{m}\big(\hat{f}_j^{\,new} - (R\hat{f}_j^{\,pre} + T)\big)^T\,C_j^{-1}\,\big(\hat{f}_j^{\,new} - (R\hat{f}_j^{\,pre} + T)\big)$$
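The sketch below writes out this weighted cost and minimizes it over the pose shift X = (t_x, t_y, φ) with a generic optimizer (scipy.optimize.minimize), as a stand-in for the paper's optimization process. Variable names and the choice of optimizer are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def matching_cost(X, F_pre, F_new, C_list):
    """E = sum_j (f_j_new - (R f_j_pre + T))^T C_j^{-1} (f_j_new - (R f_j_pre + T))."""
    tx, ty, phi = X
    R = np.array([[np.cos(phi), -np.sin(phi)],
                  [np.sin(phi),  np.cos(phi)]])
    T = np.array([tx, ty])
    E = 0.0
    for f_pre, f_new, C in zip(F_pre, F_new, C_list):
        e = f_new - (R @ f_pre + T)
        E += e @ np.linalg.solve(C, e)        # e^T C^{-1} e
    return E

def estimate_pose_shift(F_pre, F_new, C_list, X0=np.zeros(3)):
    """Minimize the weighted scan-matching cost; returns the pose shift (tx, ty, phi)."""
    res = minimize(matching_cost, X0, args=(F_pre, F_new, C_list), method="BFGS")
    return res.x
```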
MOTION PREDICTION – Uncertainty Calculation
If there were an explicit relationship between the features and the pose shift, $X = g(F)$, then
$$X^* = g(\hat{F}) + \frac{\partial g}{\partial F}\Big|_{\hat{F}}(F - \hat{F}) + O\big((F - \hat{F})^2\big)$$
and
$$\mathrm{cov}(X^*) = J\,\mathrm{cov}(F)\,J^T.$$
But there is no such explicit relationship. Indeed, since $T^*$ and $R^*$ have to minimize the cost function $E$, we have an implicit relationship derived from
$$\frac{\partial E}{\partial X}(X, F)\Big|_{X = X^*} = 0,$$
where $X$ contains the parameters of $T$ and $R$. Thus there is an implicit relationship between the features and the pose shift.
MOTION PREDICTION – Uncertainty Calculation
$$J = \frac{\partial X^*}{\partial F}$$
The implicit function theorem provides the desired Jacobian, with $\Psi = \partial E/\partial X$, via
$$J = -\left(\frac{\partial \Psi}{\partial X}\right)^{-1}\left(\frac{\partial \Psi}{\partial F}\right)\Bigg|_{X = X^*} = -\left(\frac{\partial^2 E}{\partial X^2}\right)^{-1}\left(\frac{\partial^2 E}{\partial F\,\partial X}\right)\Bigg|_{X = X^*}$$
which is a complicated but tractable matter of differentiation. Then
$$\mathrm{cov}(X)\big|_{X = X^*} = J\,\mathrm{cov}(F)\,J^T.$$
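A sketch of this covariance propagation using finite-difference approximations of the two Hessian blocks around the optimum; in practice an analytic derivation (as suggested above) is preferable. Here E(X, F) is assumed to take the pose-shift parameters X and the stacked feature coordinates F as flat arrays; step sizes and names are illustrative.

```python
import numpy as np

def hessians_at_optimum(E, X_star, F_hat, eps=1e-5):
    """Finite-difference d2E/dX2 and d2E/dFdX at (X_star, F_hat) for a scalar E(X, F)."""
    nX, nF = len(X_star), len(F_hat)

    def grad_X(X, F):
        g = np.zeros(nX)
        for i in range(nX):
            dX = np.zeros(nX); dX[i] = eps
            g[i] = (E(X + dX, F) - E(X - dX, F)) / (2 * eps)
        return g

    H_XX = np.zeros((nX, nX))
    for i in range(nX):
        dX = np.zeros(nX); dX[i] = eps
        H_XX[:, i] = (grad_X(X_star + dX, F_hat) - grad_X(X_star - dX, F_hat)) / (2 * eps)

    H_FX = np.zeros((nX, nF))
    for j in range(nF):
        dF = np.zeros(nF); dF[j] = eps
        H_FX[:, j] = (grad_X(X_star, F_hat + dF) - grad_X(X_star, F_hat - dF)) / (2 * eps)
    return H_XX, H_FX

def pose_shift_covariance(E, X_star, F_hat, cov_F):
    """cov(X*) = J cov(F) J^T with J = -(d2E/dX2)^{-1} (d2E/dFdX)."""
    H_XX, H_FX = hessians_at_optimum(E, X_star, F_hat)
    J = -np.linalg.solve(H_XX, H_FX)
    return J @ cov_F @ J.T
```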
OUTLINE
1-Motivation & Contributions
2-Probabilistic Framework
3-Feature Extraction
4-Reliability Measure Calculation
5-Motion Prediction
6-Data Association
7-Adding new features
8-Filtering (IEKF)
9-Results
10-Conclusion
11-References
DATA ASSOCIATION
Batch data association methods greatly reduce the ambiguity of the data association process; therefore, the JCBB method is adopted here for data association.
After the data association process, the features extracted from the new scan fall into two categories: new features, which are not matched with any existing feature in the map, and existing features (the matched ones).
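JCBB performs a joint branch-and-bound compatibility search; as a simplified stand-in, the sketch below shows only the individual compatibility (Mahalanobis) gate that such a search builds on. The function name, the chi-square confidence level, and the example numbers are assumptions.

```python
import numpy as np
from scipy.stats import chi2

def individual_compatibility(innovation, S, dof=2, confidence=0.95):
    """Mahalanobis gate: is the observed feature compatible with a predicted map feature?"""
    d2 = innovation @ np.linalg.solve(S, innovation)
    return d2 < chi2.ppf(confidence, dof)

# usage: innovation = observed feature position minus its prediction h(x),
# S = H P H^T + R is the innovation covariance from the filter
z_diff = np.array([0.03, -0.02])
S = np.diag([0.01, 0.01])
print(individual_compatibility(z_diff, S))
```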
FILTERING AND ADDING NEW FEATURES
Existing (matched) features are used to update the system state vector:
1. Calculate the Kalman gain:
$$K_{k+1,i} = \mathrm{cov}(x_{k+1}^{(-)})\,H_x^T\big[H_x\,\mathrm{cov}(x_{k+1}^{(-)})\,H_x^T + \mathrm{cov}(F_{k+1})\big]^{-1}$$
2. Calculate the state vector update:
$$\hat{x}_{k+1,i+1}^{(-)} = \hat{x}_{k+1}^{(-)} + K_{k+1,i}\big[F_{k+1} - \big(h(\hat{x}_{k+1,i}^{(-)}) + H_x(\hat{x}_{k+1}^{(-)} - \hat{x}_{k+1,i}^{(-)})\big)\big]$$
3. Calculate the covariance update:
$$\mathrm{cov}(x_{k+1,i+1}) = \mathrm{cov}(x_{k+1}^{(-)}) - K_{k+1,i}\,H_x\,\mathrm{cov}(x_{k+1}^{(-)})$$
Each newly seen feature is first transformed to the map reference coordinates, and the transformed feature is then augmented with the system state vector:
$$\hat{x}_{augmented} = \begin{bmatrix} \hat{x}_{k+1}^{(-)} \\ \hat{f}^{\,w}_{newly\ seen} \end{bmatrix}$$
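The following sketch mirrors the three update steps above for one IEKF iteration, with h and H_of standing in for the (here unspecified) measurement model and its Jacobian. All names are assumptions; it is a schematic of the update, not the authors' code.

```python
import numpy as np

def iekf_iteration(x_prior, P_prior, x_iter, z, h, H_of, R):
    """One iterated-EKF step: gain, relinearized state update, covariance update."""
    H = H_of(x_iter)                                  # Jacobian of h at the current iterate
    S = H @ P_prior @ H.T + R                         # innovation covariance
    K = P_prior @ H.T @ np.linalg.inv(S)              # 1) Kalman gain
    x_next = x_prior + K @ (z - (h(x_iter) + H @ (x_prior - x_iter)))   # 2) state update
    P_next = P_prior - K @ H @ P_prior                # 3) covariance update
    return x_next, P_next

def iekf_update(x_prior, P_prior, z, h, H_of, R, n_iter=3):
    """Iterate the relinearization a few times, starting from the prior estimate."""
    x_iter, P = x_prior.copy(), P_prior
    for _ in range(n_iter):
        x_iter, P = iekf_iteration(x_prior, P_prior, x_iter, z, h, H_of, R)
    return x_iter, P
```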
OUTLINE
1-Motivation & Contributions
2-Probabilistic Framework
3-Feature Extraction
4-Reliability Measure Calculation
5-Motion Prediction
6-Data Association
7-Adding new features
8-Filtering (IEKF)
9-Results
10-Conclusion
11-References
RESULTS
Melon: a tracked mobile robot equipped with two low-range Hokuyo URG_X002 laser range scanners (high slippage), operating in a structured environment.
PURE LOCALIZATION
ICP method
ICP is a popular point-wise method. It is powerful, but it needs prior information about the displacement.
RESULTS (PURE LOCALIZATION)
HAYAI method
The HAYAI method produces impressive results in terms of processing speed, but it suffers from some disadvantages.
PURE LOCALIZATION
Proposed motion model
LSLAM
Simulation Results
The environment consists of many features, ground truth is available, and loop-closing effects can be investigated over a large loop.
[Figure: simulated environment and estimated robot path; the start of the path and its end after one lap are marked.]
LSLAM - SIMULATION
[Figure: estimated errors (blue curves) and estimated variances (red curves) in x, y, and θ (robot heading) versus step number. Errors are considerably reduced at loop closures, and uncertainties remain bounded.]
LSLAM (REAL SCAN DATA)
[Figure: results on real scan data. The feature-based map resulting from LSLAM is shown (axes in cm) together with error plots versus step number. With LSLAM, error and uncertainty remain bounded: below 2 cm in x, below 1.5 cm in y, and below 1 deg in θ. With pure localization the errors grow unboundedly, exceeding 20 cm in x and y and 10 deg in θ.]
CONCLUSION
Introducing a motion model that is robust with respect to robot slippage and inaccuracy in hardware-related measures
Calculating a reliability measure for the robot's displacement derived through feature-based laser scan matching
Extracting features at different scales
Constructing an IEKF framework merely based on laser range finder information
REFERENCES
[1] Lu, F. and Milios, E., "Robot pose estimation in unknown environments by matching 2D range scans," Journal of Intelligent and Robotic Systems, Vol. 18, pp. 249-275, 1997.
[2] Minguez, J., Lamiraux, F. and Montesano, L., "Metric-based scan matching algorithms for mobile robot displacement estimation," Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Barcelona, Spain, 2005.
[3] Jensen, B. and Siegwart, R., "Scan alignment with probabilistic distance metric," Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, 2004.
[4] Pfister, S., et al., "Weighted range sensor matching algorithms for mobile robot displacement estimation," Proceedings of the IEEE International Conference on Robotics and Automation (ICRA'02), pp. 1667-1674, 2002.
[5] Aghamohammadi, A.A., et al., "Feature-based laser scan matching for accurate and high speed mobile robot localization," European Conference on Mobile Robots (ECMR'07), 2007.
[6] Lingemann, K., et al., "High-speed laser localization for mobile robots," Journal of Robotics and Autonomous Systems, Vol. 51, No. 4, pp. 275-296, 2005.
[7] Madhavan, R. and Durrant-Whyte, H. F., "Natural landmark-based autonomous vehicle navigation," Robotics and Autonomous Systems, Vol. 46, pp. 79-95, 2004.
[8] Santiso, E., et al., "Mobile robot positioning with natural landmark," Proceedings of the 11th IEEE International Conference on Advanced Robotics (ICAR'03), Coimbra, Portugal, pp. 47-52, 2003.
[9] Nieto, J., Bailey, T. and Nebot, E., "Recursive scan-matching SLAM," Journal of Robotics and Autonomous Systems, Vol. 55, No. 1, pp. 39-49, January 2007.
[10] Nieto, J., "Detailed environment representation for the SLAM problem," Ph.D. Thesis, University of Sydney, Australian Centre for Field Robotics, 2005.
[11] Lu, F. and Milios, E., "Globally consistent range scan alignment for environment mapping," Autonomous Robots, Vol. 4, pp. 333-349, 1997.
[12] Coleman, T.F. and Li, Y., "An interior trust region approach for nonlinear minimization subject to bounds," SIAM Journal on Optimization, Vol. 6, pp. 418-445, 1996.
[13] Neira, J. and Tardos, J.D., "Data association in stochastic mapping using the joint compatibility test," IEEE Transactions on Robotics and Automation, Vol. 17, No. 6, pp. 890-897, 2001.
[14] Gelb, A., Applied Optimal Estimation, M.I.T. Press, 1984.
[15] Thrun, S., Burgard, W. and Fox, D., "A real-time algorithm for mobile robot mapping with applications to multi-robot and 3D mapping," International Conference on Robotics and Automation, pp. 321-328, 2000.