Transcript Slide 1

Vision Aided Inertial Navigation
using Single Video Camera Measurements
Vasko Sazdovski
Faculty of Electrical Engineering, “Goce Delcev” University, Shtip, R.Macedonia
Pilot Training Centre, ELMAK Elbit Systems of Macedonia, Airport Petrovec, Skopje, R.Macedonia
ROBOTILNICA Intelligent Systems and Robotics, Skopje, R.Macedonia
OUTLINE OF THE PRESENTATION
• INTRODUCTION
• THEORY OF SIMULTANEOUS LOCALIZATION AND MAPPING (SLAM). INERTIAL NAVIGATION
AND AUGMENTED STATE VECTOR.
• VISION SENSORS
• INERTIAL NAVIGATION AIDED BY SIMULTANEOUS LOCALIZATION AND MAPPING. SLAM AS
SENSOR FUSION ALGORITHM
• INERTIAL NAVIGATION AIDED BY BEARING ONLY SIMULTANEOUS LOCALIZATION AND
MAPPING
• CONCLUSIONS
• FUTURE WORK POSSIBILITIES
INTRODUCTION
Future UAVs will need high levels of autonomy and independence. They are going to become small, very agile and highly maneuverable.
MeCam quadrotor:
• 14 sensors including side object detectors that enable "perfect and safe hovering";
• Voice control;
• Autonomous person following;
• Two Morpho autopilots and a video stabilization system;
• Autonomous panoramas;
• Real-time streaming to mobile devices;
• $50 MSRP, potential availability "by the beginning of 2014".
Prof. Raffaello D'Andrea from ETH Zurich at ZURICH.MINDS 2012 showing the early quadrotor designs within his research group…
INTRODUCTION
Integrated navigation system design requires selection of:
• a set of sensors and
• computation power
that provide reliable and accurate navigation parameters (position, velocity and attitude) with high update rates and bandwidth in a small and cost-effective manner.
Navigator SL, Cloud Cap Technology USA: modern INS/GPS navigation system for UAVs. Size: 131 x 57 x 19 mm. Weight: 80 grams.
Piccolo autopilot used on a UAV for collecting meteorological or map data over long distances. Cloud Cap Technology USA.
INTRODUCTION
Integrated navigation system design requires selection of:
• a set of sensors and
• computation power
that provide reliable and accurate navigation parameters (position, velocity and attitude) with high update rates and bandwidth in a small and cost-effective manner.
Always Innovating Cortex-A9 SoC module, which can run from 1.0 GHz to 1.5 GHz, with 1 GB RAM, an SD card, and 2.4 GHz/5 GHz Wi-Fi and Bluetooth connectivity.
MeCam quadrotor
INTRODUCTION
Many of today’s operational navigation systems rely on inertial sensor measurements.
Atomic Inertial Measurement Unit (IMU) from Sparkfun Inc.: single-axis gyro and three-axis accelerometer.
[Figures: position, velocity and attitude navigation errors growing away from the desired error over time]
Inertial navigation slowly diverges from the real values with time.
INTRODUCTION
Additional sources of aiding information to the navigation parameters (non-interactive aiding information):
[Diagram: LORAN, OMEGA, NAVSAT, DME, area correlator, Doppler, air data, GPS, magnetic compass, vertical gyro, AHRS and directional gyro feeding the inertial position, speed, attitude and heading]
Ching-Fang Lin, "Modern Navigation, Guidance, and Control Processing", Prentice Hall Inc., 1991
INTRODUCTION
Additional sources of aiding information to the navigation parameters (interactive aiding information):
[Diagram: vision sensors in a known environment, and novel solutions such as vision-based SLAM in an unknown environment, feeding the inertial position, speed, attitude and heading]
Vision sensors are a must nowadays.
THEORY OF SIMULTANEOUS LOCALIZATION AND MAPPING (SLAM).
INERTIAL NAVIGATION AND AUGMENTED STATE VECTOR.
[Figure: vehicle moving through poses 1–7 with positions $p_1 \ldots p_7$ and relative measurements $\hat{r}_1 \ldots \hat{r}_7$ to map points $m_1 \ldots m_7$; the vehicle state vector is augmented with the map points]

$$x_k = \begin{bmatrix} x_k^v \\ m_1 \\ \vdots \\ m_7 \end{bmatrix}$$

Simultaneous Localization and Mapping (SLAM) uses relative measurements (range and bearing) from the vehicle with respect to the environment to build a map, whilst simultaneously using the generated map to compute the vehicle position.
THEORY OF SIMULTANEOUS LOCALIZATION AND MAPPING (SLAM).
INERTIAL NAVIGATION AND AUGMENTED STATE VECTOR.
Repeated relative measurements of a map point (circular movement): the vehicle at positions $p_0 \ldots p_6$ takes measurements $\hat{r}_0 \ldots \hat{r}_6$ of the same map point $m$.

Mapping of an unknown map point:

$$x_k^{m_i} = p_k + \lambda_k \hat{r}_k$$

Localization using the map point:

$$p_k = x_k^{m_i} - \lambda_k \hat{r}_k$$

Repeated relative measurements provide valuable information about the spatial relationship between the map point and the vehicle.
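The mapping and localization relations above can be sketched numerically. This is a minimal sketch with hypothetical positions; the function names are illustrative, not from the presentation.

```python
import numpy as np

def map_point_from_bearing(p, r_hat, lam):
    """Mapping: x_k^mi = p_k + lam_k * r_hat_k (map point from vehicle pose)."""
    return p + lam * r_hat

def localize_from_map_point(m, r_hat, lam):
    """Localization: p_k = x_k^mi - lam_k * r_hat_k (vehicle pose from map point)."""
    return m - lam * r_hat

p = np.array([0.0, 0.0, 100.0])          # hypothetical vehicle position
m_true = np.array([400.0, 550.0, 3.0])   # hypothetical map point
diff = m_true - p
lam = np.linalg.norm(diff)               # range to the map point
r_hat = diff / lam                       # unit direction to the map point

m_est = map_point_from_bearing(p, r_hat, lam)
p_est = localize_from_map_point(m_est, r_hat, lam)
```

The two relations are inverses of each other, which is exactly why the same relative measurement can serve both mapping and localization.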
VISION SENSORS
Single video camera
Stereo video camera
[Figure: camera geometry for the stereo and single video camera, with body frame $(x_b, y_b, z_b)$, image point $p = (u, v)$, map point $M$, and the relative position vector $r^b$ (stereo) or unit direction $\hat{r}$ (single camera)]

Observation models

Linear model (stereo camera), relating the vehicle position $(x, y, z)$, the map point $(x^i, y^i, z^i)$ and the measurement noise $v_k$:

$$z_k^i = h^i[x_k^v, x_k^{m_i}, v_k] = \begin{bmatrix} x^i - x \\ y^i - y \\ z^i - z \end{bmatrix} + v_k$$

Nonlinear model (single camera):

$$z_k^i = h^i[x_k^v, x_k^{m_i}, v_k] = \frac{1}{\sqrt{(x^i - x)^2 + (y^i - y)^2 + (z^i - z)^2}} \begin{bmatrix} x^i - x \\ y^i - y \\ z^i - z \end{bmatrix} + v_k$$
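The two observation models can be written as short functions. This is a sketch: the rotation from the navigation frame into the camera/body frame is omitted for brevity, and the example values are hypothetical.

```python
import numpy as np

def h_linear(vehicle, m):
    """Stereo-camera (linear) model: full relative position of map point m."""
    return m - vehicle

def h_bearing(vehicle, m):
    """Single-camera (nonlinear) model: unit direction to map point m."""
    d = m - vehicle
    return d / np.linalg.norm(d)

vehicle = np.array([0.0, 0.0, 0.0])
m = np.array([3.0, 4.0, 0.0])
z_lin = h_linear(vehicle, m)    # relative position, with range information
z_brg = h_bearing(vehicle, m)   # unit vector only: range is unobservable
```

The contrast between the two outputs is the heart of the bearing-only problem: the single-camera model discards the range, so a single measurement cannot fix the map point.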
INERTIAL NAVIGATION AIDED BY SIMULTANEOUS LOCALIZATION AND MAPPING.
SLAM AS SENSOR FUSION ALGORITHM
When SLAM is performed on aerial vehicles, the vehicle state $x_k^v$ can be represented by the Inertial Navigation mechanization equations, which give position, velocity and attitude quaternion:

$$\begin{bmatrix} p_k^n \\ v_k^n \\ q_k \end{bmatrix} = \begin{bmatrix} v_{k-1}^n \Delta t + p_{k-1}^n \\ [C_b^n f_k^b - g^n]\Delta t + v_{k-1}^n \\ \frac{1}{2}\Omega_k q_{k-1} \Delta t + q_{k-1} \end{bmatrix}$$
As proposed, the approach is to augment the vehicle state with a number of map point estimates of the map point:

$$x_k^{m_i} = [x^i \;\; y^i \;\; z^i]^T, \quad i = 1, 2, \ldots, n$$

The augmented state vector containing the vehicle state and the map point estimates is denoted as:

$$x_k = [x_k^{v\,T} \;\; x_k^{m_1\,T} \;\; \cdots \;\; x_k^{m_n\,T}]^T$$
The augmented process model is of the form:

$$x_k = f(x_{k-1}, u_k) + w_k = \begin{bmatrix} f^v(x_{k-1}^v, u_k) + w_k^v \\ x_{k-1}^{m_1} \\ \vdots \\ x_{k-1}^{m_n} \end{bmatrix}$$

where $u_k$ is the control input containing the acceleration measurements $f_k^b = [f_x^b \;\; f_y^b \;\; f_z^b]^T$ and the gyro measurements $\omega_k^b = [p \;\; q \;\; r]^T$, and $w_k$ is the process noise.
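One mechanization step can be sketched as follows. This is a minimal sketch assuming an NED navigation frame; the gravity vector value, the Euler-style quaternion integration and the hover example are illustrative assumptions, not the presentation's exact implementation.

```python
import numpy as np

def quat_to_dcm(q):
    """Body-to-navigation direction cosine matrix C_b^n from q = [q0, q1, q2, q3]."""
    q0, q1, q2, q3 = q
    return np.array([
        [q0*q0+q1*q1-q2*q2-q3*q3, 2*(q1*q2-q0*q3),         2*(q1*q3+q0*q2)],
        [2*(q1*q2+q0*q3),         q0*q0-q1*q1+q2*q2-q3*q3, 2*(q2*q3-q0*q1)],
        [2*(q1*q3-q0*q2),         2*(q2*q3+q0*q1),         q0*q0-q1*q1-q2*q2+q3*q3],
    ])

def ins_step(p, v, q, f_b, w_b, dt, g_n=np.array([0.0, 0.0, -9.81])):
    """One mechanization step: p_k = v_{k-1} dt + p_{k-1},
    v_k = [C_b^n f_k^b - g^n] dt + v_{k-1}, q_k = q_{k-1} + 0.5 Omega q_{k-1} dt."""
    C = quat_to_dcm(q)
    p_new = p + v * dt
    v_new = v + (C @ f_b - g_n) * dt
    wx, wy, wz = w_b
    Omega = np.array([            # quaternion rate matrix built from [p, q, r]
        [0.0, -wx, -wy, -wz],
        [wx,  0.0,  wz, -wy],
        [wy, -wz,  0.0,  wx],
        [wz,  wy, -wx,  0.0],
    ])
    q_new = q + 0.5 * dt * (Omega @ q)
    return p_new, v_new, q_new / np.linalg.norm(q_new)

# Hovering vehicle: the accelerometer measures only the reaction to gravity,
# so position, velocity and attitude should all stay unchanged.
p, v, q = ins_step(np.zeros(3), np.zeros(3), np.array([1.0, 0.0, 0.0, 0.0]),
                   np.array([0.0, 0.0, -9.81]), np.zeros(3), 0.01)
```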
INERTIAL NAVIGATION AIDED BY SIMULTANEOUS LOCALIZATION AND MAPPING.
SLAM AS SENSOR FUSION ALGORITHM
The equation for the calculation of the map point estimates, written in component form, is:

$$x_k^{m_i} = g^i[x_k^v, z_k, v_k] = \begin{bmatrix} x + \lambda \hat{r}_x \\ y + \lambda \hat{r}_y \\ z + \lambda \hat{r}_z \end{bmatrix}$$

where $v_k$ is the measurement noise.
The method for augmenting the map point estimates and building the covariance matrix is as follows. The new map point estimate is added to the augmented state vector, and a new row and column are added to the covariance matrix:

$$x_k = f_{aug}[x_k, z_k] = \begin{bmatrix} x_k \\ g^i[x_k, z_k] \end{bmatrix}$$

$$P_k = \nabla f_{aug} P_k \nabla f_{aug}^T = \begin{bmatrix} P_k & P_k\, g_k^{x\,T} \\ g_k^x P_k & g_k^x P_k\, g_k^{x\,T} + g_k^z R_k\, g_k^{z\,T} \end{bmatrix}$$

where $g_k^x$ and $g_k^z$ are the Jacobians of the function $g(\cdot)$ shown above with respect to the augmented state $x_k$ and the observation $z_k$, respectively.
These equations are repeated until the state vector is augmented with a "sufficient" number of map point estimates. Through simulations we realized that more than three map point estimates are needed.
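The augmentation step can be sketched as below. This is a minimal sketch: the identity Jacobians and all numeric values are hypothetical, chosen only so the block diagram of the covariance matrix is easy to check.

```python
import numpy as np

def augment_state(x, P, z, g, Gx, Gz, R):
    """Append a new map point estimate g(x, z) to the state vector and add
    the corresponding new row and column to the covariance matrix."""
    x_aug = np.concatenate([x, g(x, z)])
    cross = Gx @ P                         # g_k^x P_k : new-point/state covariance
    P_mm = Gx @ P @ Gx.T + Gz @ R @ Gz.T   # covariance of the new map point
    return x_aug, np.block([[P, cross.T], [cross, P_mm]])

# Toy example: the map point is the vehicle position plus the relative
# measurement z, so both Jacobians are identity (hypothetical values).
x = np.array([1.0, 2.0, 3.0])
P = 0.1 * np.eye(3)
z = np.array([4.0, 5.0, 6.0])
R = 0.01 * np.eye(3)
x_aug, P_aug = augment_state(x, P, z, lambda x, z: x + z, np.eye(3), np.eye(3), R)
```

Note how the uncertainty of the new map point is the vehicle uncertainty plus the measurement uncertainty, and the off-diagonal blocks correlate the new point with the existing state.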
INERTIAL NAVIGATION AIDED BY SIMULTANEOUS LOCALIZATION AND MAPPING.
SLAM AS SENSOR FUSION ALGORITHM
As the vehicle circles around the map point, repeated measurements $z_k = [\Delta x \;\; \Delta y \;\; \Delta z]^T$ are available from the stereo camera.

Using these measurements we can start updating the augmented state vector, using each of the map point estimates in turn. An Extended Kalman Filter (EKF) can be implemented for the update.
The state covariance is propagated first:

$$P_{k|k-1} = \nabla f_k^x P_{k-1|k-1} \nabla f_k^{x\,T} + Q_k$$

where $\nabla f_k^x$ is the Jacobian of the augmented process model with respect to the augmented state vector.
Next, using the measurements (observations), we update the augmented state vector and its covariance:

$$\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k \nu_k$$
$$P_{k|k} = P_{k|k-1} - K_k \nabla h_k^x P_{k|k-1}$$

where the innovation vector and the Kalman gain are calculated as:

$$\nu_k = z_k - h(\hat{x}_{k|k-1})$$
$$K_k = P_{k|k-1} \nabla h_k^{x\,T} [\nabla h_k^x P_{k|k-1} \nabla h_k^{x\,T} + R_k]^{-1}$$

where $\nabla h_k^x$ is the Jacobian of the observation function $h(\cdot)$ with respect to the vehicle state and the map point estimate used in the update.
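The update equations can be sketched with a toy example: a direct linear observation of a 2-D state rather than the full SLAM observation model, so the numbers are easy to verify by hand.

```python
import numpy as np

def ekf_update(x, P, z, h, H, R):
    """EKF measurement update: innovation, Kalman gain, corrected state/covariance."""
    nu = z - h(x)                    # innovation vector
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    return x + K @ nu, P - K @ H @ P

# Toy example: observing the state directly (H = I) with unit noise covariance.
x0 = np.zeros(2)
P0 = np.eye(2)
x1, P1 = ekf_update(x0, P0, np.array([1.0, 1.0]), lambda x: x, np.eye(2), np.eye(2))
```

With equal prior and measurement uncertainty, the update moves the state halfway toward the measurement and halves the covariance, as expected.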
PERFORMANCES OF INERTIAL NAVIGATION
AIDED BY SIMULTANEOUS LOCALIZATION AND MAPPING
[Figures: true, estimated and divergent 3D vehicle trajectories; map point estimates; position errors in x, y and h over time]
PERFORMANCES OF INERTIAL NAVIGATION
AIDED BY SIMULTANEOUS LOCALIZATION AND MAPPING
[Figures: map point estimates in x, y and z over time; determinant of the covariance matrix of the first map point estimate; the contour ellipsoid of the covariance matrix of the first map point estimate]
PERFORMANCES OF INERTIAL NAVIGATION
AIDED BY SIMULTANEOUS LOCALIZATION AND MAPPING
[Figures: determinant of the covariance matrix of the vehicle position; the relative distance between the first and second map point estimates; the contour ellipsoid of the covariance matrix of the vehicle position]
INERTIAL NAVIGATION AIDED BY BEARING
ONLY SIMULTANEOUS LOCALIZATION AND MAPPING
Bearing Only Simultaneous Localization and Mapping (BOSLAM) is very attractive these days because it permits the use of a single camera as a sensor for measuring the bearing, i.e. the unit direction to the map points.
The major drawback of this solution is the problem of map point initialization from a single measurement.
INERTIAL NAVIGATION AIDED BY BEARING
ONLY SIMULTANEOUS LOCALIZATION AND MAPPING
Much of the research in BOSLAM is focused on the problem of initialization of the map points from single-camera measurements. In the literature, two techniques are proposed to address the problem of map point initialization:
1. The first technique delays the map point initialization until a criterion is fulfilled and sufficient baseline is available from different vehicle positions.
2. The second technique avoids the delay and initializes the map point from a single measurement, using the fact that after the first observation the map point lies along the line from the vehicle to the map point (the projection ray).
Both techniques have their pros and cons.
The approach presented before, augmenting the state vector with more than one map point estimate, brings a new parameterization of the map point in BOSLAM.
The novelty comes from the use of a certain number of map point estimates for the update of the whole augmented state vector, together with a combination of repeated measurements and motion in the vicinity of the map point. This approach brings delayed initialization of the map points.
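The "sufficient baseline" criterion behind delayed initialization can be illustrated with a simple parallax check. The helper `parallax_deg` and all positions are hypothetical, used only to show one possible form of such a criterion.

```python
import numpy as np

def parallax_deg(p1, p2, m):
    """Baseline angle (degrees) subtended at map point m by two vehicle
    positions p1 and p2: a simple 'sufficient baseline' test for triangulation."""
    a = p1 - m
    b = p2 - m
    c = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

m = np.zeros(3)
# Nearly collinear viewpoints: almost no baseline, depth poorly observable.
small = parallax_deg(np.array([10.0, 0.0, 0.0]), np.array([10.0, 0.2, 0.0]), m)
# Viewpoints a quarter-circle apart: wide baseline, depth well observable.
large = parallax_deg(np.array([10.0, 0.0, 0.0]), np.array([0.0, 10.0, 0.0]), m)
```

A delayed-initialization scheme would hold back the map point until this angle exceeds some threshold, which is exactly what circling around the map point achieves.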
INERTIAL NAVIGATION AIDED BY BEARING
ONLY SIMULTANEOUS LOCALIZATION AND MAPPING
A discussion of the observability of a system provides insight into, and understanding of, the fundamental limits of the estimation process.
Since observability analysis can give the best achievable performance even before the system is built, it can be considered a tool for computer analysis of many complicated estimation processes.
[Figures: information along the x, y and z axes for the vehicle position and the map point estimate versus iterations with time, for a stationary vehicle and for a vehicle moving on a circular path]
INERTIAL NAVIGATION AIDED BY BEARING
ONLY SIMULTANEOUS LOCALIZATION AND MAPPING
Inertial Navigation aided by BOSLAM, like many other navigation problems, is nonlinear and must be linearized (approximated) before applying the popular Kalman-like filtering algorithms. An EKF presents one such approximation.
The EKF is a very commonly used algorithm and, because of its simplicity, is very often chosen as the "best" algorithm for implementation.
Single video camera

[Figure: single-camera geometry with body frame $(x_b, y_b, z_b)$, image point $p = (u, v)$, map point $M$ and unit direction $\hat{r}$]

Nonlinear observation model:

$$z_k^i = h^i[x_k^v, x_k^{m_i}, v_k] = \frac{1}{\sqrt{(x^i - x)^2 + (y^i - y)^2 + (z^i - z)^2}} \begin{bmatrix} x^i - x \\ y^i - y \\ z^i - z \end{bmatrix} + v_k$$
INERTIAL NAVIGATION AIDED BY BEARING
ONLY SIMULTANEOUS LOCALIZATION AND MAPPING
Because of a number of significant problems that appear when implementing the EKF, other algorithms such as:
• Iterated Extended Kalman Filter (IEKF)
• Unscented Kalman Filter (UKF)
• Unscented Particle Filter (UPF)
were implemented, tested and compared with the EKF algorithm.
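As an illustration of how the UKF differs from the EKF, below is a sketch of its sigma-point generation with standard unscented-transform parameters. This is not code from the presentation: the sigma points are propagated through the nonlinearity directly, instead of linearizing it with a Jacobian.

```python
import numpy as np

def sigma_points(x, P, alpha=1e-3, kappa=0.0):
    """Generate the 2n+1 sigma points and mean weights of the unscented
    transform for a state x with covariance P."""
    n = x.size
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)   # matrix square root of scaled P
    pts = [x]
    for i in range(n):
        pts.append(x + S[:, i])             # positive-direction sigma point
        pts.append(x - S[:, i])             # negative-direction sigma point
    Wm = np.full(2 * n + 1, 0.5 / (n + lam))
    Wm[0] = lam / (n + lam)
    return np.array(pts), Wm

pts, Wm = sigma_points(np.array([1.0, 2.0]), np.eye(2))
```

The weighted mean of the sigma points recovers the original state exactly; after pushing the points through a nonlinear model, the same weights give the predicted mean without any Jacobian.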
PERFORMANCES OF INERTIAL NAVIGATION AIDED BY BEARING
ONLY SIMULTANEOUS LOCALIZATION AND MAPPING
[Figures: true trajectory, un-aided IN trajectory and the EKF, IEKF, UKF and UPF estimated trajectories around the map point; position errors in x, y and z over time for each filter]
PERFORMANCES OF INERTIAL NAVIGATION AIDED BY BEARING
ONLY SIMULTANEOUS LOCALIZATION AND MAPPING
[Figures: map point estimates in x, y and z over time for the EKF, IEKF, UKF and UPF]
CONCLUSIONS
• Aiding Inertial Navigation (IN) by BOSLAM exhibits a high degree of nonlinearity, and in these applications an EKF typically introduces large estimation errors.
• Other algorithms such as the UKF and UPF demonstrate the best performance and appear to be efficient estimators for the concept of IN aided by BOSLAM.
• The SLAM-aided IN and BOSLAM-aided IN sensor fusion algorithms present reliable solutions that provide aiding information to IN from vision sensors. These algorithms successfully integrate the inertial and vision sensors with no a priori knowledge of the environment.
QUADROTOR RESEARCH ARENA
GRASP Lab, University of Pennsylvania, extensive use of motion capture systems
Vicon T Series motion
capture System
QUADROTOR RESEARCH ARENA
Flying Machine arena , Institute for Dynamic Systems and Control ETH Zurich
QUADROTOR RESEARCH ARENA
Localization and mapping done
with two Hokuyo lidars and a
servo motor. University of
Pennsylvania
CityFlyer project, Robotics and
Intelligent Systems Lab, City
College of New York, City
University of New York.
STARMAC Project in the Hybrid
Systems Lab at UC Berkeley
QUADROTOR RESEARCH ARENA
Robotics Group, CSAIL MIT:
•
•
•
•
•
•
Hokuyo laser range-finder (1),
stereo cameras (2),
monocular color camera (3),
laser-deflecting mirrors for altitude (4),
1.6GHz Intel Atom-based flight computer (5),
Ascending Technologies internal processor and IMU (6).
QUADROTOR RESEARCH ARENA
sFly European Project: Visual-Inertial SLAM for a
small hexacopter (IROS 2012 video screenshot)
Computer Vision Group at TUM Germany:
Autonomous Camera-Based Navigation of a Low-Cost Quadrocopter
FUTURE WORK POSSIBILITIES
In our future research we are working on developing novel approaches, i.e. efficient estimators (sensor fusion algorithms), which will provide navigation performance and accuracy at a level close to Differential GPS using only a single video camera and inertial sensors.
Our research work is very much focused on the practical experimentation and validation of the problems defined above. A quadrotor UAV is the chosen platform for the practical experiments.
Quadrotor Flying Frog
ROBOTILNICA Intelligent
Systems and Robotics
Custom modified quadrotor from KKMulticopter South Korea
FUTURE WORK POSSIBILITIES
We are fascinated by insect-based navigation. Flying insects especially inspire much of our research on autonomous navigation systems.
We are very much interested in how flying insects are so adept at maneuvering in complex, unconstrained, hostile and hazardous environments.
FUTURE WORK POSSIBILITIES
Our practical experiments follow a low-cost approach. A low-cost Inertial Measurement Unit (IMU) from Sparkfun Electronics provides the angular rate (gyro) and accelerometer measurements. These inertial sensor data, together with the video from a smartphone camera, are transferred over WiFi and/or 3G to a remote server/PC for processing. The video processing is performed on the remote PC using the OpenCV libraries, together with the sensor fusion algorithms. The results are transferred back to the quadrotor for guidance and control of the vehicle, for performing the required maneuvers (motion) and the vehicle task itself.
FUTURE WORK POSSIBILITIES
As in the case of a single UAV, simply passing or flying by a map point with no manoeuvre will not help much for autonomous navigation of a swarm of UAVs. Coordination and cooperation between the UAVs, performing manoeuvres over the same environment features (map points), are needed. This investigation is expected to address the issues that arise when low-accuracy vision sensors are used on UAVs. In this scenario the UAVs can take measurements not just of the environment but also of each other. It is expected that these measurements can accelerate the convergence of the novel algorithms under experimentation.
THANK YOU FOR YOUR ATTENTION
?
QUESTIONS ARE WELCOME