Visual Odometry for Ground Vehicle Applications David Nister, Oleg Naroditsky, James Bergen

CPSC 643, Presentation 4

Visual Odometry for Ground Vehicle Applications

David Nister, Oleg Naroditsky, James Bergen, Sarnoff Corporation, CN5300, Princeton, NJ 08530

Journal of Field Robotics, Vol. 23, No. 1, pp. 3-20, 2006.

Mostly Related Works

• Stereo Visual Odometry – Agrawal 2007
  – Integrates with IMU/GPS
  – Bundle adjustment
• Monocular Visual Odometry – Campbell 2005
  – 3-DOF camera
  – Optical flow method
• A Visual Odometry System – Olson 2003
  – With an absolute orientation sensor
  – Forstner interest operator in the left image, matches from left to right
  – Uses approximate prior knowledge
  – Iteratively selects landmark points
• Previous Work of this Paper – Nister 2004
  – Estimates ego-motion using a hand-held camera
  – Real-time algorithm based on RANSAC

Other Related Works

• Simultaneous Localization and Mapping (SLAM)
• Robot Navigation with Omnidirectional Camera

Motivation

Nister (this paper):
• Uses pure visual information
• Harris corner detection in all images, tracking feature to feature
• No prior knowledge
• RANSAC-based estimation in real time

Olson:
• With an absolute orientation sensor
• Forstner interest operator in the left image, matches from left to right
• Uses approximate prior knowledge
• Iteratively selects landmark points

Feature Detection

Harris Corner Detection

Search for local maxima of the Harris corner response

$c(x, y) = \det M(x, y) - k \,\operatorname{trace}^2 M(x, y)$

where $M(x, y)$ is the second-moment matrix at pixel $(x, y)$ and $k$ is a small constant; detected corners are the local maxima of $c$.

Feature Detection

Four sweeps over the image compute the smoothed gradient products that form the second-moment matrix

$$M(x, y) = \begin{bmatrix} \langle I_x I_x \rangle & \langle I_x I_y \rangle \\ \langle I_x I_y \rangle & \langle I_y I_y \rangle \end{bmatrix}$$

• Image gradients $I_x$, $I_y$ are computed with the separable derivative filters $[-1, 0, 1]$ and $[-1, 0, 1]^T$.
• The products $I_x I_x$, $I_x I_y$, $I_y I_y$ are smoothed by convolution with the binomial kernel $[1, 4, 6, 4, 1]$ horizontally and vertically.

Feature Detection

Detected Feature Points

Superimposed feature tracks through images

Feature Matching

• Two-directional matching
• Calculate the normalized correlation between two patches $I_1$ and $I_2$ (of $n$ pixels each) and take the maximum within the search region:

$$c(I_1, I_2) = \frac{n \sum I_1 I_2 - \sum I_1 \sum I_2}{\sqrt{\left(n \sum I_1^2 - \left(\sum I_1\right)^2\right)\left(n \sum I_2^2 - \left(\sum I_2\right)^2\right)}}$$

Match the feature points in the circular search area that have the maximum correlation in both directions (sketched below).
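A minimal Python sketch of this two-directional matching step, assuming square patches have already been extracted around each detected corner; the function names and the 50-pixel search radius are illustrative assumptions, not the authors' implementation.

import numpy as np

def normalized_correlation(p1, p2):
    """Normalized correlation between two equally sized patches."""
    a, b = p1.astype(np.float64).ravel(), p2.astype(np.float64).ravel()
    n = a.size
    num = n * np.dot(a, b) - a.sum() * b.sum()
    den = np.sqrt((n * np.dot(a, a) - a.sum() ** 2) * (n * np.dot(b, b) - b.sum() ** 2))
    return num / den if den > 0 else 0.0

def mutual_matches(pts1, patches1, pts2, patches2, radius=50.0):
    """Keep a match only if it is the best in both directions within the search radius."""
    def best_match(p, patch, pts_other, patches_other):
        best_j, best_c = -1, -np.inf
        for j, (q, other) in enumerate(zip(pts_other, patches_other)):
            if np.linalg.norm(p - q) > radius:   # circular search area
                continue
            c = normalized_correlation(patch, other)
            if c > best_c:
                best_j, best_c = j, c
        return best_j
    matches = []
    for i, (p, patch) in enumerate(zip(pts1, patches1)):
        j = best_match(p, patch, pts2, patches2)
        if j >= 0 and best_match(pts2[j], patches2[j], pts1, patches1) == i:
            matches.append((i, j))
    return matches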

Robust Estimation

The Monocular Scheme

• Separate the matched points into 5-point groups.
• Treat each group as a 5-point relative pose problem.
• Use RANSAC to select well-matched groups (a sketch of the loop follows below).
• Estimate the camera motion using the selected groups.
• Put the current estimate into the coordinate frame of the previous one.
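A rough Python sketch of the monocular RANSAC loop. The functions five_point_relative_pose and reprojection_error are hypothetical placeholders standing in for the paper's five-point solver and its scoring; only the sampling-and-selection structure of the scheme is illustrated.

import numpy as np

def ransac_relative_pose(pts_prev, pts_curr, five_point_relative_pose,
                         reprojection_error, iters=500, threshold=1.0):
    """RANSAC over 5-point samples; returns the pose hypothesis with the most inliers.

    pts_prev, pts_curr: Nx2 arrays of matched image points in consecutive frames.
    five_point_relative_pose(p1, p2): hypothetical solver returning candidate (R, t) poses.
    reprojection_error(R, t, p1, p2): hypothetical per-match error used to score inliers.
    """
    n = len(pts_prev)
    best_pose, best_inliers = None, np.zeros(n, dtype=bool)
    rng = np.random.default_rng(0)
    for _ in range(iters):
        sample = rng.choice(n, size=5, replace=False)   # one 5-point group
        for R, t in five_point_relative_pose(pts_prev[sample], pts_curr[sample]):
            err = reprojection_error(R, t, pts_prev, pts_curr)
            inliers = err < threshold
            if inliers.sum() > best_inliers.sum():
                best_pose, best_inliers = (R, t), inliers
    return best_pose, best_inliers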

Robust Estimation

The Stereo Scheme

• Match the feature points in the stereo images, then triangulate them into 3D points (see the triangulation sketch below).
• Estimate the camera motion using RANSAC and the 3D points in consecutive frames.
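A minimal Python sketch of linear (DLT) triangulation of one stereo match into a 3D point, assuming the 3x4 projection matrices of the calibrated stereo pair are known; this is a generic construction, not necessarily the exact triangulation used in the paper.

import numpy as np

def triangulate_point(P_left, P_right, x_left, x_right):
    """Linear (DLT) triangulation of one stereo correspondence.

    P_left, P_right: 3x4 camera projection matrices of the stereo pair.
    x_left, x_right: matched pixel coordinates (u, v) in the two images.
    Returns the 3D point in the common reference frame.
    """
    A = np.array([
        x_left[0]  * P_left[2]  - P_left[0],
        x_left[1]  * P_left[2]  - P_left[1],
        x_right[0] * P_right[2] - P_right[0],
        x_right[1] * P_right[2] - P_right[1],
    ])
    # The homogeneous 3D point is the right null vector of A (smallest singular value).
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]

The 3D points triangulated from each stereo frame are then fed to the RANSAC pose estimator over consecutive frames, as described above.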

Experiments

Different Platforms

Experiments

Speed and Accuracy

Experiments

Visual Odometry vs. Differential GPS

Experiments

Visual Odometry vs. Inertial Navigation System (INS)

Experiments

Visual Odometry vs. Wheel Recorder

Conclusion and Future Work

Conclusion
• A real-time ego-motion estimation system.
• Works with both a monocular camera and a stereo head.
• Results are accurate and robust.

Future Work
• Integrate visual odometry with a Kalman filter.
• Use sampling methods with multimodal distributions.