Multiple View Geometry in Computer Vision


Structured light and active ranging techniques
Class 8
Geometric Computer Vision course schedule (tentative)

Date     | Lecture                                                               | Exercise
Sept 16  | Introduction                                                          | -
Sept 23  | Geometry & Camera model                                               | Camera calibration
Sept 30  | Single View Metrology (Changchang Wu)                                 | Measuring in images
Oct. 7   | Feature Tracking/Matching                                             | Correspondence computation
Oct. 14  | Epipolar Geometry                                                     | F-matrix computation
Oct. 21  | Shape-from-Silhouettes                                                | Visual-hull computation
Oct. 28  | Stereo matching                                                       | Papers
Nov. 4   | Stereo matching (continued)                                           | Project proposals
Nov. 11  | Structured light and active range sensing                             | Papers
Nov. 18  | Structure from motion and visual SLAM                                 | Papers
Nov. 25  | Multi-view geometry and self-calibration                              | Papers
Dec. 2   | 3D modeling, registration and range/depth fusion (Christopher Zach?)  | Papers
Dec. 9   | Shape-from-X and image-based rendering                                | Papers
Dec. 16  | Final project presentations                                           | Final project presentations
Today’s class
• unstructured light
• structured light
• time-of-flight
(some slides from Szymon Rusinkiewicz, Brian Curless)
A taxonomy
Unstructured light
project texture to disambiguate stereo
Space-time stereo
Davis, Ramamoorthi, Rusinkiewicz, CVPR’03
Space-time stereo
Zhang, Curless and Seitz, CVPR’03
• results
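As a rough illustration of the space-time idea (a minimal sketch, not the authors' code; the function and variable names are my own), the matching cost compares a spatio-temporal window rather than a single-frame window, so temporally varying projected texture disambiguates otherwise textureless matches:

```python
import numpy as np

def spacetime_cost(left, right, x, y, d, half_w=2):
    """Space-time SSD cost (illustrative sketch).

    left, right: stacks of rectified frames, shape (T, H, W).
    Compares a spatio-temporal window around (x, y) in the left stack
    with the window around (x - d, y) in the right stack.
    """
    lw = left[:, y - half_w:y + half_w + 1, x - half_w:x + half_w + 1]
    rw = right[:, y - half_w:y + half_w + 1, x - d - half_w:x - d + half_w + 1]
    return np.sum((lw.astype(float) - rw.astype(float)) ** 2)

def best_disparity(left, right, x, y, d_max, half_w=2):
    """Pick the disparity with the lowest space-time cost (brute-force search)."""
    costs = [spacetime_cost(left, right, x, y, d, half_w) for d in range(d_max + 1)]
    return int(np.argmin(costs))
```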
Triangulation
Triangulation: Moving the Camera and Illumination
• Moving them independently leads to problems with focus and resolution
• Most scanners mount camera and light source rigidly and move them as a unit
Triangulation: Moving the Camera and Illumination
(Rioux et al. 87)
Triangulation: Extending to 3D
• Possibility #1: add another mirror (flying spot)
• Possibility #2: project a stripe, not a dot
[Figure: laser stripe projected onto the object and observed by the camera]
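Concretely, once the laser plane and the camera are calibrated, each lit pixel gives a 3D point by intersecting its viewing ray with the laser plane. A minimal sketch (my own illustration; the calibration inputs and names are assumed, not from the slides):

```python
import numpy as np

def intersect_ray_with_laser_plane(ray_dir, cam_center, plane_n, plane_d):
    """Depth by ray/plane intersection (illustrative sketch).

    ray_dir:    3-vector, direction of the back-projected pixel ray.
    cam_center: 3-vector, camera center in world coordinates.
    plane_n, plane_d: laser plane given as n . X + d = 0 (from calibration).
    Returns the 3D point where the viewing ray meets the laser plane.
    """
    t = -(plane_n @ cam_center + plane_d) / (plane_n @ ray_dir)
    return cam_center + t * ray_dir
```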
Triangulation Scanner Issues
• Accuracy proportional to working volume (typical is ~1000:1)
• Scales down to small working volumes (e.g., 5 cm working volume, 50 µm accuracy)
• Does not scale up (baseline becomes too large…)
• Two-line-of-sight problem (shadowing from either camera or laser)
• Triangulation angle: non-uniform resolution if too small, shadowing if too big (useful range: 15–30°)
Triangulation Scanner Issues
• Material properties (dark, specular)
• Subsurface scattering
• Laser speckle
• Edge curl
• Texture embossing
Space-time analysis
Curless ‘95
Projector as camera
Multi-Stripe Triangulation
• To go faster, project multiple stripes
• But which stripe is which?
• Answer #1: assume surface continuity
e.g. Eyetronics’ ShapeCam
Real-time system
Koninckx and Van Gool
Real-time scanning system
Rusinkiewicz et al., SIGGRAPH 2002
Szymon Rusinkiewicz talk Friday 20/11 at 11:15 in ETZ E9
(in the context of the PhD defense of Thibaut Weise, also 20/11 at 15:00 in ETF C109)
In-hand modeling
Weise et al. CVPR08
Multi-Stripe Triangulation
• To go faster, project multiple stripes
• But which stripe is which?
• Answer #2: colored stripes (or dots)
• Answer #3: time-coded stripes
Time-Coded Light Patterns
• Assign each stripe a unique illumination code over time [Posdamer 82]
[Figure: stripe codes laid out over time (one pattern per frame) and space (one bit per stripe)]
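As a concrete illustration (a sketch assuming plain binary codes; in practice Gray codes are preferred to limit decoding errors at stripe boundaries, and this is not the exact Posdamer scheme), log2(N) binary patterns suffice to label N stripes, and the on/off sequence seen at each camera pixel identifies its stripe:

```python
import numpy as np

def make_binary_patterns(num_stripes):
    """Binary-coded stripe patterns: patterns[b][s] is 1 if stripe s is lit in frame b."""
    num_bits = int(np.ceil(np.log2(num_stripes)))
    stripes = np.arange(num_stripes)
    return [(stripes >> b) & 1 for b in reversed(range(num_bits))]

def decode_stripe_index(observed_bits):
    """Recover the stripe index from the on/off sequence seen at one pixel."""
    index = 0
    for bit in observed_bits:
        index = (index << 1) | int(bit)
    return index

# Example: 8 stripes need 3 frames; a pixel seeing (1, 0, 1) lies in stripe 5.
patterns = make_binary_patterns(8)
assert decode_stripe_index([1, 0, 1]) == 5
```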
An idea for a project?
Bouguet and Perona, ICCV’98
Pulsed Time of Flight
• Basic idea: send out pulse of light (usually laser), time how long it takes to return
d = (1/2) c t
Pulsed Time of Flight
• Advantages:
• Large working volume (up to 100 m)
• Disadvantages:
• Not-so-great accuracy (at best ~5 mm)
• Requires getting timing to ~30 picoseconds
• Does not scale with working volume
• Often used for scanning buildings, rooms, archeological sites, etc.
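A quick consistency check of these numbers (my own arithmetic, not from the slides): with d = (1/2) c t, a depth error of 5 mm corresponds to a round-trip timing error of roughly 30 picoseconds.

```python
c = 3.0e8            # speed of light, m/s
depth_error = 5e-3   # target depth accuracy, m
# d = 0.5 * c * t  =>  t = 2 * d / c
timing_error = 2 * depth_error / c
print(timing_error)  # ~3.3e-11 s, i.e. roughly 30 picoseconds
```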
Depth cameras
2D array of time-of-flight sensors
e.g. Canesta’s CMOS 3D sensor
jitter too big on a single measurement, but averages out over many
(10,000 measurements → 100× improvement)
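The 100× figure follows from averaging independent noisy measurements: the error of the mean drops as 1/√N, and √10,000 = 100. A minimal simulation (my own illustration; the depth and noise values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
true_depth = 2.0       # meters (arbitrary example value)
jitter_sigma = 0.05    # per-measurement noise, meters (assumed)
n = 10_000

samples = true_depth + jitter_sigma * rng.standard_normal(n)
# Error of the averaged estimate is typically ~ jitter_sigma / sqrt(n) = 0.0005 m
print(abs(samples.mean() - true_depth))
```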
Depth cameras
3DV’s Z-cam
Superfast shutter + standard CCD
• cut the light off while the pulse is coming back, then I ~ Z
• but I ~ albedo too (use an unshuttered reference view to normalize)
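A sketch of that normalization step (my own illustration of the idea, with assumed names): dividing the shuttered image by an unshuttered image of the same scene cancels albedo and leaves a quantity that varies with depth, which a per-camera calibration then maps to metric range.

```python
import numpy as np

def shuttered_depth_ratio(shuttered, unshuttered, eps=1e-6):
    """Illustrative sketch: the shuttered image depends on both depth (how much
    of the returning pulse the shutter cuts off) and albedo; the unshuttered
    image depends on albedo only. Their ratio is a monotonic function of depth."""
    ratio = shuttered.astype(float) / (unshuttered.astype(float) + eps)
    return ratio  # still needs a calibrated mapping from ratio to meters
```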
AM Modulation Time of Flight
• Modulate a laser at frequency ν_m; it returns with a phase shift Δφ
d = (1/2) (c / ν_m) (Δφ + 2πn) / (2π)
• Note the ambiguity in the measured phase!
⇒ Range ambiguity of (1/2) c / ν_m
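A sketch of the phase-to-depth conversion and its wraparound (a single modulation frequency, with assumed names and example values):

```python
import numpy as np

C = 3.0e8  # speed of light, m/s

def phase_to_depth(phase, mod_freq_hz, n=0):
    """d = 0.5 * (c / nu_m) * (phase + 2*pi*n) / (2*pi).
    The integer n is the unknown number of whole modulation periods: depths
    that differ by 0.5 * c / nu_m produce the same measured phase."""
    return 0.5 * (C / mod_freq_hz) * (phase + 2 * np.pi * n) / (2 * np.pi)

# Example: at 20 MHz modulation the unambiguous range is 0.5 * c / 20e6 = 7.5 m,
# so the same phase is measured for a target at d and one at d + 7.5 m.
print(phase_to_depth(np.pi / 2, 20e6))        # ~1.875 m
print(phase_to_depth(np.pi / 2, 20e6, n=1))   # ~9.375 m (one wrap later)
```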
AM Modulation Time of Flight
• Accuracy / working volume tradeoff (e.g., noise ~ 1/500 of working volume)
• In practice, often used for room-sized environments (cheaper and more accurate than pulsed time of flight)
Shadow Moiré
Depth from focus/defocus
Nayar’95
Next class: structure from motion