Multi-Projector Displays using
Camera-based Registration
Ramesh Raskar, Michael S. Brown,
Ruigang Yang, Wei-Chao Chen, Greg Welch,
Herman Towles, Brent Seales, Henry Fuchs
University of North Carolina
at Chapel Hill
Multi-Projector Panoramic
Displays
Traditional Display Setups
Precise Geometry, Well-defined overlaps
Generalized Panoramic Display
Flexible Display Setups
Casually aligned projectors and screens
Problem:
• Traditionally rigid design
– Painstaking construction
  – Room must be modified
  – Projector alignment is tedious
– High maintenance cost
– Constant adjustments
  – Projectors become "unaligned" over time
Problem:
Instant Panoramic Display
• Generalized display environment
• Setup display surface approximately
• Align projectors casually
• Render with what you have
Multi-Projector Goal
• Seamless imagery
• Immersive 3D images, moving user
• Casually setup projectors and surfaces
• Low maintenance
• Auto-Calibration
• Perceptually a single logical projector
Goal:
Seamless Display
(a) Projector Registration
– Geometric alignment
(b) Intensity blending
– Smooth transition
(Figure: overlapping images from Projection 1 and Projection 2)
Previous approaches
Multi-projector setups
• Sweet spot
– Static user
• Moving User
– Head-tracked user
Previous approaches:
Sweet spot (static user)
• Multiple projectors
– (well-defined overlaps)
– Flight sims
– Panoram, Trimensions
– [Raskar98]
• Single wide-field-of-view projector
– OmniMax
– Alternate Realities
Previous approaches:
Moving user
• CAVE
– Rigid setup
– Well-defined geometry
– Large support structure
Previous approaches:
Moving user
• Office of the Future [Raskar98]
– Relaxed construction constraints
– Assumes precise display geometry recovery
– Demonstrated for a single projector
Related work:
Image Mosaicing
• QuickTime VR
• Pair-wise mosaicing of photographs
• Generalized mosaicing
Goal for this paper
• Seamless image
• Immersive 3D images, moving user
• Arbitrary projectors and surfaces
• Eliminate maintenance
– Self-calibration
• Perceptually a single logical projector
Basic Approach: Panoramic
Display
• Position projectors approximately
• Use video cameras to recover
– display geometry
– projector configuration
• Track user
• Render images of 3D scenes
• Warp and blend projected images to achieve
registration
Outline
• Single projector
– Calibration, Display recovery, Rendering
• Multiple projector
– Registration, Blending
• New techniques
– Post rendering warp, mesh unification
Video
Outline
• Single projector
– Rendering, Calibration, Display recovery
• Multiple projector
– Registration, Blending
• New techniques
– Post rendering warp, mesh unification
2-pass Rendering
(1) Desired Image: render the 3D scene from the user's (tracked) viewpoint
(2) Render Model: render the recovered display-surface model from the projector's viewpoint, textured so that it looks correct from the user's viewpoint
Project this "warped" image and the viewer will see it correctly (see the sketch below).
(Diagram: User, display surface, Projector)
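The slides summarize the two passes at a high level; the following is a minimal numpy sketch of the same lookup, not the system's hardware path (rendering is done with OpenGL on graphics hardware). It assumes a hypothetical 3x4 user-view projection matrix P_user and a hypothetical per-projector-pixel map of the 3D display-surface points each pixel illuminates.

    import numpy as np

    def project(P, pts3d):
        """Apply a 3x4 projection matrix to Nx3 points; return Nx2 pixel coordinates."""
        pts_h = np.hstack([pts3d, np.ones((len(pts3d), 1))])
        x = (P @ pts_h.T).T
        return x[:, :2] / x[:, 2:3]

    def second_pass(desired_image, P_user, surface_pts):
        """desired_image: pass-1 image rendered from the user's viewpoint.
        surface_pts: (H, W, 3) display-surface point hit by each projector pixel.
        Returns the "warped" image to place in the projector framebuffer."""
        H, W, _ = surface_pts.shape
        uv = project(P_user, surface_pts.reshape(-1, 3))   # where the user sees each point
        u = np.clip(np.round(uv[:, 0]).astype(int), 0, desired_image.shape[1] - 1)
        v = np.clip(np.round(uv[:, 1]).astype(int), 0, desired_image.shape[0] - 1)
        return desired_image[v, u].reshape(H, W, -1)        # nearest-neighbour lookup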
Single projector display
2-pass unknowns
• Arbitrary display surface
• Arbitrary projector location
– Need to determine these for 2-pass rendering. How?
(Diagram: stereo camera pair, user, and the unknown projector)
Single projector display
Camera calibration
1. Calibrate stereo pair
(Diagram: stereo camera pair; projector still unknown)
Single projector display
Display surface estimation
1. Calibrate stereo pair
2. Determine display surface (see the sketch below)
(Diagram: stereo camera pair observing points on the display surface; projector still unknown)
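The projector displays dots (as the technique summary later notes), each camera of the calibrated pair detects a dot's centroid, and each matched pair of pixels is triangulated into a 3D surface point. A minimal linear (DLT) triangulation sketch, assuming hypothetical 3x4 camera matrices C1 and C2 for the calibrated stereo pair:

    import numpy as np

    def triangulate(C1, C2, x1, x2):
        """Linear triangulation of one matched projected dot.
        C1, C2: 3x4 camera matrices of the calibrated stereo pair.
        x1, x2: (u, v) pixel coordinates of the dot in camera 1 and camera 2.
        Returns the 3D point in the calibration-pattern coordinate frame."""
        A = np.array([
            x1[0] * C1[2] - C1[0],
            x1[1] * C1[2] - C1[1],
            x2[0] * C2[2] - C2[0],
            x2[1] * C2[2] - C2[1],
        ])
        _, _, Vt = np.linalg.svd(A)     # smallest right singular vector solves A X = 0
        X = Vt[-1]
        return X[:3] / X[3]

Repeating this over the grid of projected dots yields the surface point samples from which the display mesh is built.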
Single projector display
Projector estimation
1. Calibrate stereo pair
2. Determine display surface
3. Determine projector (see the sketch below)
(Diagram: stereo camera pair, display surface, and the now-determined projector)
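With the 3D surface points and the projector pixels that illuminated them in correspondence, "determine projector" can be read as fitting a 3x4 projection matrix to those pairs. A minimal direct linear transform (DLT) sketch, without the normalization and nonlinear refinement a careful calibration would add; it treats the projector as a pin-hole device, an approximation the error slides revisit later:

    import numpy as np

    def estimate_projection_matrix(pts3d, pixels):
        """Best-fit 3x4 matrix P with pixels ~ P [X Y Z 1]^T (up to scale).
        pts3d: Nx3 display-surface points; pixels: Nx2 projector pixels; N >= 6."""
        rows = []
        for (X, Y, Z), (u, v) in zip(pts3d, pixels):
            rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
            rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
        _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=float))
        return Vt[-1].reshape(3, 4)     # smallest right singular vector, reshaped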
Single projector display
Display Surface Estimation
• Display surface is recovered by cameras (via projector)
– Surface mesh is created from the 3D points
– Allows us to calibrate the projector
• Approximate speed
– 720 surface point samples (~1 minute)
– 14,000 surface point samples (~15 minutes)
Single projector display
Coordinate Frame
• Display surface and projector are in the coordinate
frame of the “Calibration Pattern”
(Diagram: calibration-pattern coordinate axes x, y, z)
Multiple Projectors
Panoramic display environments
• A single stereo pair cannot see the entire display
– Must use several stereo-pairs
– Each in its own coordinate frame
– Must register data into common coordinate frame
• Advantage
– We control the light
– Makes registration easier
Multiple Projectors
Surface mesh registration
Multiple Projectors
Display surfaces are in different coordinate frames
(Diagram: three recovered surface meshes, numbered 1, 2, 3, each with its own coordinate axes)
Multiple Projectors
Must register into common coordinate frame
Multiple Projectors
Surface meshes are registered into the same coordinate frame (see the sketch below)
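Registering two recovered meshes into a common frame requires a rigid transform between corresponding 3D points seen by both camera pairs (easy to establish in the camera overlap, since the system controls the projected light). One standard way to compute it is the least-squares SVD (Procrustes) solution sketched below; the paper's exact estimation procedure may differ.

    import numpy as np

    def rigid_transform(src, dst):
        """Least-squares rigid motion (R, t) such that dst[i] is approximately R @ src[i] + t.
        src, dst: Nx3 corresponding 3D points in the two coordinate frames."""
        c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
        H = (src - c_src).T @ (dst - c_dst)     # 3x3 cross-covariance
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:                # guard against a reflection
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = c_dst - R @ c_src
        return R, t

Applying (R, t) to one mesh brings it into the other's frame; the projector matrices are then re-computed against the registered points, as the next slide notes.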
Multiple Projectors
Re-calibrate projectors so they are registered
Re-calibrate the projectors so that they, too, are registered into the common coordinate frame.
With the display surfaces and projectors registered, we can achieve geometric correctness...
Multiple Projectors
Intensity Blending
Need to attenuate intensities in the overlapped region
Intensity Blending
Use camera to find overlaps
The camera observes the overlapped region
Intensity Blending
Assigning intensity weights (alpha)
In the camera image plane, assign intensity weights based on the distance to each projector's footprint boundary (sketch below).
Intensity Blending:
Alpha-masks
(Figure: alpha-masks for projectors 1, 2, and 3)
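A sketch of the alpha-mask computation in the camera image plane, assuming a binary footprint mask per projector as observed by the camera: each covered pixel is weighted by its distance to its own projector's footprint boundary, and the weights are normalized across projectors, matching the weighting rule stated above. It uses scipy's Euclidean distance transform.

    import numpy as np
    from scipy.ndimage import distance_transform_edt

    def alpha_masks(footprints):
        """footprints: list of boolean HxW masks, one per projector, in the
        camera image plane. Returns one float HxW alpha mask per projector;
        the masks sum to 1 wherever at least one projector covers the pixel."""
        # Distance from each covered pixel to that projector's footprint boundary.
        dists = [distance_transform_edt(f) for f in footprints]
        total = np.sum(dists, axis=0)
        total[total == 0] = 1.0                 # outside every footprint: avoid 0/0
        return [d / total for d in dists]

Each resulting mask still has to be mapped from the camera image plane into the corresponding projector's framebuffer before it can attenuate that projector's output.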
Achieving our goal
Seamless display
• Display surface and projector registration
– Geometrically aligned imagery
• Intensity blending
– Smooth intensity transition between projectors
Registration Issues
• Errors in estimation
– user location
– 3D display surface
– projector parameters
• Single projector, errors not visible
• Multi-projector overlap
– Geometric mis-registrations
– Gaps, Discontinuities
Sources of errors
• Head-tracker for user location
• Display surface recovery
– Camera calibration
  – Radial distortion
  – Pattern imperfection
– Feature detection
• Projector calibration
– Relies on display surface recovery
Re-projection error
(Diagram: a 3D point M on the display surface corresponds to pixel m in the projector framebuffer under the actual projection P, but re-projects to a slightly different pixel m' under the estimated projection P')
Sources of re-projection error
• Minimization residual
• Projector is not an ideal pin-hole device
• Inaccuracies in the estimated 3D points on the display surface
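For concreteness, a small sketch (hypothetical argument names) of how the re-projection error can be measured: project the recovered 3D points through the estimated matrix and compare against the projector pixels that actually illuminated them.

    import numpy as np

    def reprojection_error(P_est, pts3d, pixels):
        """RMS distance, in projector pixels, between the measured pixels and the
        projections of the corresponding 3D points under the estimated 3x4 matrix."""
        pts_h = np.hstack([pts3d, np.ones((len(pts3d), 1))])
        x = (P_est @ pts_h.T).T
        predicted = x[:, :2] / x[:, 2:3]
        return float(np.sqrt(np.mean(np.sum((predicted - pixels) ** 2, axis=1))))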
Effect of errors
• User location
• Display surface meshes
– Difficult to register
– Incorrect surface representation
• Projector parameters
– Re-projection error
– Incorrect projection matrix
Geometric Error Compensation
• Objective
– Neighboring projectors use the same display surface representation
– Each projection matrix maps 3D screen points to the 2D pixels that illuminate them
• Overcoming inaccuracies
– Geometric continuity everywhere on the display
– Fidelity of the projector model
Outline
• Single projector
– Calibration, Display recovery, Rendering
• Multiple projector
– Registration, Blending
• New techniques
– Post rendering warp, mesh unification
Geometric Error Compensation
• Objective
– Neighboring projectors use the same display surface representation
  (technique: Surface Mesh Unification)
– Each projection matrix maps 3D screen points to the 2D pixels that illuminate them
  (technique: Post Rendering Warp)
Surface mesh unification
• Merge multiple recovered meshes
• Create a single representation of display surfaces
• Rigid transformation creates near alignment
• Errors lead to discontinuities
• Use smooth 3D transition
Surface mesh unification
(Diagram: the actual surface and the two recovered meshes, with their coordinate axes, for projectors P1 and P2)
Surface Unification
(Figure: the original recovered surfaces and the new unified surface mesh)
Both projectors use the same (incorrect) surface geometry (see the sketch below).
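A minimal sketch of the smooth 3D transition on corresponding overlap points, assuming both meshes are already in the common frame and that a per-point weight w rises smoothly from 0 to 1 across the camera-overlap (transition) region; the exact weighting used in the paper may differ.

    import numpy as np

    def unify_overlap(pts_mesh1, pts_mesh2, w):
        """Blend corresponding Nx3 overlap points of two registered meshes.
        w: length-N weights in [0, 1]; 0 keeps mesh 1, 1 keeps mesh 2, and a
        smooth ramp across the overlap removes the discontinuity between them."""
        w = np.asarray(w, dtype=float)[:, None]
        return (1.0 - w) * pts_mesh1 + w * pts_mesh2

Both projectors then render against this single, possibly slightly incorrect, unified surface, which is what geometric registration requires.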
Geometric Error Compensation
• Objective
– Neighboring projectors use the same display surface representation
  (technique: Surface Mesh Unification)
– Each projection matrix maps 3D screen points to the 2D pixels that illuminate them
  (technique: Post Rendering Warp)
Modified re-projection error
(Diagram: actual point M and estimated point M'; actual projections P1, P2 and re-estimated projections P1', P2'; pixels m1, m2 physically illuminate M, while M' re-projects to m1'', m2'' under P1', P2')
Effect of re-projection error
• Physically, m1 and m2 create the image at M
• In the estimated model, m1'' and m2'' correspond (both map to M')
• Features of a virtual object get rendered at m1'' and m2''
• But m1'' and m2'' are not likely to correspond on the physical surface, so the projected images do not register
Effect of re-projection error
(Diagram: from the user's viewpoint V, the features rendered at m1'' and m2'' do not meet at M the way the physically corresponding pixels m1 and m2 do)
Eliminating re-projection error
• We have
– m1'' = P1' M'
– m2'' = P2' M'
• We need a nonlinear projective function
– m1 = F1(M')
– m2 = F2(M')
• Cannot be achieved using a single projection matrix
• Lack of hardware support
Post Rendering warp
Goal:
– m1 = F1(M')   m2 = F2(M')
(a) Render using the estimated projection matrix
– m1'' = P1' M'   m2'' = P2' M'
(b) Warp the resultant image
– m1'' --> m1   m2'' --> m2
• Expensive to specify a per-pixel warp
– Use a piecewise planar approximation
Post Rendering warp
(using texture mapping)
• Define a grid of pixels in the projector framebuffer (m)
• Project and record the 3D surface points (M)
• After surface unification (M'), find the reprojected points (m'')
• Tessellate the framebuffer
• Load the rendered image into texture memory
• Render vertices m with texture coordinates m'' (see the sketch below)
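A sketch of how the warp mesh can be assembled from the quantities above (hypothetical argument names): the grid pixels m stay as vertex positions, and the texture coordinates are the reprojected pixels m'' normalized to [0, 1], so the image rendered with the estimated matrix P' is pulled back to where each pixel physically lands. Drawing the tessellated, textured triangles is then ordinary texture mapping (OpenGL in this system).

    import numpy as np

    def warp_mesh(P_prime, grid_pixels, unified_pts, framebuffer_size):
        """grid_pixels: Nx2 framebuffer grid positions m (the warp targets).
        unified_pts: Nx3 unified surface points M' recorded for that grid.
        framebuffer_size: (width, height) of the rendered image.
        Returns vertex positions (m) and texture coordinates (m'' scaled to [0, 1])."""
        pts_h = np.hstack([unified_pts, np.ones((len(unified_pts), 1))])
        x = (P_prime @ pts_h.T).T
        m_pp = x[:, :2] / x[:, 2:3]                         # reprojected pixels m''
        texcoords = m_pp / np.asarray(framebuffer_size, dtype=float)
        return grid_pixels, texcoords

Rendering the tessellated grid with these texture coordinates is the piecewise planar approximation of the per-pixel warp.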
Post Rendering warp
(using texture mapping)
(Figures: project the pixel grid, record the 3D points, find the reprojected pixels, and compute the required post-rendering warp)
Summary of Techniques
Basic Approach
• At each projector
– calibrate camera pair
– project dots and find 3D display surface
– (find surface illuminated by neighboring projectors)
– find projector parameters
• Find intensity blending functions
• Register surface meshes (near alignment)
Summary of Techniques
Compensating the errors
• Unify meshes using a smooth 3D transition
– Use the 'camera overlap' as the transition region
• Re-compute all best-fit projector matrices
• Find the projective function
– Piecewise planar approximation using multiple triangles
• Specify all rendering tasks using a display list
Summary of Techniques
• At run time, at each projector
– Read the current user location
– Render the virtual object using two-pass rendering
– Apply the post-rendering 2D warp
– Apply intensity blending
Restricted cases
• Static User (Sweet spot)
– VSMM98
• Planar display surface
– Tech report
• Available at project web page
Features
• The task of each projector is independent
• The 2-pass rendering and post-rendering warp are pre-defined
– Can be specified using a display list
• Runs on traditional graphics hardware
Implementation
• Five single-lens 1024x768 projectors
• Multiple 640x480 cameras
• SGI InfiniteReality2 for each projector
• Matlab for calibration, OpenGL for rendering
Issues
• Resampling artifacts due to limited resolution
• Color matching
• Viewer-dependent parameters
• Limited depth-of-field
Acknowledgements
• Office of the Future group
• NSF STC
• NTII (Advanced Network Systems)
• Intel
Summary
• Multiple projectors to create panoramic imagery
• Geometric registration is major challenge
• Video cameras to create 3D representation
• Generalized display configuration
• www.cs.unc.edu/~stc