Creating Adaptive Views for Group Video Teleconferencing – An Image-Based Approach Ruigang Yang Andrew Nashel Anselmo Lastra Celso Kurashima Herman Towles Henry Fuchs University of North Carolina at Chapel Hill.


Creating Adaptive Views for Group
Video Teleconferencing
– An Image-Based Approach
Ruigang Yang
Andrew Nashel
Anselmo Lastra
Celso Kurashima
Herman Towles
Henry Fuchs
University of North Carolina
at Chapel Hill
Current Teleconferencing
[Diagram: capture, transport, and display stages of a current teleconferencing pipeline, with a question mark over the result]
International Workshop on Immersive Telepresence 2002
Slide 2
The Office of the Future
Slide 3
Group Teleconferencing
[Figure: room layout with an array of cameras]

• Multiple persons (3-4) at each site
• Life-size, monoscopic display
• High-resolution seamless imagery
• Active view control
Slide 4
Active View Control
Provide the best approximate view for the group
Slide 5
Active View Control
A view synthesis problem
• Geometry-based method: extract 3D geometry from a few cameras
  – Less expensive
  – Hard to get good results
• Image-based method: capture many images
  – Looks good on every scene
  – Needs many images
Slide 6
Our Image-based Approach
• Observation: eye level remains relatively constant during a conference session
• A compact light field representation
  – Parameterized by a 3D function (s, u, v)

[Diagram: two-plane light field parameterization — camera axis s (the vertical axis t is dropped) and focal plane coordinates (u, v)]
Slide 7
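The (s, u, v) lookup can be sketched as a small routine: intersect a desired viewing ray with the camera line and with the focal plane to get its light-field coordinates. This is a minimal sketch assuming axis-aligned planes at fixed z depths; the function and parameter names are illustrative, not from the paper.

```python
def ray_to_suv(origin, direction, cam_line_z=0.0, focal_z=1.0):
    """Map a viewing ray to light-field coordinates (s, u, v).

    s is where the ray crosses the camera line (the vertical
    coordinate t is dropped -- eye level is assumed fixed), and
    (u, v) is where it crosses the focal plane. Assumes the ray
    is not parallel to either plane (direction[2] != 0)."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    # Ray parameter at the camera-line plane; s is the x coordinate there.
    tc = (cam_line_z - oz) / dz
    s = ox + tc * dx
    # Ray parameter at the focal plane; (u, v) are the x, y coordinates there.
    tf = (focal_z - oz) / dz
    u = ox + tf * dx
    v = oy + tf * dy
    return s, u, v
```

A ray looking straight ahead from (0.5, 0.2, -1.0) yields s = 0.5 and (u, v) = (0.5, 0.2); a slanted ray picks up the expected parallax between the two planes.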
Linear Light Field
Slide 8
LLF Rendering
• Projective texture mapping and blending
  – Tessellate the focal plane
  – Project input images onto the focal plane
  – View-dependent blending

[Diagram: a new view synthesized by projecting base images onto the focal plane]
Slide 9
Blending Function
Each camera C_i receives a Gaussian weight based on the angle \theta_i between the desired viewing ray V and the ray to C_i, normalized over all N cameras:

\hat{\alpha}_i = \exp\left( -\frac{\theta_i^2}{2\sigma^2} \right), \qquad \alpha_i = \frac{\hat{\alpha}_i}{\sum_j \hat{\alpha}_j}

[Diagram: cameras C_0 ... C_{N-1} along a line; the desired ray V crosses the focal plane at depth D, and \theta_i is the angle between V and the ray from the crossing point to camera C_i]
Slide 10
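The blending function above can be computed directly. A minimal sketch; the default sigma value is an arbitrary choice for illustration, not a value from the paper.

```python
import math

def blend_weights(angles, sigma=0.2):
    """View-dependent blending weights.

    Each camera C_i gets a Gaussian weight from the angle theta_i
    (in radians) between the desired viewing ray and the ray to C_i,
    then the weights are normalized to sum to 1."""
    raw = [math.exp(-(theta * theta) / (2.0 * sigma * sigma)) for theta in angles]
    total = sum(raw)
    return [w / total for w in raw]
```

For three cameras at angles 0.0, 0.2, and 0.4 radians from the desired ray, the nearest camera dominates and the weights fall off smoothly, which avoids popping as the desired view moves between cameras.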
Sample Images

[Images: rendered views under perspective projection and under orthogonal projection (extreme case)]
Slide 11
Sampling Analysis
• Configuration parameters
  – Focal plane depth D
  – Camera’s FOV
  – Camera’s horizontal resolution W
  – Inter-camera distance d
• Error term: pixel drift (e)

Given the configuration parameters and a desired error tolerance e, what is the maximum depth deviation ΔD from the optimal depth D?
Slide 12
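The question above can be made concrete with a simple pinhole-camera model: the drift between adjacent cameras for a point off the focal plane is proportional to the baseline times the change in inverse depth. This is an assumed plenoptic-sampling-style model for illustration, not necessarily the paper's exact derivation, and all function names here are hypothetical.

```python
import math

def pixel_drift(d, D, delta_D, fov_deg, W):
    """Estimated pixel drift e for a point delta_D away from the
    focal plane at depth D, seen by cameras spaced d apart with
    horizontal resolution W pixels and the given field of view.
    (Assumed model: drift = focal_length_px * baseline * |1/D - 1/(D+dD)|.)"""
    f_pix = (W / 2.0) / math.tan(math.radians(fov_deg) / 2.0)
    return f_pix * d * abs(1.0 / D - 1.0 / (D + delta_D))

def max_depth_deviation(d, D, fov_deg, W, e):
    """Invert the model: largest delta_D behind the focal plane whose
    drift stays within the tolerance e."""
    f_pix = (W / 2.0) / math.tan(math.radians(fov_deg) / 2.0)
    k = e / (f_pix * d)          # allowed change in inverse depth
    if k >= 1.0 / D:
        return float('inf')     # every depth beyond D stays within tolerance
    return 1.0 / (1.0 / D - k) - D
```

Inverting and re-applying the model round-trips: a scene surface at the computed maximum deviation produces exactly the tolerated drift e.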
Sampling Analysis – Result
Slide 13
More results
• Distributed system
  – 11 cameras
  – 6 capture PCs (640×480)
    • ROI encoded
    • JPEG compression
  – One rendering PC
    • Roughly 1000×480 output
    • 4-7 frames per second
Slide 14
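The ROI encoding step can be illustrated by locating the region that changed between consecutive frames, so that only that region needs to be JPEG-compressed and transmitted. A hypothetical sketch under simple assumptions (grayscale frames as lists of rows, a fixed change threshold); the slides do not specify the actual encoder.

```python
def roi_bbox(prev, curr, thresh=16):
    """Bounding box of pixels that changed between two grayscale
    frames (lists of rows of 0-255 values), as (x0, y0, x1, y1)
    inclusive, or None if nothing changed beyond the threshold."""
    rows = [y for y, (p, c) in enumerate(zip(prev, curr))
            if any(abs(a - b) > thresh for a, b in zip(p, c))]
    if not rows:
        return None
    cols = [x for x in range(len(curr[0]))
            if any(abs(prev[y][x] - curr[y][x]) > thresh for y in rows)]
    return (min(cols), min(rows), max(cols), max(rows))
```

In a teleconferencing scene most of the frame is static background, so compressing only this box keeps the per-camera bandwidth low enough for several streams per capture PC.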
Conclusions
We presented a novel system designed specifically for group video teleconferencing.
• Best approximate view for the group
• Photo-realistic results at interactive rates
• Flexible and scalable
Slide 15
Acknowledgements
Funding support from
The Department of Energy's ASCI VIEWS Program
Sandia National Laboratories USA
Collaborators from Sandia
Phil Heermann
Christine Yang
Corbin Stewart
Slide 16
The End
Thank You
Slide 17