Inverse Rendering Methods for Hardware-Accelerated Display of Parameterized Image Spaces Ziyad S. Hakura.


Inverse Rendering Methods for
Hardware-Accelerated Display of
Parameterized Image Spaces
Ziyad S. Hakura
Real-Time Rendering of Parameterized Image Spaces
with Photorealistic Image Quality
•Object motion: animation time parameter
•Viewpoint position: viewpoint parameter along circle
•Cockpit lighting: day light vs. night sky
Interactive Toy Story
•Limited viewpoint motion
•Head motion parallax puts the user “in the scene”
•Character parameters, e.g. happiness/sadness
•Rose98, Gleicher98, Popović99
Parameterized Image Spaces
•Space can be 1D, 2D or more
•Content author specifies parameters
Object motion
Light motion
Viewpoint position
Interactive Motion
View
Light
An interactive user is free to move
anywhere in the parameter space
Ray Tracing vs. Z-Buffer Graphics Hardware
[Diagrams: ray tracing traces rays from the eye through the display into the scene; z-buffer graphics hardware rasterizes texture-mapped geometry to the display.]
Pixel Fill Rate vs. Time
[Chart: pixel fill rate growing from near zero toward 1E+9 pixels/sec between Apr-97 and Oct-00.]
Overall Model
PREPROCESS:
•Render ray-traced images offline
•Encode images in terms of 3D graphics primitives
RUN-TIME:
•Decode images using graphics hardware
Related Work
•Hardware Shading Models: Diefenbach96, Walter97, Ofek98, Udeshi99, Cabral99, Kautz99, Heidrich99
•Image-Based Rendering (IBR): Chen93, Levoy96, Gortler96, Debevec96, Shade98, Miller98, Debevec98, Bastos99, Heidrich99, Wood00
•Animation Compression: Guenter93, Levoy95, Agrawal95, Cohen-Or99
Contributions
•Inverse rendering method for inferring texture maps
•Hardware-accelerated decoding of compressed
parameterized image spaces
•Parameterized environment maps representation
for moving away from pre-rendered samples
•Hybrid rendering for refractive objects
Outline
•Motivation
•Texture Inference
•Parameterized Texture Compression
•Parameterized Environment Maps
•Hybrid Rendering
•Conclusion
Consider a Single Image
p2
p1
Parameterized
Image Space
Single Image
How do we represent the shading on each object?
Texture Mapping
3D Mesh + 2D Texture = 2D Image
Texture Inference by
Inverse Rendering
3D Mesh + 2D Ray-Traced Image = 2D Texture
Linear Hardware Model
A x = b
•A: hardware filter coefficients (hardware render: texture → screen)
•x: unknown texture pixels
•b: ray-traced image
Texture Inference
x = texture values
b = ray-traced image
minimize over x: |Ax – b|² + r(x), where r(x) is a regularization term
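The objective above can be sketched as a tiny regularized least-squares solve. A minimal plain-Python example with two unknown texels; the filter matrix A, the Tikhonov-style regularizer r(x) = λ‖x‖², and the pixel values are illustrative assumptions, not the thesis's actual hardware filter model:

```python
# Sketch of texture inference as regularized least squares:
# minimize |Ax - b|^2 + lam*|x|^2 via the normal equations
# (A^T A + lam I) x = A^T b, for just 2 unknown texels.

def solve_2x2(M, v):
    # Direct solve of a 2x2 linear system by Cramer's rule.
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(M[1][1] * v[0] - M[0][1] * v[1]) / det,
            (M[0][0] * v[1] - M[1][0] * v[0]) / det]

def infer_texture(A, b, lam=0.0):
    # Build and solve the normal equations.
    rows, n = len(A), 2
    AtA = [[sum(A[k][i] * A[k][j] for k in range(rows)) + (lam if i == j else 0.0)
            for j in range(n)] for i in range(n)]
    Atb = [sum(A[k][i] * b[k] for k in range(rows)) for i in range(n)]
    return solve_2x2(AtA, Atb)

# Hypothetical hardware filter: two screen pixels see one texel each,
# a third sees their bilinear average.
A = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
b = [1.0, 2.0, 1.5]          # "ray-traced" screen values
x = infer_texture(A, b)      # recovers texels [1.0, 2.0]
```

In the thesis setting A is large and sparse (each screen pixel touches only a few texels), so an iterative solver would replace the direct solve.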
Forward Mapping Method
[Diagram: texel grid forward-mapped onto screen pixels.]
Ray-Traced vs. Inverse Fitted (PSNR=41.8dB) vs. Forward Mapped (PSNR=35.4dB)
Parameterized Texture
Compression
p2
V
U
View
p1
Why compress textures
instead of images?
•Textures better capture coherence
•Independent of where in image object appears
•Object silhouettes correctly rendered from geometry
•Viewpoint can move away from original samples
•No geometric disocclusions
Laplacian Pyramid
level 0: 8x8 (64 images)
level 1: 4x4 (16 images)
level 2: 2x2 (4 images)
level 3: 1x1 (1 image)
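The pyramid's encode/decode can be sketched in one dimension of the parameter space. A minimal plain-Python version; the nearest-neighbor up/downsampling filters are an assumption for brevity (a real codec would use smoother filters), but the lossless analysis/synthesis structure is the point:

```python
def downsample(sig):
    # Halve resolution by averaging adjacent pairs.
    return [(sig[2 * i] + sig[2 * i + 1]) / 2.0 for i in range(len(sig) // 2)]

def upsample(sig):
    # Double resolution by sample replication.
    out = []
    for s in sig:
        out += [s, s]
    return out

def laplacian_pyramid(sig):
    # Each level stores the detail lost by downsampling; the coarsest
    # level stores the remaining base signal.
    levels = []
    while len(sig) > 1:
        low = downsample(sig)
        levels.append([a - b for a, b in zip(sig, upsample(low))])
        sig = low
    levels.append(sig)
    return levels

def reconstruct(levels):
    # Invert the pyramid: upsample and add details, coarse to fine.
    sig = levels[-1]
    for detail in reversed(levels[:-1]):
        sig = [a + b for a, b in zip(upsample(sig), detail)]
    return sig

sig = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
pyr = laplacian_pyramid(sig)       # 3 detail levels + 1 base level
out = reconstruct(pyr)             # == sig (lossless before quantization)
```

Compression comes from quantizing the detail levels, which are mostly near zero when neighboring parameter samples are coherent.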
Adaptive Pyramid
Like the Laplacian pyramid, but blocks may be non-square (e.g. 8x8, 8x2, 2x8, 4x4, 4x1, 1x4, 2x2, 1x1) across levels 0-3.
MPEG-Image, 355:1: PSNR=36.8dB
Laplacian-Texture, 379:1: PSNR=38.7dB
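The PSNR figures above can be computed as follows; a minimal sketch assuming 8-bit pixel data (peak value 255):

```python
import math

def psnr(reference, approx, peak=255.0):
    # Peak signal-to-noise ratio in dB between two equal-length pixel lists.
    mse = sum((a - b) ** 2 for a, b in zip(reference, approx)) / len(reference)
    return 10.0 * math.log10(peak * peak / mse)

# Example: a uniform error of 16 gray levels gives roughly 24 dB.
val = psnr([0.0, 0.0, 0.0], [16.0, 16.0, 16.0])
```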
Runtime System
•Decompresses texture images
•Caches uncompressed textures in memory
•Textures in top of pyramid likely to be re-used
•Generates rendering calls to graphics system
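The caching behavior described above can be sketched as a small LRU cache over decoded textures; the keys, capacity, and decode callback below are hypothetical stand-ins, not the thesis's actual runtime:

```python
from collections import OrderedDict

class TextureCache:
    """LRU cache of decompressed textures, keyed by texture-block id."""

    def __init__(self, capacity, decode):
        self.capacity = capacity
        self.decode = decode          # called to decompress on a miss
        self.cache = OrderedDict()    # insertion order tracks recency

    def get(self, key):
        if key in self.cache:
            self.cache.move_to_end(key)   # mark as most recently used
            return self.cache[key]
        tex = self.decode(key)            # miss: decompress
        self.cache[key] = tex
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict least recently used
        return tex

calls = []
def decode(key):
    calls.append(key)
    return "texture-%d" % key

cache = TextureCache(capacity=2, decode=decode)
for k in (1, 2, 1, 3, 2):
    cache.get(k)
# 1, 2, 3 miss; the second 1 hits; 2 is evicted by 3 and decoded again.
```

Textures near the top of the pyramid are requested for many parameter values, so they tend to stay resident under exactly this policy.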
How to handle reflective objects?
Problem: Movement away from pre-rendered
views gives a pasted-on look
Solution: Parameterized Environment Maps
Static Environment Maps (EMs)
[Diagram: reflection ray from the eye off surface normal N into the environment.]
Generated using standard techniques:
•Photograph a physical sphere in an environment
•Render six faces of a cube from object center
Problem with Static EM
Ray-Traced
Static EM
Self-reflections are missing
Parameterized
Environment Maps (PEM)
[One environment map per sampled viewpoint: EM1 through EM8]
Environment Map Geometry
[Diagram: the reflection ray from the eye intersects the EM geometry; the hit point maps to texture coordinates (u,v) in the EM texture.]
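The reflection-ray lookup starts from the standard mirror-reflection formula r = d − 2(d·n)n. A plain-Python sketch of just that step (the intersection with the EM geometry and the (u,v) mapping are omitted):

```python
def reflect(d, n):
    # Mirror-reflect view direction d about unit surface normal n:
    # r = d - 2 (d . n) n.
    dot = sum(a * b for a, b in zip(d, n))
    return [a - 2.0 * dot * b for a, b in zip(d, n)]

# Looking straight down the -z axis at a surface facing +z:
r = reflect([0.0, 0.0, -1.0], [0.0, 0.0, 1.0])   # bounces back along +z
```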
Why Parameterize
Environment Maps?
•Captures view-dependent shading in environment
•Accounts for geometric error due to approximation
of environment with simple geometry
Surface Light Fields [Miller98,Wood00]
Surface Light Field
Dense sampling over
surface points of
low-resolution lumispheres
PEM
Sparse sampling over
viewpoints of
high-resolution EMs
Layering of Parameterized
Environment Maps
distant EM
reflector
local EM
Segment environment into local and distant maps
•Allows different EM geometries in each layer
•Supports parallax between layers
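Combining the layers can be sketched as a standard alpha "over" of the local EM result on top of the distant one. A per-pixel sketch in plain Python; unpremultiplied alpha is an assumption here, and the Fresnel term from the segmented images is folded into the color for brevity:

```python
def composite_over(local_rgb, alpha, distant_rgb):
    # Local layer over distant layer, per channel, unpremultiplied alpha:
    # out = local * alpha + distant * (1 - alpha).
    return [l * alpha + d * (1.0 - alpha)
            for l, d in zip(local_rgb, distant_rgb)]

# Half-opaque red local layer over a blue distant layer:
out = composite_over([1.0, 0.0, 0.0], 0.5, [0.0, 0.0, 1.0])
```

Because the layers are composited at run time, the local and distant maps can use different EM geometries and shift against each other, which is what produces the parallax.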
Segmented, Ray-Traced Images
[Panels: Distant | Local Color | Local Alpha | Fresnel]
EMs are inferred for each layer separately.
Inferred EMs per Viewpoint
[Panels: Distant | Local Color | Local Alpha]
Experimental Setup
•1D view space
•1˚ separation between views
•100 sampled viewpoints
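Given the 1-D view space above (EMs sampled every degree), rendering an in-between viewpoint can be sketched as linearly blending the two nearest sampled EMs. A plain-Python sketch with hypothetical one-texel EMs; the blending scheme is an illustrative assumption:

```python
def sample_pem(ems, spacing_deg, angle_deg):
    # Linearly blend the two EM samples bracketing angle_deg.
    i = int(angle_deg // spacing_deg)
    t = (angle_deg - i * spacing_deg) / spacing_deg
    a = ems[i]
    b = ems[min(i + 1, len(ems) - 1)]
    return [(1.0 - t) * x + t * y for x, y in zip(a, b)]

# Three one-texel EMs sampled at 0, 1, and 2 degrees:
em = sample_pem([[0.0], [10.0], [20.0]], 1.0, 0.5)   # halfway: [5.0]
```

In hardware this blend amounts to rendering the reflector twice with the two bracketing EM textures and weighted alpha.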
Ray-Traced vs. PEM
Closely match local reflections like self-reflections
Movement Away from
Viewpoint Samples
[Comparison images: Ray-Traced | PEM]
Layered PEM vs.
Infinite Sphere PEM
[Diagram: the reflection ray from the eye off the reflector (normal N) intersects the local EM, then the distant EM.]
[Comparison images: Layered PEM | Infinite Sphere PEM]
How to handle refractive objects?
Problem: The outgoing ray direction is hard to predict from the first surface intersection.
[Diagram: refractive path from the eye through the object to the outgoing ray.]
Solution: Hybrid Rendering
Hybrid Rendering
[Combines texture mapping, graphics hardware, and ray tracing.]
Hybrid Rendering
•Greedy Ray Path Shading Model
•Adaptive Tessellation
•Layered, Parameterized Environment Maps
Greedy Ray Path Shading Model
[Diagram: reflective path and refractive path through the refractive object.]
Trace two ray paths until rays exit the refractive object.
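The refractive leg of each greedy path bends by Snell's law at every interface. A minimal plain-Python `refract` (unit vectors are assumed, with the normal facing against the incoming ray; eta is the ratio of indices of refraction n1/n2):

```python
import math

def refract(d, n, eta):
    # Refract unit direction d through a surface with unit normal n;
    # returns None on total internal reflection.
    cos_i = -sum(a * b for a, b in zip(d, n))
    k = 1.0 - eta * eta * (1.0 - cos_i * cos_i)
    if k < 0.0:
        return None                       # total internal reflection
    scale = eta * cos_i - math.sqrt(k)
    return [eta * a + scale * b for a, b in zip(d, n)]

# Matched indices (eta = 1): the ray passes straight through.
t = refract([0.0, 0.0, -1.0], [0.0, 0.0, 1.0], 1.0)

# Steep exit from glass to air (eta = 1.5): total internal reflection.
tir = refract([0.8, 0.0, -0.6], [0.0, 0.0, 1.0], 1.5)   # None
```

The total-internal-reflection case is one reason the outgoing direction is hard to predict from the first intersection alone, motivating the two-term greedy trace.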
Comparison of Shading Models
Full ray tree vs. two-term greedy ray path
Adaptive Tessellation
•Two criteria:
•Ray path “topology”
•Outgoing ray distance
•Consider both terms of shading model
Layered EMs
[Local refractive object surrounded by layered EM shells; EM1 through EM8 inferred per viewpoint.]
Inferred Environment Maps
[Layers L1-L3, shown separately for the reflection term and the refraction term.]
Ray-Traced vs. Hybrid
Ray-Traced
480 sec/frame
Hybrid Rendered
19 sec/frame
Benefit of Hybrid Rendering
over Ray-Tracing
•Lower cost
 •Adaptive ray-tracing algorithm
•Lower cost and higher predictability
 •Greedy two-term shading model
 •Substitute environment with layered shells
Conclusion
•Provide photorealistic rendering of parameterized
image spaces
•Texture inference by Inverse Rendering
•Parameterized Texture Compression
•Parameterized Environment Maps
•Hybrid Rendering
Recommendations for
Graphics Hardware
•Decompression of textures in hardware
•Compression algorithms
•Decoding from parameter-dependent texture blocks
•More dynamic range in texture pixels
•Ray-tracing for local models
Future Work
•More sophisticated models for hardware rendering
•e.g. fitting area light sources
•Effect of hybrid rendering on compression
•More efficient pre-rendering of ray-traced images
•Multi-dimensional Ray-Tracing
•Higher dimensions
Acknowledgements
•Bernard Widrow
•John Snyder
•Anoop Gupta
•Pat Hanrahan
•Jed Lengyel, Turner Whitted, and others at Microsoft Research
•Graphics Friends at Stanford
•Administrators: John Gerth, Charles Orgish, Kevin Colton, Darlene Hadding, Ada Glucksman, Heather Gentner
•Friends: Ulrich Stern, Ravi Soundararajan, Kanna Shimizu, Gaurishankar Govindaraju, Luke Chang, Johannes Helander
•Mother and Sisters Dima and Dalia
•Father
END