
Interruptible Rendering
David Luebke
University of Virginia
But first…

• Cat
  – More evil use of computer graphics
• Then, to cleanse the palate…
  – Work in Progress
  – Industrial Light + Magic, 1999
  – After You

Interruptible Rendering
David Luebke
Department of Computer Science
University of Virginia
Cliff Woolley
Ben Watson
Abhinav Dayal
Presented at ACM SIGGRAPH 2003
Symposium on Interactive 3D Graphics
Problem: fidelity vs. performance
• Interactive graphics requires trading off detail vs. frame rate
• Conventional wisdom dictates:
  – Maintain a high and constant frame rate
  – Thus, use a level of detail that can always be rendered in time
• This is simplistic! Can we do better?
Improve on traditional LOD
• Is a high, constant frame rate good enough?
• Is fidelity ever more important than frame rate?
• How can this be decided at runtime while still guaranteeing interactivity?
Key observation
• The fidelity/performance tradeoff can be seen in terms of spatial versus temporal error…
  …and these should be measured and compared directly!
Unified spatial/temporal error
• We combine temporal error (“lateness”) and spatial error (“coarseness”) into a unified dynamic visual error metric
• We use this to drive progressive refinement
Progressive refinement
• The big question:
  At what point does further refinement of the current frame become pointless?
• Answer: when temporal error exceeds spatial error
Methodology
• Refine a stream of continuous LODs
• Monitor input frequently
• Minimize dynamic visual error
Refine a stream of continuous LODs
• Render progressive refinements on top of each other until “out of time”
• Ensure that we can stop refining at any time and move on to the next frame
Monitor input frequently
• Ideally: input monitored continuously
• Realistically: check every x ms
  – Allows quick reaction when sudden changes in input occur
  – Allows system to be self-tuning
Minimize dynamic visual error
• Always display image with least error
• Sometimes:
  – Further refinement is pointless
    – Temporal error (lateness) exceeds spatial error (coarseness) for the current frame
  – The front buffer is “out of date”
    – Dynamic visual error (spatial + temporal) in front is greater than in the back
[Flowchart: the interruptible rendering loop over time, divided into a “Rendering to Back Buffer” phase and a “Rendering to Front Buffer” phase. Steps and tests include: clear the back buffer, refine the back buffer, test t_b > s_b, compute dynamic visual error, test e_f >= e_b, swap buffers, refine the front buffer, test t_f > s_f, swap buffers, clear the front buffer. A hedged pseudocode sketch of the back-buffer path follows.]
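The sketch below assumes stub renderer and error functions (refine_step, spatial_error, temporal_error are placeholders, not the authors' API) and omits the flowchart's front-buffer rendering phase; it only illustrates "refine until lateness exceeds coarseness, then show whichever buffer has less dynamic visual error."

# Hedged sketch of the interruptible rendering loop's back-buffer path.
# refine_step / spatial_error / temporal_error are stand-in stubs, not the
# authors' API; the flowchart's front-buffer rendering phase is omitted.

import time

def refine_step(buf):                     # stub: one increment of progressive refinement
    buf["level"] += 1

def spatial_error(buf):                   # stub: "coarseness" shrinks as refinement proceeds
    return 100.0 / (1 + buf["level"])

def temporal_error(buf, now):             # stub: "lateness" grows since the buffer's frame began
    return 50.0 * (now - buf["start"])

def dynamic_visual_error(buf, now):       # unified metric: spatial + temporal error
    return spatial_error(buf) + temporal_error(buf, now)

def interruptible_render(run_for=0.2):
    front = {"level": 0, "start": time.time()}      # what the user currently sees
    deadline = time.time() + run_for
    while time.time() < deadline:
        back = {"level": 0, "start": time.time()}   # new frame, rendered with fresh input
        # Refine until further refinement is pointless: lateness exceeds coarseness.
        while temporal_error(back, time.time()) <= spatial_error(back):
            refine_step(back)                       # input would be re-polled here (every few ms)
        # Display whichever buffer has the smaller dynamic visual error.
        now = time.time()
        if dynamic_visual_error(front, now) >= dynamic_visual_error(back, now):
            front, back = back, front               # swap buffers
    return front

if __name__ == "__main__":
    print("front buffer refined to level", interruptible_render()["level"])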
Refinement
• Need a progressive rendering scheme
• Subtlety: in depth-buffered rendering, you can’t “unrender” pixels from a coarse model
• Thus, for depth-buffered rendering, the refinement scheme should satisfy containment
Refinement
• Three refinement schemes implemented:
  – Splatting (tech sketch, SIGGRAPH 2002)
  – Progressive polygonal hulls
  – Progressive ray casting
Progressive hulls
• Simplification method by [Sander et al. 2000]
• Record a sequence of constrained edge collapses and play it back in reverse (see the playback sketch after the figure)
• Guarantees containment
[Figure: an edge collapse among vertices V1, V2, V3, V4, …, Vn, illustrating the containment constraint]
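A hedged sketch of the reverse playback: refinement undoes recorded edge collapses in reverse order (vertex splits). The record layout below is a simplification of my own for illustration, not Sander et al.'s data structure; containment is guaranteed by how the collapses are constrained during simplification, not by this playback code.

# Hedged sketch: refining a progressive hull by undoing recorded edge
# collapses in reverse order ("vertex splits"). The record layout is a
# simplification for illustration, not Sander et al.'s data structure.

from dataclasses import dataclass, field

@dataclass
class CollapseRecord:
    removed_vertex: tuple                  # position of the vertex the collapse deleted
    removed_faces: list                    # faces (vertex-index triples) the collapse deleted
    changed_faces_before: dict = field(default_factory=dict)  # face index -> pre-collapse triple

@dataclass
class ProgressiveHull:
    vertices: list                         # coarse hull vertex positions
    faces: list                            # coarse hull faces (vertex-index triples)
    collapse_stack: list                   # recorded collapses, most recent last

    def refine_one(self):
        """Undo the most recent collapse; returns False once fully refined."""
        if not self.collapse_stack:
            return False
        rec = self.collapse_stack.pop()
        # Assumes the recorded face indices refer to the position this append creates.
        self.vertices.append(rec.removed_vertex)
        for idx, old_face in rec.changed_faces_before.items():
            self.faces[idx] = old_face          # restore faces that had been re-targeted
        self.faces.extend(rec.removed_faces)    # restore faces that had been deleted
        return True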
Calculating dynamic visual error
• We use a simple screen-space metric (see the sketch below)
• Spatial error:
  – Maximum projected error of visible geometry
• Temporal error:
  – Project points on object and/or bounding volume
  – Find the maximum screen-space distance any point has moved since frame began
• More elaborate/correct metrics possible!
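A rough sketch of this screen-space metric, assuming a pinhole projection and a bounding box as the projected point set; the camera model and helper names are illustrative, not the paper's exact formulation.

# Rough sketch of the screen-space error metric described above, assuming a
# pinhole projection and a bounding box as the projected point set.

import numpy as np

def project(points, view, focal, viewport):
    """Project world-space points to pixel offsets with a simple pinhole model."""
    cam = (view @ np.c_[points, np.ones(len(points))].T).T    # world -> camera space
    z = np.maximum(cam[:, 2], 1e-6)                           # guard against divide-by-zero
    return focal * cam[:, :2] / z[:, None] * viewport / 2.0

def spatial_error_px(object_error_world, distance, focal, viewport):
    """Projected size, in pixels, of the model's current geometric error."""
    return focal * object_error_world / max(distance, 1e-6) * viewport / 2.0

def temporal_error_px(bbox_corners, view_at_frame_start, view_now, focal, viewport):
    """Maximum screen-space distance any projected bounding-box corner has
    moved since the frame began (the image's 'lateness')."""
    before = project(bbox_corners, view_at_frame_start, focal, viewport)
    after = project(bbox_corners, view_now, focal, viewport)
    return float(np.linalg.norm(after - before, axis=1).max())

if __name__ == "__main__":
    corners = np.array([[x, y, z] for x in (-1, 1) for y in (-1, 1) for z in (4, 6)], float)
    view0 = np.eye(4)
    view1 = np.eye(4); view1[0, 3] = 0.1     # camera moved slightly since the frame began
    print("spatial error (px): ", spatial_error_px(0.02, 5.0, 1.0, 512))
    print("temporal error (px):", temporal_error_px(corners, view0, view1, 1.0, 512))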
Progressive ray casting
• Progressive refinement
  – Coarse to fine sampling/reconstruction
• Sampling
  – Where to shoot the ray
  – Adaptive or non-adaptive
• Reconstruction
  – Producing imagery from samples
Sampling
• Coarse to fine – quadtree approach (see the sketch below)
  – Sample every quadtree node’s center
  – Breadth-first traversal
  – (Predetermined) random traversal per level
• Ray casting – using OpenRT
  – Per-ray API: ~600,000 rays per sec
  – Simple shading, shadows, and specular highlights
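A small sketch of the coarse-to-fine sampling order: breadth-first over quadtree levels, with a predetermined (seeded) random order of node centers within each level. Shooting the rays through OpenRT is not reproduced; the generator only yields pixel positions, and all names are illustrative.

# Sketch of the coarse-to-fine sampling order described above.
# Shooting the rays (OpenRT) is not reproduced here.

import random

def quadtree_sample_order(width, height, max_level, seed=0):
    rng = random.Random(seed)                 # "predetermined" random traversal per level
    for level in range(max_level + 1):        # breadth first: coarse levels before fine ones
        n = 2 ** level                        # the level is an n x n grid of nodes
        cells = [(i, j) for i in range(n) for j in range(n)]
        rng.shuffle(cells)
        for i, j in cells:
            cx = (i + 0.5) * width / n        # node center, in pixels
            cy = (j + 0.5) * height / n
            yield level, (cx, cy)

if __name__ == "__main__":
    for level, center in list(quadtree_sample_order(512, 512, 2))[:8]:
        print(level, center)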
(Simplistic) reconstruction
• Placement
  – Place a splat at the center of each quadtree node
• Shading
  – Flat-shaded quad covering the node’s screen space
  – Alpha-textured quads (smooth reconstruction; see the sketch below)
    – Quad size = twice the node’s screen space
    – Texture: Gaussian blob, opaque at the center and transparent at the edges
[Images: flat-shaded vs. alpha-textured reconstruction]
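A CPU stand-in sketch of the alpha-textured reconstruction: each sample is splatted as a Gaussian blob spanning twice the node's screen-space extent, opaque at the center and fading out at the edges. The real system draws textured quads on the GPU; the Gaussian width here is an assumption.

# CPU stand-in sketch of the alpha-textured reconstruction described above.
# The real system draws textured quads on the GPU; sigma is an assumption.

import numpy as np

def splat(image, center, node_size_px, color):
    h, w, _ = image.shape
    radius = float(node_size_px)              # quad spans twice the node size -> radius = node size
    x0, x1 = int(center[0] - radius), int(center[0] + radius) + 1
    y0, y1 = int(center[1] - radius), int(center[1] + radius) + 1
    xa, xb = max(x0, 0), min(x1, w)
    ya, yb = max(y0, 0), min(y1, h)
    ys, xs = np.mgrid[ya:yb, xa:xb]
    d2 = (xs - center[0]) ** 2 + (ys - center[1]) ** 2
    alpha = np.exp(-d2 / (2 * (radius / 2.0) ** 2))          # Gaussian falloff
    alpha = np.where(d2 <= radius ** 2, alpha, 0.0)[..., None]
    image[ya:yb, xa:xb] = alpha * np.asarray(color) + (1 - alpha) * image[ya:yb, xa:xb]

if __name__ == "__main__":
    img = np.zeros((64, 64, 3))
    splat(img, (32.0, 32.0), 8, (1.0, 0.5, 0.2))
    print("peak pixel value:", img.max())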
Calculating dynamic visual error
• Temporal error
  – Computed as for the polygonal hulls
• Spatial error
  – Diagonal length of the largest quadtree node displayed on the screen
[Figure: spatial error illustrated as the size of the largest displayed quadtree node]
Demo
Evaluation: the “Gold Standard”
• Compares an ideal rendering to interactive approximations
• Ideal rendering: full detail, zero delay
• Interactive approximations are:
  – Unmanaged
  – Constant fidelity (in pixels)
  – Constant frame rate (fixed Hz)
  – Interruptible rendering
Frame generation
• Record what the user sees and when
• Generate each approximation offline
• Record actual frames displayed over time
• Account for:
  – Render time
  – Double buffering
  – Frame locking
  – Rendering into front buffer (interruptible)
Comparing frames
Diff_t = Ideal_t - Rendered_t
• Error measures (see the sketch below):
  – RMS: root mean square error
  – Lindstrom’s perceptually based error
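A minimal sketch of the per-frame comparison, summarized as RMS error over pixels; Lindstrom's perceptually based metric (ltdiff) is not reproduced.

# Minimal sketch of the per-frame comparison: Diff_t = Ideal_t - Rendered_t,
# summarized as root-mean-square error over all pixels and channels.

import numpy as np

def rms_error(ideal_frame, rendered_frame):
    diff = ideal_frame.astype(np.float64) - rendered_frame.astype(np.float64)
    return float(np.sqrt(np.mean(diff ** 2)))

if __name__ == "__main__":
    ideal = np.zeros((4, 4, 3))
    rendered = np.full((4, 4, 3), 0.1)
    print(rms_error(ideal, rendered))     # 0.1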
Two test input streams
• Rotation
  – Model-centered
  – Fixed angular velocity
• User interaction
  – Includes both view translation and rotation
  – Includes both static and dynamic segments
• Both input streams recorded to files, with timestamps
Interaction sequence: ray caster
[Figure: frames 33, 36, 39, 42, 45, 48, 51, 54, 57, and 60 of the interaction sequence, comparing the Ideal, Interruptible, Constant Fidelity, and Unmanaged renderings. A video accompanies this slide.]
[Plots: RMS error vs. time (secs) for the rotation and interactive input streams, for both progressive hulls and ray casting. Each plot compares constant frame rate, constant fidelity, unmanaged, and interruptible rendering.]
Benefits
• Accuracy
  – Balances spatial and temporal error
• Interactivity
  – Even w/ slow renderers like our ray caster
  – Or large models
• Self-tuning system
  – Adapts to hardware
Limitations
• Overdraw in the progressive renderer
  – The progressive ray caster does better here
• Cost of monitoring temporal error
• Rendering immersive models
  – Requires reversing the containment criterion for polygon rendering
  – Not a problem for the ray caster
Ongoing and future work
• Improve implementation:
  – Textures, normal maps, etc.
  – Reducing overdraw
  – Parallelization of ray tracing system (Ben & Joe)
  – Adaptive (view-dependent) refinement (Ben & Bob)
  – Ray tracing seems like the way to go…
• Better estimates of temporal & spatial error
  – Use color comparisons, e.g. to an up-to-date but low-resolution ray-traced image
  – Use contrast sensitivity function or related ideas
To think about
• Ultimately this work is about:
  – A principled approach to the fidelity vs. performance tradeoff
  – New ideas on temporal sampling & error
• Can we take these ideas further?
  – Frames and framebuffers are overrated
  – Can we decouple temporal & spatial sampling?
  – When and where to sample?
  – How to reconstruct?
Next steps
• Motivation: interactive ray tracing
  – From bad joke to old hat in a couple of years
• Lots of work on “how”
  – Supercomputers [Parker et al. 1999]
  – PCs and PC clusters [Wald & Slusallek 2000]
  – GPUs [Purcell et al. 2002, Carr & Hart 2002]
• We are interested in “why”
Interactive ray tracing
• The big question:
  What can you do with an interactive ray-based renderer that you can’t do with a rasterizer?
• Can vary sampling rate across the image
  – Focus on sampling edges, high-frequency regions
  – Exploit eccentricity, velocity, etc.
Interactive ray tracing
• The big question:
  What can you do with an interactive ray-based renderer that you can’t do with a rasterizer?
• Can vary sampling rate across time
  – Sample more frequently when things are changing
  – Sample more frequently where things are changing
Rethinking rendering
• Goal: rethink spatial & temporal strategies for interactive sampling & reconstruction
• Related work:
  – Frameless rendering
  – Just-in-time pixels
  – Sample reuse
  – Sparse sample reconstruction
Nuts and bolts
• What we’re doing: spatio-temporally adaptive sampling
  – Update samples with higher priority in regions of higher variance
  – Spatial variance: edges
  – Temporal variance: motion
• Reconstruction of resulting samples (see the sketch below)
  – The “deep buffer” stores samples in time & space
  – Reconstruct image at front edge of time: apply filter kernel with varying width in space and time
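A hedged sketch of reconstruction from a "deep buffer": samples carry screen positions and timestamps, and each pixel is a weighted average whose kernel falls off with both screen-space distance and sample age. The Gaussian kernel and its widths are assumptions, not the authors' design.

# Hedged sketch of reconstruction from a "deep buffer" of samples in space
# and time. The Gaussian kernel and its widths are assumptions.

import numpy as np
from dataclasses import dataclass

@dataclass
class Sample:
    x: float          # screen position in pixels
    y: float
    t: float          # time the sample was taken, in seconds
    color: tuple      # RGB

def reconstruct(samples, width, height, now, sigma_space=2.0, sigma_time=0.1):
    image = np.zeros((height, width, 3))
    weight = np.zeros((height, width, 1))
    r = int(3 * sigma_space)                      # only touch pixels near each sample
    for s in samples:
        age = max(now - s.t, 0.0)
        x0, x1 = max(int(s.x) - r, 0), min(int(s.x) + r + 1, width)
        y0, y1 = max(int(s.y) - r, 0), min(int(s.y) + r + 1, height)
        ys, xs = np.mgrid[y0:y1, x0:x1]
        d2 = (xs - s.x) ** 2 + (ys - s.y) ** 2
        # Weight falls off with screen-space distance and with sample age.
        w = np.exp(-d2 / (2 * sigma_space ** 2)) * np.exp(-(age / sigma_time) ** 2)
        image[y0:y1, x0:x1] += w[..., None] * np.asarray(s.color, dtype=float)
        weight[y0:y1, x0:x1] += w[..., None]
    return image / np.maximum(weight, 1e-9)

if __name__ == "__main__":
    samples = [Sample(8, 8, 0.00, (1, 0, 0)), Sample(8, 8, 0.05, (0, 1, 0))]
    img = reconstruct(samples, 16, 16, now=0.06)
    print(img[8, 8])    # the newer (green) sample gets more weight than the older (red) one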
Questions going forward
• How best to vary sampling rate to respond to variance?
  – What exactly do we mean by “variance”?
• How best to generate an image from non-uniform samples when some are more “stale” than others?
  – What shape should the kernel be, especially in time?
• Can we simulate frameless display hardware?
Acknowledgements
• Peter Lindstrom for ltdiff
• OpenRT Interactive Raytracing Project
• Stanford 3D Scanning Repository
• National Science Foundation
  – Awards 0092973, 0093172, and 0112937