Optical Flow Methods - University of Delaware


Optical Flow Methods
CISC 489/689
Spring 2009
University of Delaware
Outline
• Review of Optical Flow Constraint, Lucas-Kanade, Horn and Schunck Methods
• Lucas-Kanade Meets Horn and Schunck
• 3D Regularization
• Techniques for solving optical flow
• Confidence Measures in Optical Flow
Optical Flow Constraint
A point at (x, y) in the frame f(x, y, t) moves to (x + u\Delta t, y + v\Delta t) in the frame f(x, y, t + \Delta t).

Brightness constancy:
  f(x + u\Delta t, y + v\Delta t, t + \Delta t) = f(x, y, t)

First-order Taylor expansion of the left-hand side:
  f(x, y, t) + \frac{\partial f}{\partial x} u\Delta t + \frac{\partial f}{\partial y} v\Delta t + \frac{\partial f}{\partial t} \Delta t \approx f(x, y, t)

Dividing by \Delta t and taking the limit \Delta t \to 0:
  \frac{\partial f}{\partial x} \frac{dx}{dt} + \frac{\partial f}{\partial y} \frac{dy}{dt} + \frac{\partial f}{\partial t} = 0

  f_x u + f_y v + f_t = 0
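The constraint is built from image derivatives. A minimal Python/NumPy sketch (not from the slides; the finite-difference stencils and the function name are choices made here) of estimating f_x, f_y, f_t from two consecutive grayscale frames:

```python
import numpy as np

def spatiotemporal_derivatives(frame1, frame2):
    """Estimate f_x, f_y, f_t from two consecutive grayscale frames.

    frame1, frame2: 2D float arrays holding f(., ., t) and f(., ., t + 1).
    Spatial derivatives use central differences averaged over both frames;
    the temporal derivative is a simple forward difference.
    """
    f = frame1.astype(np.float64)
    g = frame2.astype(np.float64)
    fx = 0.5 * (np.gradient(f, axis=1) + np.gradient(g, axis=1))
    fy = 0.5 * (np.gradient(f, axis=0) + np.gradient(g, axis=0))
    ft = g - f
    return fx, fy, ft
```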
Interpretation
  f_x u + f_y v + f_t = 0

In the (u, v) plane this is a constraint line with normal \nabla f = (f_x, f_y)^T:

  \nabla f^T (u, v)^T = -f_t

or, in vector form,

  (f_x, f_y, f_t) (u, v, 1)^T = 0

[Figure: the constraint line in the (u, v) plane]
Lucas-Kanade Method
With \nabla_3 f = (f_x, f_y, f_t)^T, define the motion tensor

        | f_x^2    f_x f_y  f_x f_t |
  J  =  | f_x f_y  f_y^2    f_y f_t |  =  \nabla_3 f \, \nabla_3 f^T
        | f_x f_t  f_y f_t  f_t^2   |

Lucas-Kanade minimizes the locally windowed quadratic form

  E_LK(u, v) = (u, v, 1) (K_\rho * J) (u, v, 1)^T

where K_\rho * J denotes componentwise convolution of the tensor entries with a Gaussian K_\rho.
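A minimal sketch of the resulting local solve, assuming the derivative arrays from the previous sketch: the tensor entries are smoothed with a Gaussian K_\rho (scipy.ndimage.gaussian_filter) and a 2x2 linear system is solved at each pixel. This is an illustrative implementation, not the authors' code; pixels where the system is nearly singular are left at zero, which is one reason the method cannot produce flow everywhere.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def lucas_kanade(fx, fy, ft, rho=7.5, eps=1e-6):
    """Solve the windowed 2x2 Lucas-Kanade system at every pixel.

    fx, fy, ft: spatiotemporal derivatives (2D arrays).
    rho: standard deviation of the Gaussian window K_rho.
    Returns (u, v); ill-conditioned pixels are left at zero.
    """
    # Gaussian-smoothed entries of the motion tensor K_rho * J.
    jxx = gaussian_filter(fx * fx, rho)
    jxy = gaussian_filter(fx * fy, rho)
    jyy = gaussian_filter(fy * fy, rho)
    jxt = gaussian_filter(fx * ft, rho)
    jyt = gaussian_filter(fy * ft, rho)

    det = jxx * jyy - jxy * jxy
    good = np.abs(det) > eps            # pixels where the 2x2 system is solvable
    u = np.zeros_like(fx)
    v = np.zeros_like(fx)
    # Cramer's rule for [jxx jxy; jxy jyy] (u, v)^T = -(jxt, jyt)^T
    u[good] = (-jyy[good] * jxt[good] + jxy[good] * jyt[good]) / det[good]
    v[good] = ( jxy[good] * jxt[good] - jxx[good] * jyt[good]) / det[good]
    return u, v
```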
Lucas-Kanade Method
• Local method, window based
• Cannot solve for optical flow everywhere
• Robust to noise

[Figure: Lucas-Kanade flow fields for two window-parameter settings (7.5 and 15)]
Figures from Lucas/Kanade Meets Horn/Schunck: Combining Local and Global Optic Flow Methods, Andrés Bruhn and Joachim Weickert, 2005
Dense Optical Flow
Minimize E(u, v) = \int (f_x u + f_y v + f_t)^2 \, dx \, dy ?

Lacks smoothness.
Figures from Lucas/Kanade Meets Horn/Schunck: Combining Local and Global Optic Flow Methods, Andrés Bruhn and Joachim Weickert, 2005
Horn and Schunck Method
  E_HS(u, v) = \int (f_x u + f_y v + f_t)^2 + \alpha (|\nabla u|^2 + |\nabla v|^2) \, dx \, dy

Euler-Lagrange equations:
  f_x (f_x u + f_y v + f_t) - \alpha \Delta u = 0
  f_y (f_x u + f_y v + f_t) - \alpha \Delta v = 0
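A sketch of the classical fixed-point iteration that follows from these equations when the Laplacian is approximated by a weighted neighbourhood average minus the centre value (constants absorbed into alpha; Horn and Schunck's original paper writes the weight as alpha squared). Parameter values are illustrative.

```python
import numpy as np
from scipy.ndimage import convolve

def horn_schunck(fx, fy, ft, alpha=100.0, n_iter=200):
    """Jacobi-style fixed-point iteration for the Horn-Schunck equations."""
    u = np.zeros_like(fx)
    v = np.zeros_like(fx)
    # Kernel computing the local average used in the Laplacian approximation.
    avg = np.array([[1/12, 1/6, 1/12],
                    [1/6,  0.0, 1/6 ],
                    [1/12, 1/6, 1/12]])
    for _ in range(n_iter):
        u_bar = convolve(u, avg, mode='nearest')
        v_bar = convolve(v, avg, mode='nearest')
        common = (fx * u_bar + fy * v_bar + ft) / (alpha + fx**2 + fy**2)
        u = u_bar - fx * common
        v = v_bar - fy * common
    return u, v
```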
Horn and Schunck Method
• Global Method
• Estimates flow everywhere
• Sensitive to noise
• Oversmooths the edges

[Figure: Horn-Schunck flow fields for two settings of the regularization parameter \alpha]
Figures from Lucas/Kanade Meets Horn/Schunck: Combining Local and Global Optic Flow Methods, Andrés Bruhn and Joachim Weickert, 2005
Why combine them?
• Need dense flow estimate
• Robust to noise
• Preserve discontinuities
Combining the two…
Lucas-Kanade (local):
  E_LK(u, v) = (u, v, 1) (K_\rho * J) (u, v, 1)^T

Horn-Schunck (global):
  E_HS(u, v) = \int (f_x u + f_y v + f_t)^2 + \alpha (|\nabla u|^2 + |\nabla v|^2) \, dx \, dy

Combined local-global:
  E_CLG(u, v) = \int (u, v, 1) (K_\rho * J) (u, v, 1)^T + \alpha (|\nabla u|^2 + |\nabla v|^2) \, dx \, dy
Combined Local Global Method
  E_CLG(u, v) = \int (u, v, 1) (K_\rho * J) (u, v, 1)^T + \alpha (|\nabla u|^2 + |\nabla v|^2) \, dx \, dy

Euler-Lagrange equations
Method                          Average Error   Standard Deviation
Lucas & Kanade (density 35%)    4.3
Horn & Schunck                  9.8             16.2
Combining local and global      4.2             7.7

Table: Courtesy - Darya Frolova, Recent progress in optical flow
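A sketch of one way to minimize the CLG energy, reusing the derivative arrays from the earlier sketches: keep the Horn-Schunck-style Jacobi iteration but drive the data term with the Gaussian-smoothed tensor entries K_\rho * J. Illustrative only, not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import convolve, gaussian_filter

def clg_flow(fx, fy, ft, alpha=100.0, rho=4.0, n_iter=200):
    """Combined local-global flow (sketch): Horn-Schunck-style iteration
    whose data term uses the Gaussian-smoothed motion tensor K_rho * J."""
    jxx = gaussian_filter(fx * fx, rho)
    jxy = gaussian_filter(fx * fy, rho)
    jyy = gaussian_filter(fy * fy, rho)
    jxt = gaussian_filter(fx * ft, rho)
    jyt = gaussian_filter(fy * ft, rho)

    u = np.zeros_like(fx)
    v = np.zeros_like(fx)
    avg = np.array([[1/12, 1/6, 1/12],
                    [1/6,  0.0, 1/6 ],
                    [1/12, 1/6, 1/12]])
    for _ in range(n_iter):
        # Jacobi update of the coupled Euler-Lagrange equations, with the
        # Laplacian approximated by (neighbourhood average - centre value).
        u_bar = convolve(u, avg, mode='nearest')
        v_bar = convolve(v, avg, mode='nearest')
        u_new = (alpha * u_bar - jxy * v - jxt) / (alpha + jxx)
        v_new = (alpha * v_bar - jxy * u - jyt) / (alpha + jyy)
        u, v = u_new, v_new
    return u, v
```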
Preserving discontinuities
• Gaussian window does not preserve discontinuities
• Solutions
– Use bilateral filtering
  E_bil(u, v) = \int (u, v, 1) (K_bil * J) (u, v, 1)^T + \alpha (|\nabla u|^2 + |\nabla v|^2) \, dx \, dy
– Add gradient constancy
  E_grad(u, v) = \int (u, v, 1) (K_\rho * J) (u, v, 1)^T + \alpha (|\nabla u|^2 + |\nabla v|^2) + \gamma (\nabla f - \nabla f_{t+1})^2 \, dx \, dy

[Figure: bilateral support window]
Images: Courtesy, Darya Frolova, Recent progress in optical flow
Robust statistics – simple example
Find the "best" representative x for a set of numbers x_i.

L2:  E = \sum_i (x - x_i)^2 \to \min
Influence of x_i on E: if x_i \to x_i + \Delta, then
  E_new = E_old + 2\Delta (x_i - x) + \Delta^2
The influence is proportional to (x - x_i): outliers influence the result the most.
  x = mean(x_i)

L1:  E = \sum_i |x - x_i| \to \min
If x_i \to x_i + \Delta, then
  E_new \approx E_old \pm \Delta
The influence is equal for all x_i: the majority decides.
  x = median(x_i)
Slide: Courtesy - Darya Frolova, Recent progress in optical flow
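A tiny numeric illustration of this point (the data values are made up):

```python
import numpy as np

# The L2-optimal representative (mean) is dragged by a single outlier,
# while the L1-optimal representative (median) barely moves.
data = np.array([1.0, 2.0, 2.0, 3.0, 3.0, 4.0])
data_with_outlier = np.append(data, 100.0)

print("mean   without / with outlier:", data.mean(), data_with_outlier.mean())
print("median without / with outlier:", np.median(data), np.median(data_with_outlier))
```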
Robust statistics
Oligarchy: votes proportional to wealth (a very rich man outweighs many ordinary people), like in L2 norm minimization.
Democracy: one vote per person, like in L1 norm minimization.
Slide: Courtesy - Darya Frolova, Recent progress in optical flow
Combination of two flow constraints
  \min \int_{video} \Psi(I_warped - I) + \Psi(\nabla I_warped - \nabla I)

  where I_warped = I(x + u, y + v, t + 1),  I = I(x, y, t)

Choices of the penalizer \Psi:

usual (L2):          \Psi(x) = x^2
  + easy to analyze and minimize
  - sensitive to outliers

robust (L1):         \Psi(x) = |x|
  + robust in the presence of outliers
  - non-smooth: hard to analyze

robust regularized:  \Psi(x) = \sqrt{x^2 + \epsilon^2}
  + smooth: easy to analyze
  + robust in the presence of outliers
[A. Bruhn, J. Weickert, 2005] Towards ultimate motion estimation: Combining highest accuracy with real-time performance
Slide: Courtesy - Darya Frolova, Recent progress in optical flow
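The three penalizers are easy to compare numerically; a small sketch with made-up residuals and an arbitrary epsilon:

```python
import numpy as np

def psi_l2(x):                 # usual quadratic penalty
    return x**2

def psi_l1(x):                 # robust but non-smooth at 0
    return np.abs(x)

def psi_robust(x, eps=1e-3):   # smooth, robust regularized penalty
    return np.sqrt(x**2 + eps**2)

residuals = np.array([0.0, 0.1, 1.0, 10.0])
print(psi_l2(residuals))       # the outlier at 10 contributes 100
print(psi_l1(residuals))       # the outlier contributes only 10
print(psi_robust(residuals))   # ~|x| for large x, smooth near 0
```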
3D Regularization
• The term \alpha (|\nabla u|^2 + |\nabla v|^2) accounted for spatial regularization
• If velocities do not change suddenly with time, can we regularize in time as well?
3D Regularization
u u u
 3u 
 
x y t
v v v
 3v   
x y t
u 
 
2
2
ECLG 3 (u , v)  
K  * J  v    (  3u   3v )dxdydt
X [ 0 ,T ]
1
 
Multiresolution estimation
run iterative estimation at the coarsest level → warp & upsample → run iterative estimation → … down to the finest level

[Figure: Gaussian pyramids of image 1 and image 2, processed coarse-to-fine]
Multi-resolution Lucas Kanade
Algorithm
• Compute iterative LK at the highest (coarsest) pyramid level
• For each finer level i:
  – Take flow u(i-1), v(i-1) from level i-1
  – Upsample the flow to create u*(i), v*(i) matrices at twice the resolution, matching level i
  – Multiply u*(i), v*(i) by 2
  – Compute I_t from a block displaced by u*(i), v*(i)
  – Apply LK to get u'(i), v'(i) (the correction in flow)
  – Add the corrections to obtain the flow at level i: u(i) = u*(i) + u'(i), v(i) = v*(i) + v'(i)
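A compact sketch of this coarse-to-fine scheme, assuming the spatiotemporal_derivatives and lucas_kanade helpers from the earlier sketches are available; pyramid construction, warping, and flow upsampling use scipy.ndimage, and all parameter values are illustrative.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom, map_coordinates

# Assumes spatiotemporal_derivatives(...) and lucas_kanade(...) from the
# earlier sketches are already defined in scope.

def warp(image, u, v):
    """Warp image by the flow (u, v) using bilinear interpolation."""
    h, w = image.shape
    yy, xx = np.mgrid[0:h, 0:w].astype(np.float64)
    return map_coordinates(image, [yy + v, xx + u], order=1, mode='nearest')

def pyramid(image, n_levels):
    """Gaussian pyramid, finest level first."""
    levels = [image.astype(np.float64)]
    for _ in range(n_levels - 1):
        levels.append(zoom(gaussian_filter(levels[-1], 1.0), 0.5, order=1))
    return levels

def multiscale_lk(frame1, frame2, n_levels=4, rho=4.0):
    p1, p2 = pyramid(frame1, n_levels), pyramid(frame2, n_levels)
    u = np.zeros_like(p1[-1])
    v = np.zeros_like(p1[-1])
    for level in range(n_levels - 1, -1, -1):   # coarsest to finest
        f1, f2 = p1[level], p2[level]
        if u.shape != f1.shape:
            # Upsample the coarse flow to this level's resolution and
            # multiply by 2 so displacements match the finer grid.
            scale = (f1.shape[0] / u.shape[0], f1.shape[1] / u.shape[1])
            u = 2.0 * zoom(u, scale, order=1)
            v = 2.0 * zoom(v, scale, order=1)
        # Derivatives with frame 2 pre-warped by the current flow, then
        # solve for a correction and add it to the running estimate.
        fx, fy, ft = spatiotemporal_derivatives(f1, warp(f2, u, v))
        du, dv = lucas_kanade(fx, fy, ft, rho)
        u, v = u + du, v + dv
    return u, v
```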
Comparison of errors
For Yosemite sequence with clouds
Table: Courtesy - Darya Frolova, Recent progress in optical flow
Solving the system
  A u = f

How to solve? Start with some initial guess u_initial and apply some iterative method.

Two components of success:
• fast convergence
• good initial guess
Relaxation smoothes the error
Relaxation schemes have a smoothing property: only neighboring pixels are coupled in a relaxation scheme, so it may take thousands of iterations to propagate information over a large distance.
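A small numeric illustration (a made-up 1D Poisson-type system, not from the slides) of the smoothing property: damped Jacobi sweeps quickly remove the high-frequency "roughness" of the error while the overall error norm shrinks slowly.

```python
import numpy as np

def weighted_jacobi(A, f, u0, n_iter, omega=2/3):
    """Damped Jacobi relaxation for A u = f; each unknown is coupled only
    to its immediate neighbours through the off-diagonal entries of A."""
    D = np.diag(A)
    u = u0.copy()
    for _ in range(n_iter):
        u = u + omega * (f - A @ u) / D
    return u

# 1D Poisson-style test system; with f = 0 the exact solution is 0,
# so the iterate u is itself the error.
n = 64
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
f = np.zeros(n)
u0 = np.random.default_rng(0).random(n)
for k in (0, 5, 15):
    u = weighted_jacobi(A, f, u0, k)
    print(f"{k:2d} sweeps: error norm {np.linalg.norm(u):.3f}, "
          f"roughness {np.linalg.norm(np.diff(u)):.3f}")
```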
Relaxation smoothes the error
Examples (1D and 2D cases):
[Figure: error of the initial guess, error after 5 relaxations, error after 15 relaxations]
Idea: coarser grid
• initial grid: the fine grid
• coarse grid: take every second point
On a coarser grid low frequencies become higher, hence relaxations can be more effective.
Multigrid 2-Level V-Cycle
1. Iterate ⇒ error becomes smooth
2. Transfer the error equation to the coarse level ⇒ low frequencies become high
3. Solve for the error on the coarse level ⇒ good error estimation
4. Transfer the error to the fine level
5. Correct the previous solution
6. Iterate ⇒ remove interpolation artifacts
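A sketch of these six steps as a two-grid cycle on the same kind of 1D test system (an illustration, not the slides' scheme): damped Jacobi as smoother, linear interpolation as prolongation, its scaled transpose as restriction, and a Galerkin coarse-grid operator.

```python
import numpy as np

def smooth(A, f, u, n_iter=3, omega=2/3):
    """Damped Jacobi sweeps used as pre- and post-smoother (steps 1 and 6)."""
    D = np.diag(A)
    for _ in range(n_iter):
        u = u + omega * (f - A @ u) / D
    return u

def prolongation(n_coarse):
    """Linear interpolation from the coarse grid (every second fine point)."""
    P = np.zeros((2 * n_coarse, n_coarse))
    for i in range(n_coarse):
        P[2 * i, i] = 1.0
        P[2 * i + 1, i] = 0.5
        if i + 1 < n_coarse:
            P[2 * i + 1, i + 1] = 0.5
    return P

def two_level_vcycle(A, f, u, P):
    u = smooth(A, f, u)                        # 1. relax: error becomes smooth
    r = f - A @ u                              # residual: the error satisfies A e = r
    R = 0.5 * P.T                              # 2. restriction to the coarse grid
    e_c = np.linalg.solve(R @ A @ P, R @ r)    # 3. solve the coarse error equation
    u = u + P @ e_c                            # 4./5. prolongate the error, correct
    return smooth(A, f, u)                     # 6. relax away interpolation artifacts

n = 64
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
f = np.zeros(n)                                # exact solution is 0: u is the error
u = np.random.default_rng(0).random(n)
P = prolongation(n // 2)
for cycle in range(1, 4):
    u = two_level_vcycle(A, f, u, P)
    print("cycle", cycle, "error norm:", round(float(np.linalg.norm(u)), 6))
```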
Coarse grid - advantages
Coarsening allows:
• make the iteration process faster (on the coarse grid we can effectively minimize the error)
• obtain a better initial guess u_initial (solve A u = f directly on the coarsest grid, then interpolate u_initial to the finer grid)
Multigrid approach – Full scheme
Confidence Metric
• Intrinsic in Local Methods
• How to evaluate for global methods?
– Edge strength?
• Doesn't work (Barron et al., 1994)
Confidence Metric
• Histogram of error contribution
[Figure: histogram with number of pixels vs. error]
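The slides do not fix a formula, so as one possible realization (an assumption), the per-pixel error can be taken as the data-term residual of the estimated flow and the histogram built from it:

```python
import numpy as np

def data_term_confidence(fx, fy, ft, u, v, n_bins=50):
    """Per-pixel data-term residual of an estimated flow and its histogram.

    A large residual |f_x u + f_y v + f_t| suggests the estimate at that
    pixel is unreliable; the histogram is 'number of pixels vs. error'.
    """
    residual = np.abs(fx * u + fy * v + ft)
    counts, bin_edges = np.histogram(residual, bins=n_bins)
    confidence = 1.0 / (1.0 + residual)   # one simple monotone mapping
    return residual, confidence, counts, bin_edges
```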
Confidence Metric
More Results
More Results
Further Reading
• Combining the advantages of local and global optic flow methods
(“Lucas/Kanade meets Horn/Schunck”) A. Bruhn, J. Weickert, C.
Schnörr, 2002 - 2005
• High accuracy optical flow estimation based on a theory for warping
T. Brox, A. Bruhn, N. Papenberg, J. Weickert, 2004 - 2005
• Real-Time Optic Flow Computation with Variational Methods
A. Bruhn, J. Weickert, C. Feddern, T. Kohlberger, C. Schnörr, 2003 - 2005
• Towards ultimate motion estimation: Combining highest accuracy
with real-time performance. A. Bruhn, J. Weickert, 2005
• Bilateral filtering-based optical flow estimation with occlusion
detection. J.Xiao, H.Cheng, H.Sawhney, C.Rao, M.Isnardi, 2006