
Spectral Matting
Anat Levin 1,2   Alex Rav-Acha 1   Dani Lischinski 1
1 School of CS & Eng, The Hebrew University
2 CSAIL, MIT
Hard segmentation and matting
[Figure: a source image composited using a hard segmentation vs. composited using a matte]
Previous approaches to segmentation and matting
[Figure: prior work arranged by hard vs. matte output and by unsupervised vs. supervised input]
Unsupervised hard segmentation (spectral segmentation):
Shi and Malik 97; Weiss 99; Ng et al 01; Yu and Shi 03; Zelnik and Perona 05; Tolliver and Miller 06
Supervised hard segmentation:
Boykov and Jolly 01; Rother et al 04; Li et al 04
Supervised matting:
Trimap interface: Bayesian Matting (Chuang et al 01); Poisson Matting (Sun et al 04); Random Walk (Grady et al 05)
Scribble interface: Wang and Cohen 05; Levin et al 06; Easy Matting (Guan et al 06)
Unsupervised matting: ? (the missing quadrant, addressed in this work)
Unsupervised matting
[Figure: input image; automatically computed hard segments (Yu and Shi 03); automatically computed matting components]
Using components
Building the foreground object by simple addition of components
[Figure: selected matting components summed to form the foreground matte]
Generalized compositing equation
Two-layer compositing:
I_i = \alpha_i F_i + (1 - \alpha_i) B_i, i.e. I_i = \alpha^1_i L^1_i + \alpha^2_i L^2_i with \alpha^2_i = 1 - \alpha^1_i
[Figure: an image expressed as \alpha \times L^1 plus (1 - \alpha) \times L^2]
K-layer compositing:
I_i = \alpha^1_i L^1_i + \alpha^2_i L^2_i + ... + \alpha^K_i L^K_i
[Figure: an image decomposed into K layers L^1 .. L^K, each multiplied by its matting component \alpha^k]
Matting components \alpha^k:
0 \le \alpha^k_i \le 1
\alpha^1_i + \alpha^2_i + ... + \alpha^K_i = 1
"Sparse" layers: \alpha^k_i is 0 or 1 for most image pixels
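As a quick illustration of the equation above, here is a minimal NumPy sketch (not from the paper; the function and variable names are our own) that composites an image from K layers and their matting components, and checks that the two-layer case reduces to I = \alpha F + (1 - \alpha) B.

import numpy as np

def composite(layers, alphas):
    """layers: (K, H, W, 3) color layers; alphas: (K, H, W) matting components."""
    layers = np.asarray(layers, dtype=np.float64)
    alphas = np.asarray(alphas, dtype=np.float64)
    assert np.all(alphas >= 0) and np.all(alphas <= 1)
    assert np.allclose(alphas.sum(axis=0), 1.0)        # components sum to one at every pixel
    return (alphas[..., None] * layers).sum(axis=0)    # per-pixel weighted sum of the layers

# Two-layer special case reduces to I = alpha*F + (1 - alpha)*B:
H, W = 4, 5
F, B = np.random.rand(H, W, 3), np.random.rand(H, W, 3)
alpha = np.random.rand(H, W)
I = composite(np.stack([F, B]), np.stack([alpha, 1.0 - alpha]))
assert np.allclose(I, alpha[..., None] * F + (1 - alpha[..., None]) * B)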
Goals:
• Automatically extract matting components from an image
• Derive analogy between hard spectral segmentation and matting, and use similar tools
• Use matting components to automate the matte extraction process and suggest new modes of user interaction
Spectral segmentation
Spectral segmentation: analyzing the smallest eigenvectors of a graph Laplacian L
L = D - W
D(i,i) = \sum_j W(i,j)
W(i,j) = e^{-\|C_i - C_j\|^2 / \sigma^2}
E.g.: Shi and Malik 97; Weiss 99; Ng et al 01; Meila and Shi 01; Yu and Shi 03; Zelnik and Perona 05; Tolliver and Miller 06
Spectral segmentation
Fully separated classes: the binary class indicator vectors belong to the Laplacian nullspace
General case: class indicators are approximated as linear combinations of the smallest eigenvectors
The smallest eigenvectors recover the class indicators only up to a linear transformation
[Figure: zero/smallest eigenvectors of the Laplacian related to the binary indicator vectors by a linear transformation (e.g. a 3x3 matrix for three classes)]
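A tiny numerical check of the "fully separated classes" claim, written for illustration only: with zero affinity between two groups, each binary indicator vector is annihilated by L = D - W.

import numpy as np

W = np.zeros((6, 6))
W[:3, :3] = 1.0          # class A: fully connected internally, no edges to class B
W[3:, 3:] = 1.0          # class B
L = np.diag(W.sum(axis=1)) - W

ind_A = np.array([1, 1, 1, 0, 0, 0], dtype=float)
ind_B = 1.0 - ind_A
assert np.allclose(L @ ind_A, 0) and np.allclose(L @ ind_B, 0)   # indicators lie in the nullspace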
The matting Laplacian (Levin, Lischinski and Weiss CVPR06)
J(\alpha) = \alpha^T L \alpha
• L is a sparse positive semidefinite matrix
• L(i,j) is a local function of the image:
L(i,j) = \sum_{k \,|\, (i,j) \in w_k} \left( \delta_{ij} - \frac{1}{|w_k|} \left( 1 + (C_i - \mu_k)^T \left( \Sigma_k + \frac{\epsilon}{|w_k|} I_3 \right)^{-1} (C_j - \mu_k) \right) \right)
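For concreteness, below is a compact dense-matrix sketch of this entry formula (assuming 3x3 windows and a small image); practical implementations accumulate L as a sparse matrix, and the function name and defaults here are illustrative.

import numpy as np

def matting_laplacian(img, eps=1e-7, r=1):
    """img: (H, W, 3) float image in [0,1]. Returns the (H*W, H*W) dense matting Laplacian."""
    H, W, _ = img.shape
    n = H * W
    L = np.zeros((n, n))
    win = 2 * r + 1
    wk = win * win                                     # window size |w_k|
    idx = np.arange(n).reshape(H, W)
    for y in range(r, H - r):
        for x in range(r, W - r):
            win_idx = idx[y - r:y + r + 1, x - r:x + r + 1].ravel()
            C = img[y - r:y + r + 1, x - r:x + r + 1].reshape(wk, 3)
            mu = C.mean(axis=0)
            cov = np.cov(C, rowvar=False, bias=True)   # window color covariance Sigma_k
            inv = np.linalg.inv(cov + (eps / wk) * np.eye(3))
            D = C - mu
            # delta_ij - (1/|w_k|) * (1 + (C_i - mu_k)^T (Sigma_k + eps/|w_k| I_3)^-1 (C_j - mu_k))
            L[np.ix_(win_idx, win_idx)] += np.eye(wk) - (1.0 + D @ inv @ D.T) / wk
    return L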
The matting Laplacian and user constraints
Levin et al CVPR06. Input: image + user scribbles
\alpha = \arg\min_\alpha \alpha^T L \alpha   s.t.   \alpha_i = 0 for background scribble pixels, \alpha_i = 1 for foreground scribble pixels
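One common way to solve this constrained minimization, offered here as an assumption rather than the slide's exact recipe: enforce the scribbles with a large quadratic penalty and solve the linear system (L + \lambda D_s) \alpha = \lambda b_s, where D_s marks scribbled pixels and b_s holds their 0/1 values.

import numpy as np

def solve_with_scribbles(L, scribble_mask, scribble_values, lam=100.0):
    """L: (n, n) matting Laplacian; scribble_mask: (n,) bool; scribble_values: (n,) 0/1 targets."""
    Ds = np.diag(scribble_mask.astype(float))
    b = lam * scribble_mask * scribble_values
    alpha = np.linalg.solve(L + lam * Ds, b)   # penalty method; requires enough scribbled pixels
    return np.clip(alpha, 0.0, 1.0)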
Our goal:
• Extract matting components from the matting Laplacian, without user input
• Build on hard spectral segmentation ideas
Matting components and the matting Laplacian
Claim:
• For an image consisting of "well separated" layers, the matting components belong to the matting Laplacian nullspace
• In the general case, matting components are reasonably approximated as linear combinations of the smallest eigenvectors
[Figure: the smallest eigenvectors of the matting Laplacian related to the matting components by a linear transformation]
Hard segmentation - matting analogy
Traditional Laplacian ↔ Matting Laplacian
Smallest eigenvectors + linear transformation give:
  binary class indicators ↔ continuous matting components
From eigenvectors to matting components
1) Initialization: run k-means on the smallest eigenvectors; project each hard cluster indicator m^{C_k} onto the span of the eigenvectors E:
\alpha^k = E E^T m^{C_k}
2) Non-linear optimization for sparse components
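A hedged sketch of step 1, assuming scikit-learn's KMeans for the clustering (the paper's pipeline may differ in detail): cluster the rows of the eigenvector matrix E, then project each binary cluster indicator onto the span of E.

import numpy as np
from sklearn.cluster import KMeans

def init_matting_components(E, num_components):
    """E: (n, m) matrix whose columns are the m smallest matting-Laplacian eigenvectors (orthonormal)."""
    labels = KMeans(n_clusters=num_components, n_init=10).fit_predict(E)
    comps = []
    for k in range(num_components):
        m_k = (labels == k).astype(float)      # binary indicator of cluster C_k
        comps.append(E @ (E.T @ m_k))          # projection alpha^k = E E^T m^{C_k}
    return np.stack(comps)                     # (K, n); refined afterwards by non-linear optimization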
Components with the scribble interface
[Figure: scribble-based matte comparison: components (our approach), Levin et al CVPR06, Wang and Cohen 05, Random Walk, Poisson]
Direct component picking interface
Building the foreground object by simple addition of components
[Figure: user-picked components summed to form the foreground matte]
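Illustrative only: with the components in hand, the picked-components matte is just their per-pixel sum; the component indices below are hypothetical.

import numpy as np

def foreground_matte(components, picked):
    """components: (K, H, W) matting components; picked: list of selected component indices."""
    return np.clip(components[picked].sum(axis=0), 0.0, 1.0)

# e.g. alpha_fg = foreground_matte(components, picked=[0, 3, 5])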
Limitations
Need to set the number of components:
Too few - may not contain the desired matte
Too many - complicates computation and user interaction
Cluttered images require a large number of components
[Figure: input, ground truth matte, and the approximation using 70 eigenvectors]
Conclusions
• Derived an analogy between hard spectral segmentation and image matting
• Automatically extract matting components from eigenvectors
• Automate the matte extraction process and suggest new modes of user interaction
Ground truth data and code available online:
vision.huji.ac.il/SpectralMatting