Belief Propagation


Belief Propagation on
Markov Random Fields
Aggeliki Tsoli
Outline

• Graphical Models
• Markov Random Fields (MRFs)
• Belief Propagation

Graphical Models
• Diagrams
  • Nodes: random variables
  • Edges: statistical dependencies among random variables
• Advantages:
  1. Better visualization
     • conditional independence properties
     • design of new models
  2. Factorization

Types of Graphical Models

• Directed
  • causal relationships
  • e.g. Bayesian networks
• Undirected
  • no constraints imposed on the causality of events (“weak dependencies”)
  • e.g. Markov Random Fields (MRFs)

Example MRF Application: Image Denoising
[Figure: noisy binary image (e.g. 10% noise) next to the original binary image]

• Question: How can we retrieve the original image given the noisy one?

MRF Formulation

• Nodes
  • For each pixel i:
    • xi: latent variable (value in the original image)
    • yi: observed variable (value in the noisy image)
    • xi, yi ∈ {0, 1}
[Figure: chain of latent nodes x1, x2, …, xn, each connected to its observed node y1, y2, …, yn]

MRF Formulation

• Edges
  • xi and yi of each pixel i are correlated
    • local evidence function φ(xi, yi)
    • e.g. φ(xi, yi) = 0.9 if xi = yi and φ(xi, yi) = 0.1 otherwise (10% noise)
  • Neighboring pixels have similar values
    • compatibility function ψ(xi, xj)
[Figure: the same chain MRF as on the previous slide]

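To make the example concrete, here is a minimal Python sketch of these two potentials. The 0.9/0.1 values for φ come from the slide; the talk gives no numbers for ψ, so the 0.8/0.2 below is an assumed illustrative choice that favors equal neighboring pixels.

```python
# Potentials for the binary image-denoising MRF described above.

def phi(x_i, y_i):
    """Local evidence phi(x_i, y_i): the 10% noise model from the slide."""
    return 0.9 if x_i == y_i else 0.1

def psi(x_i, x_j):
    """Compatibility psi(x_i, x_j): assumed values rewarding agreement
    between neighboring latent pixels (not specified in the talk)."""
    return 0.8 if x_i == x_j else 0.2
```
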
MRF Formulation
[Figure: the same chain MRF as on the previous slides]

P(x1, x2, …, xn) = (1/Z) Π(i,j) ψ(xi, xj) Πi φ(xi, yi)

• Question: What are the marginal distributions for xi, i = 1, …, n?

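To make the question concrete, here is a brute-force sketch that computes these marginals by enumerating all 2^n configurations, built on the hypothetical phi/psi helpers above; its exponential cost is what belief propagation is designed to avoid.

```python
from itertools import product

def unnormalized_joint(x, y, edges):
    """prod over edges (i,j) of psi(x_i, x_j) times
    prod over nodes i of phi(x_i, y_i), without the 1/Z factor."""
    p = 1.0
    for i, j in edges:
        p *= psi(x[i], x[j])
    for i in range(len(x)):
        p *= phi(x[i], y[i])
    return p

def brute_force_marginals(y, edges):
    """Exact marginals P(x_i) by summing over all 2^n configurations."""
    n = len(y)
    Z = 0.0
    marg = [[0.0, 0.0] for _ in range(n)]
    for x in product((0, 1), repeat=n):
        p = unnormalized_joint(x, y, edges)
        Z += p
        for i, v in enumerate(x):
            marg[i][v] += p
    return [[m / Z for m in row] for row in marg]
```
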
Belief Propagation

• Goal: compute the marginals of the latent nodes of the underlying graphical model
• Attributes:
  • iterative algorithm
  • message passing between neighboring latent variable nodes
• Question: Can it also be applied to directed graphs?
• Answer: Yes, but here we will apply it to MRFs

Belief Propagation Algorithm
1) Select random neighboring latent nodes xi, xj
2) Send message mij from xi to xj
[Figure: message mij passed along the edge from xi to xj; observations yi and yj hang off the two nodes]
3) Update belief about marginal distribution at node xj
4) Go to step 1, until convergence
• How is convergence defined?

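The slides leave this question open. One common criterion (an assumption here, not something the talk states) is to stop once no message changes by more than a small tolerance between two successive sweeps:

```python
def converged(old_msgs, new_msgs, tol=1e-6):
    """True when no message entry moved by more than tol in one sweep."""
    return all(
        abs(old_msgs[edge][v] - new_msgs[edge][v]) < tol
        for edge in new_msgs
        for v in (0, 1)
    )
```
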
Step 2: Message Passing

Message mij from xi to xj : what node xi thinks
about the marginal distribution of xj
yi
N(i)\j
xi
yj
xj
mij(xj) = (x ) (xi, yi) (xi, xj) kN(i)\j mki(xi)
i

Messages initially uniformly distributed
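A minimal Python sketch of this update, continuing the hypothetical helpers above. `msgs` is assumed to map each directed edge (k, i) to a two-entry message vector; normalizing the outgoing message at the end is common practice, although the slide's formula does not require it.

```python
def send_message(i, j, msgs, y, neighbors):
    """m_ij(x_j) = sum over x_i of phi(x_i, y_i) * psi(x_i, x_j)
    * prod over k in N(i) except j of m_ki(x_i)."""
    m = [0.0, 0.0]
    for x_j in (0, 1):
        for x_i in (0, 1):
            p = phi(x_i, y[i]) * psi(x_i, x_j)
            for k in neighbors[i]:
                if k != j:
                    p *= msgs[(k, i)][x_i]
            m[x_j] += p
    total = m[0] + m[1]
    return [m[0] / total, m[1] / total]  # normalize for numerical stability
```
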
Step 3: Belief Update

• Belief b(xj): what node xj thinks its marginal distribution is
[Figure: node xj (with observation yj) collects messages from all of its neighbors N(j)]

b(xj) = k φ(xj, yj) Πq∈N(j) mqj(xj), where k is a normalization constant

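A matching sketch of the belief computation, with the constant k realized as an explicit normalization:

```python
def belief(j, msgs, y, neighbors):
    """b(x_j) = k * phi(x_j, y_j) * prod over q in N(j) of m_qj(x_j)."""
    b = []
    for x_j in (0, 1):
        p = phi(x_j, y[j])
        for q in neighbors[j]:
            p *= msgs[(q, j)][x_j]
        b.append(p)
    k = 1.0 / (b[0] + b[1])  # the normalization constant from the formula
    return [k * b[0], k * b[1]]
```
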
Example
• Compute the belief at node 1.
[Figure: Fig. 12 of Yedidia et al.: a four-node graph in which nodes 3 and 4 send messages m32 and m42 to node 2, and node 2 sends m21 to node 1]

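Putting the pieces together, a runnable sketch of this example using the hypothetical helpers above. The node numbering is shifted to 0-based, and the observations y are made-up values just so the code produces output.

```python
# Four-node tree from Fig. 12 of Yedidia et al., renumbered 0-based:
# node 1 -> 0, node 2 -> 1, node 3 -> 2, node 4 -> 3.
neighbors = {0: [1], 1: [0, 2, 3], 2: [1], 3: [1]}
edges = [(0, 1), (1, 2), (1, 3)]
y = [1, 1, 0, 1]  # assumed noisy binary observations, one per node

# Step 0: all directed messages start out uniform.
msgs = {(i, j): [0.5, 0.5]
        for i, j in edges + [(j, i) for (i, j) in edges]}

# Synchronous sweeps; on a tree a few passes suffice, while on a
# loopy graph one would iterate until converged() returns True.
for _ in range(5):
    msgs = {(i, j): send_message(i, j, msgs, y, neighbors)
            for (i, j) in msgs}

print("belief at node 1:", belief(0, msgs, y, neighbors))
```
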
Does graph topology matter?

• The BP procedure stays the same!
• Performance:
  • failure to converge or to predict accurate beliefs [Murphy, Weiss, Jordan 1999]
  vs.
  • success at
    • decoding for error-correcting codes [Frey and MacKay 1998]
    • computer vision problems where the underlying MRF is full of loops [Freeman, Pasztor, Carmichael 2000]

How long does it take?


• No explicit reference in the paper
• In my opinion, it depends on
  • the number of nodes in the graph
  • the graph topology
• There is work on improving the running time of BP (for specific applications)
  • Next time?

Questions?