Belief Propagation on Markov Random Fields
Aggeliki Tsoli
3/1/2008 MLRG 2
Outline
Graphical Models
Markov Random Fields (MRFs)
Belief Propagation
Graphical Models
Diagrams:
- Nodes: random variables
- Edges: statistical dependencies among random variables

Advantages:
1. Better visualization
   - conditional independence properties
   - design of new models
2. Factorization
Types of Graphical Models
- Directed: causal relationships, e.g. Bayesian networks
- Undirected: no constraints imposed on causality of events ("weak dependencies"), e.g. Markov Random Fields (MRFs)
Example MRF Application: Image Denoising
Question: How can we retrieve the original image given the noisy one?
- Original image (binary)
- Noisy image (e.g. 10% noise)
MRF formulation: Nodes
For each pixel i:
- xi: latent variable (pixel value in the original image)
- yi: observed variable (pixel value in the noisy image)
- xi, yi ∈ {0, 1}

[Figure: grid MRF with latent nodes x1, x2, …, xi, …, xn, each connected to its observed node y1, y2, …, yi, …, yn]
MRF formulation: Edges
- xi and yi of each pixel i are correlated: local evidence function φ(xi, yi).
  E.g. φ(xi, yi) = 0.9 if xi = yi, and φ(xi, yi) = 0.1 otherwise (10% noise).
- Neighboring pixels have similar values: compatibility function ψ(xi, xj).
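The two potentials above can be written down directly. A minimal sketch: the 0.9/0.1 evidence values come from the slide's 10%-noise example, while the 0.8/0.2 compatibility split is an assumed illustrative choice, not from the slides.

```python
# Potentials for the binary denoising MRF; 0.9/0.1 follows the slide's
# 10%-noise example, 0.8/0.2 is an assumed illustrative smoothness prior.

def phi(x_i, y_i):
    """Local evidence: observed pixel agrees with the latent pixel 90% of the time."""
    return 0.9 if x_i == y_i else 0.1

def psi(x_i, x_j):
    """Compatibility: neighboring latent pixels prefer equal values."""
    return 0.8 if x_i == x_j else 0.2
```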
MRF formulation
Question: What are the marginal distributions for xi, i = 1, …,n?
P(x1, x2, …, xn) = (1/Z) ∏_(ij) ψ(xi, xj) ∏_i φ(xi, yi)
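For a graph small enough to enumerate, the joint above can be evaluated by brute force, which later serves as a sanity check for BP. A sketch on a hypothetical 3-node chain; the potentials and the observations `y` are illustrative assumptions.

```python
from itertools import product

def phi(x, y_val):                     # local evidence (10%-noise example)
    return 0.9 if x == y_val else 0.1

def psi(a, b):                         # assumed smoothness compatibility
    return 0.8 if a == b else 0.2

y = (1, 1, 0)                          # assumed noisy observations
edges = [(0, 1), (1, 2)]               # chain x1 - x2 - x3

def unnormalized(x):
    """Product of all compatibility and evidence factors for one configuration."""
    p = 1.0
    for i, j in edges:
        p *= psi(x[i], x[j])
    for i, x_i in enumerate(x):
        p *= phi(x_i, y[i])
    return p

# Partition function Z and the marginal P(x1 = 1) by summing over all 2^3 states.
Z = sum(unnormalized(x) for x in product((0, 1), repeat=3))
p_x1_is_1 = sum(unnormalized(x) for x in product((0, 1), repeat=3) if x[0] == 1) / Z
```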
Belief Propagation
Goal: compute the marginals of the latent nodes of the underlying graphical model
Attributes:
- iterative algorithm
- message passing between neighboring latent-variable nodes
Question: Can it also be applied to directed graphs?
Answer: Yes, but here we will apply it to MRFs.
Belief Propagation Algorithm
1) Select random neighboring latent nodes xi, xj
2) Send message mij from xi to xj
3) Update belief about the marginal distribution at node xj
4) Go to step 1, until convergence

Question: How is convergence defined?

[Figure: neighboring nodes xi, xj with observations yi, yj; message mij flows from xi to xj]
Step 2: Message Passing
Message mij from xi to xj: what node xi thinks about the marginal distribution of xj.

mij(xj) = Σ_xi φ(xi, yi) ψ(xi, xj) ∏_{k∈N(i)\j} mki(xi)

Messages are initially uniformly distributed.

[Figure: node xi collecting messages from its neighbors N(i)\j and sending mij to xj]
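The update above can be sketched as a function. The graph representation here (neighbor lists, a message dictionary keyed by directed edge) is an assumed convention, and messages are normalized to sum to one, which does not change the resulting beliefs.

```python
def send_message(i, j, messages, neighbors, phi, psi, y, states=(0, 1)):
    """m_ij(x_j) = sum over x_i of phi(x_i, y_i) * psi(x_i, x_j)
    * product over k in N(i)\\{j} of m_ki(x_i)."""
    m = []
    for x_j in states:
        total = 0.0
        for x_i in states:
            term = phi(x_i, y[i]) * psi(x_i, x_j)
            for k in neighbors[i]:
                if k != j:                       # all neighbors of i except j
                    term *= messages[(k, i)][x_i]
            total += term
        m.append(total)
    s = sum(m)
    return [v / s for v in m]                    # normalize for numerical stability
```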
Step 3: Belief Update
Belief b(xj): what node xj thinks its marginal distribution is.

b(xj) = k φ(xj, yj) ∏_{q∈N(j)} mqj(xj)    (k: normalization constant)

[Figure: node xj collecting messages from all of its neighbors N(j)]
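A matching sketch of the belief update, using the same assumed message convention (a dictionary keyed by directed edge) and with k computed so that the belief sums to one:

```python
def belief(j, messages, neighbors, phi, y, states=(0, 1)):
    """b(x_j) = k * phi(x_j, y_j) * product over q in N(j) of m_qj(x_j)."""
    b = []
    for x_j in states:
        v = phi(x_j, y[j])
        for q in neighbors[j]:                   # every neighbor of j
            v *= messages[(q, j)][x_j]
        b.append(v)
    k = 1.0 / sum(b)                             # normalization constant
    return [k * v for v in b]
```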
Belief Propagation Algorithm
1) Select random neighboring latent nodes xi, xj
2) Send message mij from xi to xj
3) Update belief about the marginal distribution at node xj
4) Go to step 1, until convergence

[Figure: neighboring nodes xi, xj with observations yi, yj; message mij]
Example
Compute the belief at node 1.

[Figure: tree with node 2 connected to nodes 1, 3, and 4; messages m32 and m42 flow into node 2, and m21 flows from node 2 to node 1]
Fig. 12 (Yedidia et al.)
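The example tree can be worked end to end. A self-contained sketch: node 2 links nodes 1, 3, and 4 as in the figure, while the potentials and the observations `y` are illustrative assumptions.

```python
from math import prod

states = (0, 1)
y = {1: 1, 2: 1, 3: 0, 4: 1}                            # assumed observations

def phi(x, y_val): return 0.9 if x == y_val else 0.1    # evidence (10% noise)
def psi(a, b):     return 0.8 if a == b else 0.2        # assumed compatibility

def message(i, j, incoming):
    """m_ij(x_j): sum over x_i of phi * psi * the product of messages into i."""
    m = [sum(phi(x_i, y[i]) * psi(x_i, x_j) * prod(inc[x_i] for inc in incoming)
             for x_i in states)
         for x_j in states]
    s = sum(m)
    return [v / s for v in m]

# Leaves 3 and 4 send to node 2; node 2 combines them and sends to node 1.
m32 = message(3, 2, [])
m42 = message(4, 2, [])
m21 = message(2, 1, [m32, m42])

# Belief at node 1: local evidence times the single incoming message, normalized.
b1 = [phi(x, y[1]) * m21[x] for x in states]
k = 1.0 / sum(b1)
b1 = [k * v for v in b1]
```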
Does graph topology matter?
The BP procedure is the same!

Performance:
- Failure to converge / inaccurate beliefs [Murphy, Weiss, Jordan 1999]
- Success at decoding error-correcting codes [Frey and MacKay 1998]
- Success on computer vision problems where the underlying MRF is full of loops [Freeman, Pasztor, Carmichael 2000]

[Figure: tree-structured graph vs. graph with loops]
How long does it take?
- No explicit reference in the paper
- In my opinion, it depends on:
  - the number of nodes in the graph
  - the graph topology
- Work on improving the running time of BP (for specific applications): next time?
Questions?