
Page 1: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Probabilistic Inference Lecture 2

M. Pawan Kumar

pawan.kumar@ecp.fr

Slides available online http://cvc.centrale-ponts.fr/personnel/pawan/

Page 2: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Pose Estimation

Courtesy Pedro Felzenszwalb

Page 3: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Pose Estimation

Courtesy Pedro Felzenszwalb

Page 4: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Pose Estimation

Variables are body parts; labels are positions

Page 5: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Pose Estimation

Unary potentials θa;i proportional to fraction of foreground pixels

Variables are body parts; labels are positions

Page 6: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Pose Estimation

Pairwise potentials θab;ik proportional to d², where d is the distance between the joint location according to the 'head' part and the joint location according to the 'torso' part.

Page 7: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Pose Estimation

Pairwise potentials θab;ik proportional to d² (a second configuration of the 'head' and 'torso' parts, with a different distance d between their predicted joint locations).

Page 8: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Outline

• Problem Formulation
  – Energy Function
  – Energy Minimization
  – Computing min-marginals

• Reparameterization

• Energy Minimization for Trees

• Loopy Belief Propagation

Page 9: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Energy Function

Va Vb Vc Vd

Label l0

Label l1

Random Variables V = {Va, Vb, …}

Labels L = {l0, l1, …}

Labelling f: {a, b, …} → {0, 1, …}

Page 10: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Energy Function

Va Vb Vc Vd

Q(f) = ∑a θa;f(a)

Unary Potential

(Figure: the chain with one unary value per label, read off the slide: θa;0 = 5, θa;1 = 2, θb;0 = 2, θb;1 = 4, θc;0 = 3, θc;1 = 6, θd;0 = 7, θd;1 = 3.)

Easy to minimize

Neighbourhood

Page 11: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Energy Function

Va Vb Vc Vd

Edges E: (a,b) ∈ E iff Va and Vb are neighbours

E = { (a,b) , (b,c) , (c,d) }

(Figure: the same chain and unary potentials as before.)

Page 12: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Energy Function

Va Vb Vc Vd

Q(f) = ∑a θa;f(a) + ∑(a,b) θab;f(a)f(b)

Pairwise Potential

(Figure: the chain with the pairwise values read off the slide, indexed [f(a)][f(b)] over labels l0, l1: θab = [[0, 1], [1, 0]], θbc = [[1, 3], [2, 0]], θcd = [[0, 1], [4, 1]]; unary potentials as before.)

Page 13: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Energy Function

Va Vb Vc Vd

(Figure: the same chain and potentials as before.)

Parameter θ

Q(f; θ) = ∑a θa;f(a) + ∑(a,b) θab;f(a)f(b)
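As a concrete check of this definition, here is a minimal Python sketch that evaluates Q(f; θ) on the 4-variable chain. The numbers are the unary and pairwise values recovered from the figure and the energy table later in the lecture; the variable and function names are mine.

```python
# Minimal sketch: evaluating Q(f; theta) on the slides' 4-variable chain.
# The values below are the ones recovered from the figure (labels l0 = 0, l1 = 1);
# treat them as an assumption of this sketch.
unary = {'a': [5, 2], 'b': [2, 4], 'c': [3, 6], 'd': [7, 3]}    # unary[a][i] = theta_{a;i}
pairwise = {('a', 'b'): [[0, 1], [1, 0]],                        # pairwise[e][i][k] = theta_{ab;ik}
            ('b', 'c'): [[1, 3], [2, 0]],
            ('c', 'd'): [[0, 1], [4, 1]]}

def energy(f):
    """Q(f; theta) = sum_a theta_{a;f(a)} + sum_(a,b) theta_{ab;f(a)f(b)}."""
    q = sum(unary[a][f[a]] for a in unary)
    q += sum(pairwise[(a, b)][f[a]][f[b]] for (a, b) in pairwise)
    return q

print(energy({'a': 1, 'b': 0, 'c': 0, 'd': 1}))   # 13, as computed on the slides
print(energy({'a': 0, 'b': 1, 'c': 1, 'd': 0}))   # 27
```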

Page 14: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Outline

• Problem Formulation
  – Energy Function
  – Energy Minimization
  – Computing min-marginals

• Reparameterization

• Energy Minimization for Trees

• Loopy Belief Propagation

Page 15: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Energy Minimization

Va Vb Vc Vd

(Figure: the same chain and potentials as before.)

Q(f; θ) = ∑a θa;f(a) + ∑(a,b) θab;f(a)f(b)

Page 16: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Energy Minimization

Va Vb Vc Vd

(Figure: the same chain and potentials, with the labelling f = {1, 0, 0, 1} highlighted; its terms are read off left to right.)

Q(f; θ) = ∑a θa;f(a) + ∑(a,b) θab;f(a)f(b)

2 + 1 + 2 + 1 + 3 + 1 + 3 = 13

Page 17: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Energy Minimization

Va Vb Vc Vd

(Figure: the same chain and potentials, with a different labelling highlighted.)

Q(f; θ) = ∑a θa;f(a) + ∑(a,b) θab;f(a)f(b)

Page 18: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Energy Minimization

Va Vb Vc Vd

(Figure: the same chain and potentials, with the labelling f = {0, 1, 1, 0} highlighted.)

Q(f; θ) = ∑a θa;f(a) + ∑(a,b) θab;f(a)f(b)

5 + 1 + 4 + 0 + 6 + 4 + 7 = 27

Page 19: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Energy Minimization

Va Vb Vc Vd

(Figure: the same chain and potentials as before.)

Q(f; θ) = ∑a θa;f(a) + ∑(a,b) θab;f(a)f(b)

f* = arg minf Q(f; θ)

q* = minf Q(f; θ) = Q(f*; θ)

Page 20: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Energy Minimization

16 possible labellings

f(a) f(b) f(c) f(d)  Q(f; θ)
0 0 0 0  18
0 0 0 1  15
0 0 1 0  27
0 0 1 1  20
0 1 0 0  22
0 1 0 1  19
0 1 1 0  27
0 1 1 1  20
1 0 0 0  16
1 0 0 1  13
1 0 1 0  25
1 0 1 1  18
1 1 0 0  18
1 1 0 1  15
1 1 1 0  23
1 1 1 1  16

f* = {1, 0, 0, 1}
q* = 13
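The table above can be reproduced by brute force. A minimal sketch, again assuming the potentials recovered from the slides; names are mine.

```python
from itertools import product

unary = {'a': [5, 2], 'b': [2, 4], 'c': [3, 6], 'd': [7, 3]}
pairwise = {('a', 'b'): [[0, 1], [1, 0]],
            ('b', 'c'): [[1, 3], [2, 0]],
            ('c', 'd'): [[0, 1], [4, 1]]}

def energy(f):
    return (sum(unary[a][f[a]] for a in unary) +
            sum(pairwise[(a, b)][f[a]][f[b]] for (a, b) in pairwise))

# Enumerate all |L|^n = 2^4 = 16 labellings and keep the best one.
best = min((dict(zip('abcd', ls)) for ls in product(range(2), repeat=4)), key=energy)
print(best, energy(best))   # {'a': 1, 'b': 0, 'c': 0, 'd': 1}  13
```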

Page 21: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Outline

• Problem Formulation
  – Energy Function
  – Energy Minimization
  – Computing min-marginals

• Reparameterization

• Energy Minimization for Trees

• Loopy Belief Propagation

Page 22: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Min-Marginals

Va Vb Vc Vd

(Figure: the same chain and potentials as before.)

Min-marginal qa;i = minf Q(f; θ) such that f(a) = i

Page 23: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Min-Marginals

16 possible labellings; qa;0 = 15

f(a) f(b) f(c) f(d)  Q(f; θ)
0 0 0 0  18
0 0 0 1  15
0 0 1 0  27
0 0 1 1  20
0 1 0 0  22
0 1 0 1  19
0 1 1 0  27
0 1 1 1  20
1 0 0 0  16
1 0 0 1  13
1 0 1 0  25
1 0 1 1  18
1 1 0 0  18
1 1 0 1  15
1 1 1 0  23
1 1 1 1  16

Page 24: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Min-Marginals

16 possible labellings; qa;1 = 13

f(a) f(b) f(c) f(d)  Q(f; θ)
1 0 0 0  16
1 0 0 1  13
1 0 1 0  25
1 0 1 1  18
1 1 0 0  18
1 1 0 1  15
1 1 1 0  23
1 1 1 1  16
0 0 0 0  18
0 0 0 1  15
0 0 1 0  27
0 0 1 1  20
0 1 0 0  22
0 1 0 1  19
0 1 1 0  27
0 1 1 1  20

Page 25: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Min-Marginals and MAP

• Minimum min-marginal of any variable = energy of the MAP labelling

mini qa;i = mini ( minf Q(f; θ) such that f(a) = i ) = minf Q(f; θ)

Va has to take one label
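The same brute force gives the min-marginals and confirms that their minimum equals q*. A short sketch under the same assumed potentials; names are mine.

```python
from itertools import product

unary = {'a': [5, 2], 'b': [2, 4], 'c': [3, 6], 'd': [7, 3]}
pairwise = {('a', 'b'): [[0, 1], [1, 0]],
            ('b', 'c'): [[1, 3], [2, 0]],
            ('c', 'd'): [[0, 1], [4, 1]]}

def energy(f):
    return (sum(unary[a][f[a]] for a in unary) +
            sum(pairwise[(a, b)][f[a]][f[b]] for (a, b) in pairwise))

def min_marginal(var, label):
    """q_{a;i}: minimum energy over all labellings f with f(var) = label."""
    idx = 'abcd'.index(var)
    return min(energy(dict(zip('abcd', ls)))
               for ls in product(range(2), repeat=4) if ls[idx] == label)

print(min_marginal('a', 0), min_marginal('a', 1))   # 15 13, matching the tables above
print(min(min_marginal('a', i) for i in range(2)))  # 13 = q*, the MAP energy
```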

Page 26: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Summary

Energy Function
Q(f; θ) = ∑a θa;f(a) + ∑(a,b) θab;f(a)f(b)

Energy Minimization
f* = arg minf Q(f; θ)

Min-marginals
qa;i = minf Q(f; θ) s.t. f(a) = i

Page 27: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Outline

• Problem Formulation

• Reparameterization

• Energy Minimization for Trees

• Loopy Belief Propagation

Page 28: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Reparameterization

Va Vb

(Figure: Va and Vb with unary potentials θa;0 = 5, θa;1 = 2, θb;0 = 2, θb;1 = 4 and pairwise potentials θab;00 = 0, θab;01 = 1, θab;10 = 1, θab;11 = 0; the figure adds 2 to both θa;i and subtracts 2 from both θb;k.)

f(a) f(b)  Q(f; θ)
0 0  7
0 1  10
1 0  5
1 1  6

Add a constant to all θa;i
Subtract that constant from all θb;k

Page 29: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Reparameterization

f(a) f(b)  Q(f; θ)
0 0  7 + 2 - 2
0 1  10 + 2 - 2
1 0  5 + 2 - 2
1 1  6 + 2 - 2

Add a constant to all θa;i
Subtract that constant from all θb;k

Q(f; θ') = Q(f; θ)

(Figure: the same Va-Vb model, with +2 applied to both θa;i and -2 applied to both θb;k.)

Page 30: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Reparameterization

Va Vb

(Figure: the same Va-Vb model; 3 is added to θb;1 and 3 is subtracted from θab;01 and θab;11.)

f(a) f(b)  Q(f; θ)
0 0  7
0 1  10
1 0  5
1 1  6

Add a constant to one θb;k
Subtract that constant from θab;ik for all 'i'

Page 31: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Reparameterization

Va Vb

(Figure: the same Va-Vb model, with +3 on θb;1 and -3 on θab;01 and θab;11.)

f(a) f(b)  Q(f; θ)
0 0  7
0 1  10 - 3 + 3
1 0  5
1 1  6 - 3 + 3

Q(f; θ') = Q(f; θ)

Add a constant to one θb;k
Subtract that constant from θab;ik for all 'i'

Page 32: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Reparameterization

(Figure: three further reparameterizations of the Va-Vb model, each adding constants to some unary potentials and subtracting them from the corresponding pairwise potentials.)

θ'a;i = θa;i + Mba;i
θ'b;k = θb;k + Mab;k
θ'ab;ik = θab;ik - Mab;k - Mba;i

Q(f; θ') = Q(f; θ)

Page 33: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Reparameterization

θ' is a reparameterization of θ, iff

Q(f; θ') = Q(f; θ), for all f

Equivalently,

θ'a;i = θa;i + Mba;i
θ'b;k = θb;k + Mab;k
θ'ab;ik = θab;ik - Mab;k - Mba;i

Kolmogorov, PAMI, 2006

(Figure: the earlier +2 / -2 example is one such reparameterization.)
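A quick numerical check of the invariance Q(f; θ') = Q(f; θ) on the two-variable example, using a message-style reparameterization with arbitrary constants Mab;k (and Mba;i = 0). A sketch; the constants and names are of my choosing.

```python
from itertools import product

theta_a  = [5, 2]                 # theta_{a;i}
theta_b  = [2, 4]                 # theta_{b;k}
theta_ab = [[0, 1], [1, 0]]       # theta_{ab;ik}

def Q(i, k, ua, ub, pab):
    return ua[i] + ub[k] + pab[i][k]

# Reparameterize with arbitrary constants M_{ab;k} (here M_{ba;i} = 0):
M_ab = [3, 2]
ua2  = list(theta_a)
ub2  = [theta_b[k] + M_ab[k] for k in range(2)]
pab2 = [[theta_ab[i][k] - M_ab[k] for k in range(2)] for i in range(2)]

for i, k in product(range(2), repeat=2):
    assert Q(i, k, theta_a, theta_b, theta_ab) == Q(i, k, ua2, ub2, pab2)
print("Q(f; theta') == Q(f; theta) for all four labellings")
```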

Page 34: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Recap

Energy Minimization
Q(f; θ) = ∑a θa;f(a) + ∑(a,b) θab;f(a)f(b)
f* = arg minf Q(f; θ)

Min-marginals
qa;i = minf Q(f; θ) s.t. f(a) = i

Reparameterization
Q(f; θ') = Q(f; θ), for all f

Page 35: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Outline

• Problem Formulation

• Reparameterization

• Energy Minimization for Trees

• Loopy Belief Propagation

Pearl, 1988

Page 36: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Belief Propagation

• Belief Propagation is exact for chains

• Some problems are easy

• Exact MAP for trees

• Clever Reparameterization

Page 37: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Two Variables

(Figure: the Va-Vb model with unary potentials θa;0 = 5, θa;1 = 2, θb;0 = 2, θb;1 = 4 and pairwise potentials θab = [[0, 1], [1, 0]], shown before and after reparameterizing the edge.)

Choose the right constant: θ'b;k = qb;k

Add a constant to one θb;k
Subtract that constant from θab;ik for all 'i'

Page 38: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

(Figure: the same Va-Vb model, before and after reparameterizing edge (a,b).)

Choose the right constant: θ'b;k = qb;k

Mab;0 = min { θa;0 + θab;00 , θa;1 + θab;10 } = min { 5 + 0 , 2 + 1 } = 3

Two Variables

Page 39: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

(Figure: after this reparameterization θ'b;0 = 2 + 3 = 5, θ'ab;00 = -3 and θ'ab;10 = -2; the original model is shown alongside.)

Choose the right constant: θ'b;k = qb;k

Two Variables

Page 40: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

(Figure: the reparameterized Va-Vb model, with the red path taking f(a) = 1 and f(b) = 0 highlighted.)

Choose the right constant: θ'b;k = qb;k

f(a) = 1

θ'b;0 = qb;0

Two Variables

Potentials along the red path add up to 0

Page 41: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

(Figure: the Va-Vb model, before and after reparameterizing edge (a,b).)

Choose the right constant: θ'b;k = qb;k

Mab;1 = min { θa;0 + θab;01 , θa;1 + θab;11 } = min { 5 + 1 , 2 + 0 } = 2

Two Variables

Page 42: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

(Figure: after reparameterizing edge (a,b) for both labels: θ'b;0 = 5, θ'b;1 = 6, with pairwise values -3, -2, -1, -2.)

Choose the right constant: θ'b;k = qb;k

f(a) = 1: θ'b;0 = qb;0
f(a) = 1: θ'b;1 = qb;1

Minimum of min-marginals = MAP estimate

Two Variables

Page 43: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

(Figure: the two reparameterizations of edge (a,b), giving θ'b;0 = 5 and θ'b;1 = 6.)

Choose the right constant: θ'b;k = qb;k

f(a) = 1: θ'b;0 = qb;0
f(a) = 1: θ'b;1 = qb;1

f*(b) = 0, f*(a) = 1

Two Variables

Page 44: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

(Figure: the two reparameterizations of edge (a,b), giving θ'b;0 = qb;0 = 5 and θ'b;1 = qb;1 = 6.)

Choose the right constant: θ'b;k = qb;k

f(a) = 1: θ'b;0 = qb;0
f(a) = 1: θ'b;1 = qb;1

We get all the min-marginals of Vb

Two Variables

Page 45: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Recap

We only need to know two sets of equations

General form of Reparameterization:
θ'a;i = θa;i + Mba;i
θ'b;k = θb;k + Mab;k
θ'ab;ik = θab;ik - Mab;k - Mba;i

Reparameterization of (a,b) in Belief Propagation:
Mab;k = mini { θa;i + θab;ik }
Mba;i = 0
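Applying the belief-propagation choice of constants to the two-variable example confirms that the reparameterized unary potentials of Vb become its min-marginals. A minimal sketch; names are mine.

```python
# Belief-propagation choice: M_{ab;k} = min_i { theta_{a;i} + theta_{ab;ik} },
# so that theta'_{b;k} = theta_{b;k} + M_{ab;k} equals the min-marginal q_{b;k}.
theta_a  = [5, 2]
theta_b  = [2, 4]
theta_ab = [[0, 1], [1, 0]]

M_ab = [min(theta_a[i] + theta_ab[i][k] for i in range(2)) for k in range(2)]
theta_b_new = [theta_b[k] + M_ab[k] for k in range(2)]

# Brute-force min-marginals of Vb for comparison.
q_b = [min(theta_a[i] + theta_b[k] + theta_ab[i][k] for i in range(2)) for k in range(2)]

print(M_ab)          # [3, 2], the messages computed on the slides
print(theta_b_new)   # [5, 6]
print(q_b)           # [5, 6]  ->  theta'_{b;k} = q_{b;k}
```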

Page 46: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Three Variables

(Figure: a chain Va-Vb-Vc over labels l0, l1, with the same potentials on Va, Vb and edge (a,b) as in the two-variable example, plus unary potentials on Vc and pairwise potentials on edge (b,c).)

Reparameterize the edge (a,b) as before

Page 47: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

(Figure: the chain after reparameterizing edge (a,b).)

Reparameterize the edge (a,b) as before

f(a) = 1
f(a) = 1

Three Variables

Page 48: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

(Figure: the chain after reparameterizing edge (a,b), with the red path highlighted.)

Reparameterize the edge (a,b) as before

f(a) = 1
f(a) = 1

Potentials along the red path add up to 0

Three Variables

Page 49: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

(Figure: the same chain; next, edge (b,c) is reparameterized.)

Reparameterize the edge (b,c) as before

f(a) = 1
f(a) = 1

Potentials along the red path add up to 0

Three Variables

Page 50: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

(Figure: the chain after reparameterizing edge (b,c).)

Reparameterize the edge (b,c) as before

f(a) = 1
f(a) = 1
f(b) = 1
f(b) = 0

Potentials along the red path add up to 0

Three Variables

Page 51: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

(Figure: after reparameterizing edges (a,b) and (b,c), the unary potentials of Vc are its min-marginals qc;0 and qc;1.)

Reparameterize the edge (b,c) as before

f(a) = 1
f(a) = 1
f(b) = 1
f(b) = 0

Potentials along the red path add up to 0

Three Variables

Page 52: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

(Figure: the chain after the forward pass.)

f(a) = 1
f(a) = 1
f(b) = 1
f(b) = 0

qc;0
qc;1

f*(c) = 0, f*(b) = 0, f*(a) = 1

Generalizes to any length chain

Three Variables

Page 53: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

(Figure: the chain after the forward pass.)

f(a) = 1
f(a) = 1
f(b) = 1
f(b) = 0

qc;0
qc;1

f*(c) = 0, f*(b) = 0, f*(a) = 1

Dynamic Programming

Three Variables

Page 54: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Dynamic Programming

3 variables → 2 variables + book-keeping

n variables → (n-1) variables + book-keeping

Start from left, go to right

Reparameterize current edge (a,b):
Mab;k = mini { θa;i + θab;ik }
θ'b;k = θb;k + Mab;k
θ'ab;ik = θab;ik - Mab;k

Repeat

Page 55: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Dynamic Programming

Start from left, go to right

Reparameterize current edge (a,b):
Mab;k = mini { θa;i + θab;ik }
θ'b;k = θb;k + Mab;k
θ'ab;ik = θab;ik - Mab;k

Repeat

Messages → Message Passing

Why stop at dynamic programming?

Page 56: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

(Figure: the chain after the forward pass.)

Reparameterize the edge (c,b) as before

Three Variables

Page 57: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

(Figure: after reparameterizing edge (c,b) in the backward direction, the unary potentials of Vb are its min-marginals.)

Reparameterize the edge (c,b) as before

θ'b;i = qb;i

Three Variables

Page 58: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

(Figure: the chain after the backward step over edge (c,b).)

Reparameterize the edge (b,a) as before

Three Variables

Page 59: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

(Figure: after reparameterizing edge (b,a), the unary potentials of Va are its min-marginals.)

Reparameterize the edge (b,a) as before

θ'a;i = qa;i

Three Variables

Page 60: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

(Figure: the fully reparameterized chain.)

Forward Pass, then Backward Pass

All min-marginals are computed

Three Variables

Page 61: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Belief Propagation on Chains

Start from left, go to right

Reparameterize current edge (a,b):
Mab;k = mini { θa;i + θab;ik }
θ'b;k = θb;k + Mab;k
θ'ab;ik = θab;ik - Mab;k

Repeat till the end of the chain

Start from right, go to left

Repeat till the end of the chain
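Putting the two passes together, here is a sketch of min-sum belief propagation on a chain, run on the 4-variable example from the start of the lecture; after both passes every reparameterized unary potential equals the corresponding min-marginal (in particular qa;0 = 15 and qa;1 = 13, as on the earlier slides). Helper names are mine.

```python
# Min-sum belief propagation on a chain, written as reparameterization:
# forward pass (left to right), then backward pass (right to left).
# Potentials are those of the slides' 4-variable example; names are mine.
unary    = [[5, 2], [2, 4], [3, 6], [7, 3]]        # theta_{a;i} for Va, Vb, Vc, Vd
pairwise = [[[0, 1], [1, 0]],                      # theta_{ab;ik} for edges (a,b), (b,c), (c,d)
            [[1, 3], [2, 0]],
            [[0, 1], [4, 1]]]
L = 2
theta = [list(u) for u in unary]                   # working copies: these get reparameterized
edges = [[row[:] for row in p] for p in pairwise]

def reparam(src, dst, e, src_is_row):
    """M_{src->dst;k} = min_i (theta_src;i + edge term); add it to dst, subtract it from the edge."""
    for k in range(L):
        M = min(theta[src][i] + (e[i][k] if src_is_row else e[k][i]) for i in range(L))
        theta[dst][k] += M
        for i in range(L):
            if src_is_row:
                e[i][k] -= M
            else:
                e[k][i] -= M

for a in range(3):                                 # forward pass
    reparam(a, a + 1, edges[a], src_is_row=True)
print(min(theta[3]))                               # 13 = q*: min-marginal of the last variable

for a in range(2, -1, -1):                         # backward pass (uses the updated edges)
    reparam(a + 1, a, edges[a], src_is_row=False)
print(theta)               # [[15, 13], [13, 15], [13, 16], [16, 13]] = all min-marginals q_{a;i}
print([t.index(min(t)) for t in theta])            # [1, 0, 0, 1] = f* (minima are unique here)
```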

Page 62: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Belief Propagation on Chains

• A way of computing reparam constants

• Generalizes to chains of any length

• Forward Pass - Start to End
  • MAP estimate
  • Min-marginals of final variable

• Backward Pass - End to Start
  • All other min-marginals

Page 63: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Computational Complexity

• Each constant takes O(|L|)

• Number of constants - O(|E||L|)

O(|E||L|²)

• Memory required ?

O(|E||L|)

Page 64: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Belief Propagation on Trees

(Figure: a tree over variables Va, Vb, Vc, Vd, Ve, Vg, Vh.)

Forward Pass: Leaf → Root

Backward Pass: Root → Leaf

All min-marginals are computed

Page 65: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Outline

• Problem Formulation

• Reparameterization

• Energy Minimization for Trees

• Loopy Belief Propagation

Pearl, 1988; Murphy et al., 1999

Page 66: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Belief Propagation on Cycles

(Figure: a 4-cycle Va-Vb-Vc-Vd with unary potentials θa;i, θb;i, θc;i, θd;i.)

Where do we start? Arbitrarily

Reparameterize (a,b)

Page 67: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Belief Propagation on Cycles

(Figure: the cycle after reparameterizing edge (a,b); Vb now carries θ'b;0, θ'b;1.)

Potentials along the red path add up to 0

Page 68: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Belief Propagation on Cycles

(Figure: after also reparameterizing edge (b,c); Vc now carries θ'c;0, θ'c;1.)

Potentials along the red path add up to 0

Page 69: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Belief Propagation on Cycles

(Figure: after also reparameterizing edge (c,d); Vd now carries θ'd;0, θ'd;1.)

Potentials along the red path add up to 0

Page 70: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Belief Propagation on Cycles

(Figure: after reparameterizing edge (d,a), all four variables carry updated unary potentials.)

Potentials along the red path add up to 0

Page 71: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Belief Propagation on Cycles

(Figure: the cycle after going all the way around; the red path returns to Va.)

Potentials along the red path add up to 0

- θa;0
- θa;1

Page 72: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Belief Propagation on Cycles

(Figure: the original 4-cycle with unary potentials θa;i, θb;i, θc;i, θd;i.)

Any suggestions? Fix Va to label l0

Page 73: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Belief Propagation on Cycles

(Figure: the cycle with Va fixed to label l0; only θa;0 remains for Va.)

Any suggestions? Fix Va to label l0

Equivalent to a tree-structured problem

Page 74: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Belief Propagation on Cycles

(Figure: the cycle with Va fixed to label l1; only θa;1 remains for Va.)

Any suggestions? Fix Va to label l1

Equivalent to a tree-structured problem

Page 75: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Belief Propagation on Cycles

Choose the minimum energy solution

(Figure: the original 4-cycle again.)

This approach quickly becomes infeasible

Page 76: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Loopy Belief Propagation

V1 V2 V3

V4 V5 V6

V7 V8 V9

Keep reparameterizing edges in some order

Hope for convergence and a good solution
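A small sketch of loopy min-sum BP on the 3×3 grid, written in the usual message form (equivalent to repeatedly reparameterizing edges). The random unary values, the Potts pairwise cost and the fixed number of sweeps are placeholder choices of mine; as the slide says, convergence is only hoped for.

```python
# Loopy min-sum BP on a 3x3 grid, written in message form
# (equivalent to repeatedly reparameterizing edges).
import itertools, random
random.seed(0)

H = W = 3
L = 2
nodes = list(itertools.product(range(H), range(W)))
unary = {v: [random.randint(0, 5) for _ in range(L)] for v in nodes}   # placeholder potentials

def neighbours(v):
    y, x = v
    return [(y + dy, x + dx) for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1))
            if 0 <= y + dy < H and 0 <= x + dx < W]

def pair(i, k):                      # Potts pairwise potential (arbitrary choice)
    return 0 if i == k else 2

msg = {(a, b): [0.0] * L for a in nodes for b in neighbours(a)}
for _ in range(20):                  # fixed number of sweeps; convergence is not guaranteed
    for a in nodes:
        for b in neighbours(a):
            base = [unary[a][i] + sum(msg[(c, a)][i] for c in neighbours(a) if c != b)
                    for i in range(L)]
            m = [min(base[i] + pair(i, k) for i in range(L)) for k in range(L)]
            msg[(a, b)] = [x - min(m) for x in m]     # normalise to keep numbers small

belief = {a: [unary[a][i] + sum(msg[(c, a)][i] for c in neighbours(a)) for i in range(L)]
          for a in nodes}
print({a: belief[a].index(min(belief[a])) for a in nodes})   # one label per grid node
```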

Page 77: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Belief Propagation

• Generalizes to any arbitrary random field

• Complexity per iteration ?

O(|E||L|²)

• Memory required ?

O(|E||L|)

Page 78: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Theoretical Properties of BP

Exact for Trees Pearl, 1988

What about any general random field?

Run BP. Assume it converges.

Page 79: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Theoretical Properties of BP

Exact for Trees Pearl, 1988

What about any general random field?

Choose variables in a tree. Change their labels. Value of energy does not decrease

Page 80: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Theoretical Properties of BP

Exact for Trees Pearl, 1988

What about any general random field?

Choose variables in a cycle. Change their labels. Value of energy does not decrease

Page 81: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Theoretical Properties of BP

Exact for Trees Pearl, 1988

What about any general random field?

For cycles, if BP converges then exact MAP (Weiss and Freeman, 2001)

Page 82: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Speed-Ups for Special Cases

θab;ik = 0, if i = k
       = C, otherwise

Mab;k = mini { θa;i + θab;ik }

Felzenszwalb and Huttenlocher, 2004
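For this Potts model the inner minimisation splits into the 'same label' case and the 'any other label plus C' case, so each message costs O(|L|) instead of O(|L|²). A sketch with made-up numbers, checked against the naive computation.

```python
# Potts pairwise potential (0 if i == k, C otherwise):
# M_{ab;k} = min( theta_{a;k}, C + min_i theta_{a;i} )  -- O(|L|) per message.
theta_a = [4.0, 1.0, 3.0, 2.5]       # placeholder unary values
C = 2.0

m_all = min(theta_a)                                   # computed once
M_fast = [min(theta_a[k], C + m_all) for k in range(len(theta_a))]

# Naive O(|L|^2) version for comparison.
M_slow = [min(theta_a[i] + (0 if i == k else C) for i in range(len(theta_a)))
          for k in range(len(theta_a))]
assert M_fast == M_slow
print(M_fast)
```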

Page 83: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Speed-Ups for Special Cases

θab;ik = wab |i - k|

Mab;k = mini { θa;i + θab;ik }

Felzenszwalb and Huttenlocher, 2004
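For the linear cost, the message is a 1-D (min, +) distance transform and can be computed with one left-to-right and one right-to-left sweep, O(|L|) in total. A sketch with made-up numbers, checked against the naive computation.

```python
# Linear pairwise cost w_ab * |i - k|:
# M_{ab;k} = min_i { theta_{a;i} + w |i - k| } via two sweeps over the labels.
theta_a = [4.0, 1.0, 3.0, 2.5, 6.0]   # placeholder unary values
w = 0.5

M = list(theta_a)
for k in range(1, len(M)):            # left-to-right sweep
    M[k] = min(M[k], M[k - 1] + w)
for k in range(len(M) - 2, -1, -1):   # right-to-left sweep
    M[k] = min(M[k], M[k + 1] + w)

M_slow = [min(theta_a[i] + w * abs(i - k) for i in range(len(theta_a)))
          for k in range(len(theta_a))]
assert M == M_slow
print(M)
```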

Page 84: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Speed-Ups for Special Cases

θab;ik = min{ wab |i - k| , C }

Mab;k = mini { θa;i + θab;ik }

Felzenszwalb and Huttenlocher, 2004
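Truncation at C only adds a final clamp at C + mini θa;i to the linear message. A sketch continuing the previous example.

```python
# Truncated linear cost min{ w|i-k|, C }:
# M^trunc_{ab;k} = min( M^linear_{ab;k}, C + min_i theta_{a;i} ).
theta_a = [4.0, 1.0, 3.0, 2.5, 6.0]   # placeholder unary values
w, C = 0.5, 1.0

M = list(theta_a)
for k in range(1, len(M)):
    M[k] = min(M[k], M[k - 1] + w)
for k in range(len(M) - 2, -1, -1):
    M[k] = min(M[k], M[k + 1] + w)
M = [min(m, C + min(theta_a)) for m in M]              # apply the truncation

M_slow = [min(theta_a[i] + min(w * abs(i - k), C) for i in range(len(theta_a)))
          for k in range(len(theta_a))]
assert M == M_slow
print(M)
```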

Page 85: Probabilistic Inference Lecture 2 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online

Speed-Ups for Special Cases

θab;ik = min{ wab (i - k)² , C }

Mab;k = mini { θa;i + θab;ik }

Felzenszwalb and Huttenlocher, 2004
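For the (truncated) quadratic cost, the untruncated message is the lower envelope of |L| parabolas, computable in O(|L|) with the 1-D distance-transform routine of Felzenszwalb and Huttenlocher; truncation is again a final clamp at C + mini θa;i. A sketch of that standard routine with made-up numbers; names are mine.

```python
# Quadratic (optionally truncated) message via the lower envelope of parabolas.
def quadratic_message(theta_a, w, C=None):
    n = len(theta_a)
    INF = float('inf')
    v = [0] * n                      # label positions of the parabolas in the envelope
    z = [INF] * (n + 1)              # boundaries between consecutive parabolas
    z[0] = -INF
    k = 0
    for q in range(1, n):
        # intersection of the parabola rooted at q with the rightmost one in the envelope
        s = ((theta_a[q] + w * q * q) - (theta_a[v[k]] + w * v[k] * v[k])) / (2 * w * (q - v[k]))
        while s <= z[k]:
            k -= 1
            s = ((theta_a[q] + w * q * q) - (theta_a[v[k]] + w * v[k] * v[k])) / (2 * w * (q - v[k]))
        k += 1
        v[k] = q
        z[k] = s
        z[k + 1] = INF
    M, k = [0.0] * n, 0
    for q in range(n):               # read off the envelope
        while z[k + 1] < q:
            k += 1
        M[q] = w * (q - v[k]) ** 2 + theta_a[v[k]]
    if C is not None:                # truncated version min{ w(i-k)^2, C }
        M = [min(m, C + min(theta_a)) for m in M]
    return M

theta_a = [4.0, 1.0, 3.0, 2.5, 6.0]   # placeholder unary values
print(quadratic_message(theta_a, w=1.0, C=2.0))
# agrees with the naive O(|L|^2) computation min_i theta_a[i] + min(w*(i-k)**2, C)
```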