Probabilistic Inference Lecture 5 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online http://cvc.centrale-ponts.fr/personnel/pawan/
What to Expect in the Final Exam • Open Book • Textbooks • Research Papers • Course Slides • No Electronic Devices • Easy Questions – 10 points • Hard Questions – 10 points
Easy Question – BP Compute the reparameterization constants for (a,b) and (c,b) such that the unary potentials of Vb are equal to its min-marginals. (The slide shows a chain Va – Vb – Vc with its unary and pairwise potential values.)
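The question above can be sketched in code: on a small chain, the reparameterization constants are exactly the min-sum messages sent into Vb, and adding them to Vb's unary potentials yields its min-marginals. The potentials below are illustrative (the slide's own figure is not reproduced here), and the variable names are mine.

```python
# Min-sum belief propagation on a 3-node chain Va - Vb - Vc with binary labels.
# Illustrative potentials, not the ones from the slide. The messages from Va
# and Vc into Vb are the reparameterization constants that turn Vb's unary
# potentials into its min-marginals.
unary = {'a': [5, -2], 'b': [2, -1], 'c': [4, -3]}      # unary[v][label]
pair_ab = [[0, 3], [3, 0]]                               # pairwise (Va,Vb)
pair_cb = [[0, 3], [3, 0]]                               # pairwise (Vc,Vb)

# Message (reparameterization constant) into Vb from each neighbour:
# M_ab;k = min_i ( unary_a;i + pairwise_ab;ik )
msg_a = [min(unary['a'][i] + pair_ab[i][k] for i in range(2)) for k in range(2)]
msg_c = [min(unary['c'][i] + pair_cb[i][k] for i in range(2)) for k in range(2)]

# After reparameterization, Vb's unary potentials equal its min-marginals.
min_marg_b = [unary['b'][k] + msg_a[k] + msg_c[k] for k in range(2)]

# Check against brute force over all labelings.
def energy(la, lb, lc):
    return (unary['a'][la] + unary['b'][lb] + unary['c'][lc]
            + pair_ab[la][lb] + pair_cb[lc][lb])

brute = [min(energy(la, lb, lc) for la in range(2) for lc in range(2))
         for lb in range(2)]
print(min_marg_b, brute)   # the two lists agree
```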
Hard Question – BP Provide an O(h) algorithm to compute the reparameterization constants of BP for an edge whose pairwise potentials are specified by a truncated linear model.
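A sketch of the intended answer: for a truncated linear model θ(i,k) = λ·min(|i−k|, M), the message can be computed in O(h) with a forward and a backward pass (a distance-transform argument), followed by a single min with the truncation cap. The function name and test values are mine, not from the course.

```python
# O(h) min-sum message for a truncated linear pairwise model,
# theta(i, k) = lam * min(|i - k|, M). Two linear passes compute the lower
# envelope of the untruncated linear model; truncation is a final min with
# the constant min(h_vals) + lam * M.

def truncated_linear_message(h_vals, lam, M):
    h = len(h_vals)
    msg = list(h_vals)
    # forward pass: msg[k] = min(msg[k], msg[k-1] + lam)
    for k in range(1, h):
        msg[k] = min(msg[k], msg[k - 1] + lam)
    # backward pass: msg[k] = min(msg[k], msg[k+1] + lam)
    for k in range(h - 2, -1, -1):
        msg[k] = min(msg[k], msg[k + 1] + lam)
    # truncation: no message can exceed min(h_vals) + lam * M
    cap = min(h_vals) + lam * M
    return [min(m, cap) for m in msg]

# brute-force O(h^2) check
def brute(h_vals, lam, M):
    h = len(h_vals)
    return [min(h_vals[i] + lam * min(abs(i - k), M) for i in range(h))
            for k in range(h)]

vals = [3, 9, 1, 7, 5, 0, 8]
assert truncated_linear_message(vals, 2, 2) == brute(vals, 2, 2)
```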
Easy Question – Minimum Cut Provide the graph corresponding to the MAP estimation problem in the following MRF. -2 2 6 -6 12 -3 -2 -1 -4 5 -3 -5 9 5 Vb Vc Va
Hard Question – Minimum Cut Show that the expansion algorithm provides a bound of 2M for the truncated linear metric, where M is the value of the truncation.
Easy Question – Relaxations Using an example, show that the LP-S relaxation is not tight for a frustrated cycle (cycle with an odd number of supermodular pairwise potentials).
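A minimal numeric sketch of the phenomenon the question asks about, with potentials of my own choosing: three binary variables in a cycle whose edges all penalize agreement (three supermodular edges, an odd number). Every integer labelling pays at least one agreeing edge, but the fractional point y = 1/2 with anti-diagonal pairwise pseudomarginals is LP-S feasible and pays nothing, so the relaxation is not tight.

```python
# Frustrated 3-cycle: unaries are zero, every edge penalises agreement.
from itertools import product

edges = [(0, 1), (1, 2), (2, 0)]
theta_pair = lambda i, k: 1.0 if i == k else 0.0   # cost 1 for agreeing labels

# integer optimum: by parity at least one edge of the cycle must agree
int_opt = min(sum(theta_pair(lab[a], lab[b]) for a, b in edges)
              for lab in product([0, 1], repeat=3))

# fractional LP-S point: y_{a;i} = 1/2, y_{ab;ik} = 1/2 iff i != k
y_node = 0.5
y_pair = {(i, k): (0.5 if i != k else 0.0) for i, k in product([0, 1], repeat=2)}

# marginalisation constraints  sum_k y_{ab;ik} = y_{a;i}  hold:
assert all(abs(sum(y_pair[(i, k)] for k in (0, 1)) - y_node) < 1e-9
           for i in (0, 1))

lp_value = sum(theta_pair(i, k) * y_pair[(i, k)]
               for _ in edges for i in (0, 1) for k in (0, 1))
print(int_opt, lp_value)   # integer optimum 1, LP-S value 0
```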
Hard Question – Relaxations Prove or disprove that the LP-S and SOCP-MS relaxations are invariant to reparameterization.
Integer Programming Formulation min ∑a ∑i θa;i ya;i + ∑(a,b) ∑ik θab;ik yab;ik ya;i ∈ {0,1} ∑i ya;i = 1 yab;ik = ya;i yb;k
Integer Programming Formulation min θᵀy ya;i ∈ {0,1} ∑i ya;i = 1 yab;ik = ya;i yb;k θ = [ … θa;i … ; … θab;ik … ] y = [ … ya;i … ; … yab;ik … ]
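The vector encoding above can be made concrete: pack the unary and pairwise potentials into a single vector θ, build the indicator vector y from a labelling using the constraint yab;ik = ya;i yb;k, and recover the energy as the dot product θᵀy. Two binary variables and one edge; the numbers are illustrative, not from the slides.

```python
# Pack theta and y as [... theta_a;i ... ; ... theta_ab;ik ...] and check
# that the dot product theta^T y equals the energy of the labelling.
from itertools import product

h = 2                                   # labels per variable
unary = [[2.0, -1.0], [0.5, 3.0]]       # theta_{a;i}
pair = [[0.0, 4.0], [4.0, 0.0]]         # theta_{ab;ik}

def vectors(labelling):
    theta, y = [], []
    for a in range(2):
        for i in range(h):
            theta.append(unary[a][i])
            y.append(1.0 if labelling[a] == i else 0.0)
    for i, k in product(range(h), repeat=2):
        theta.append(pair[i][k])
        # the IP constraint y_{ab;ik} = y_{a;i} y_{b;k}
        y.append((1.0 if labelling[0] == i else 0.0) *
                 (1.0 if labelling[1] == k else 0.0))
    return theta, y

theta, y = vectors([0, 1])
energy = sum(t * v for t, v in zip(theta, y))
# direct evaluation agrees: theta_{a;0} + theta_{b;1} + theta_{ab;01}
assert energy == unary[0][0] + unary[1][1] + pair[0][1]
print(energy)   # 2.0 + 3.0 + 4.0 = 9.0
```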
Linear Programming Relaxation min θᵀy ya;i ∈ {0,1} ∑i ya;i = 1 yab;ik = ya;i yb;k Two reasons why we can’t solve this
Linear Programming Relaxation min θᵀy ya;i ∈ [0,1] ∑i ya;i = 1 yab;ik = ya;i yb;k One reason why we can’t solve this
Linear Programming Relaxation min θᵀy ya;i ∈ [0,1] ∑i ya;i = 1 ∑k yab;ik = ∑k ya;i yb;k One reason why we can’t solve this
Linear Programming Relaxation min θᵀy ya;i ∈ [0,1] ∑i ya;i = 1 ∑k yab;ik = ya;i ∑k yb;k ∑k yb;k = 1 One reason why we can’t solve this
Linear Programming Relaxation min θᵀy ya;i ∈ [0,1] ∑i ya;i = 1 ∑k yab;ik = ya;i One reason why we can’t solve this
Linear Programming Relaxation min θᵀy ya;i ∈ [0,1] ∑i ya;i = 1 ∑k yab;ik = ya;i No reason why we can’t solve this* *memory requirements, time complexity
Dual of the LP Relaxation Wainwright et al., 2001 (The slide shows the 3×3 grid MRF over Va … Vi decomposed into six trees: row chains 1: Va Vb Vc, 2: Vd Ve Vf, 3: Vg Vh Vi and column chains 4: Va Vd Vg, 5: Vb Ve Vh, 6: Vc Vf Vi.) The tree potentials satisfy ∑i θi = θ
Dual of the LP Relaxation Wainwright et al., 2001 Each tree i has optimal value q*(θi), i.e. q*(θ1) … q*(θ6) for the row and column chains. Dual of LP: max ∑i q*(θi) subject to ∑i θi = θ
Dual of the LP Relaxation Wainwright et al., 2001 max ∑i q*(θi) subject to ∑i θi = θ I can easily compute q*(θi) I can easily maintain the reparameterization constraint So can I easily solve the dual?
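The dual objective above lower-bounds the true minimum energy: for any split θ1 + θ2 = θ into trees, q*(θ1) + q*(θ2) ≤ minᵧ θᵀy, since each labelling's energy splits across the trees. A sketch with a 3-cycle split into a chain and a single edge; the potentials and the particular split (halving the shared unaries) are my own illustrative choices.

```python
# Weak duality check: split a 3-cycle's potentials into two trees so that
# theta1 + theta2 = theta, and verify q*(theta1) + q*(theta2) <= min energy.
from itertools import product

h = 2
unary = [[1.0, 4.0], [2.0, 0.0], [3.0, 1.0]]           # Va, Vb, Vc
pair = {(0, 1): [[0, 2], [2, 0]],
        (1, 2): [[0, 2], [2, 0]],
        (2, 0): [[0, 2], [2, 0]]}

def tree_min(unaries, edges):
    """Brute-force optimal value q* of one tree (tiny, so enumeration is fine)."""
    best = float('inf')
    for lab in product(range(h), repeat=3):
        e = sum(u[lab[v]] for v, u in unaries.items())
        e += sum(pair[edge][lab[edge[0]]][lab[edge[1]]] for edge in edges)
        best = min(best, e)
    return best

# split: tree 1 = chain Va-Vb-Vc with half of each shared unary,
#        tree 2 = edge (Vc,Va) with the other halves; Vb's unary goes to tree 1
half = lambda u: [x / 2 for x in u]
q1 = tree_min({0: half(unary[0]), 1: unary[1], 2: half(unary[2])},
              [(0, 1), (1, 2)])
q2 = tree_min({0: half(unary[0]), 2: half(unary[2])}, [(2, 0)])

true_min = tree_min({0: unary[0], 1: unary[1], 2: unary[2]},
                    [(0, 1), (1, 2), (2, 0)])
assert q1 + q2 <= true_min + 1e-9
print(q1 + q2, true_min)   # 4.5 5.0
```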
Outline • TRW Message Passing • Dual Decomposition
Things to Remember • BP is exact for trees • Every iteration provides a reparameterization • Forward-pass computes min-marginals of root
TRW Message Passing Kolmogorov, 2006 max ∑i q*(θi) subject to ∑i θi = θ (The slide shows the grid decomposed into row trees 1–3 and column trees 4–6.) Pick a variable, say Va (it appears in row tree 1: Va Vb Vc and in column tree 4: Va Vd Vg)
TRW Message Passing Kolmogorov, 2006 θ1c;1 θ1b;1 θ1a;1 θ4a;1 θ4d;1 θ4g;1 θ1c;0 θ1b;0 θ1a;0 θ4a;0 θ4d;0 θ4g;0 Vc Vb Va Va Vd Vg max ∑i q*(θi)
TRW Message Passing Kolmogorov, 2006 θ1c;1 θ1b;1 θ1a;1 θ4a;1 θ4d;1 θ4g;1 θ1c;0 θ1b;0 θ1a;0 θ4a;0 θ4d;0 θ4g;0 Vc Vb Va Va Vd Vg Reparameterize to obtain min-marginals of Va q*(θ1) + q*(θ4) + K θ1 + θ4 + rest
TRW Message Passing Kolmogorov, 2006 θ’1c;1 θ’1b;1 θ’1a;1 θ’4a;1 θ’4d;1 θ’4g;1 θ’1c;0 θ’1b;0 θ’1a;0 θ’4a;0 θ’4d;0 θ’4g;0 Vc Vb Va Va Vd Vg One pass of Belief Propagation q*(θ’1) + q*(θ’4) + K θ’1 + θ’4 + rest
TRW Message Passing Kolmogorov, 2006 θ’1c;1 θ’1b;1 θ’1a;1 θ’4a;1 θ’4d;1 θ’4g;1 θ’1c;0 θ’1b;0 θ’1a;0 θ’4a;0 θ’4d;0 θ’4g;0 Vc Vb Va Va Vd Vg Remain the same q*(θ’1) + q*(θ’4) + K θ’1 + θ’4 + rest
TRW Message Passing Kolmogorov, 2006 θ’1c;1 θ’1b;1 θ’1a;1 θ’4a;1 θ’4d;1 θ’4g;1 θ’1c;0 θ’1b;0 θ’1a;0 θ’4a;0 θ’4d;0 θ’4g;0 Vc Vb Va Va Vd Vg min{θ’1a;0, θ’1a;1} + min{θ’4a;0, θ’4a;1} + K θ’1 + θ’4 + rest
TRW Message Passing Kolmogorov, 2006 θ’1c;1 θ’1b;1 θ’1a;1 θ’4a;1 θ’4d;1 θ’4g;1 θ’1c;0 θ’1b;0 θ’1a;0 θ’4a;0 θ’4d;0 θ’4g;0 Vc Vb Va Va Vd Vg Compute average of min-marginals of Va min{θ’1a;0, θ’1a;1} + min{θ’4a;0, θ’4a;1} + K θ’1 + θ’4 + rest
TRW Message Passing Kolmogorov, 2006 θ’1c;1 θ’1b;1 θ’1a;1 θ’4a;1 θ’4d;1 θ’4g;1 θ’1c;0 θ’1b;0 θ’1a;0 θ’4a;0 θ’4d;0 θ’4g;0 Vc Vb Va Va Vd Vg θ’’a;0 = (θ’1a;0 + θ’4a;0)/2 θ’’a;1 = (θ’1a;1 + θ’4a;1)/2 min{θ’1a;0, θ’1a;1} + min{θ’4a;0, θ’4a;1} + K θ’1 + θ’4 + rest
TRW Message Passing Kolmogorov, 2006 θ’1c;1 θ’1b;1 θ’’a;1 θ’’a;1 θ’4d;1 θ’4g;1 θ’1c;0 θ’1b;0 θ’’a;0 θ’’a;0 θ’4d;0 θ’4g;0 Vc Vb Va Va Vd Vg θ’’a;0 = (θ’1a;0 + θ’4a;0)/2 θ’’a;1 = (θ’1a;1 + θ’4a;1)/2 min{θ’1a;0, θ’1a;1} + min{θ’4a;0, θ’4a;1} + K θ’’1 + θ’’4 + rest
TRW Message Passing Kolmogorov, 2006 θ’1c;1 θ’1b;1 θ’’a;1 θ’’a;1 θ’4d;1 θ’4g;1 θ’1c;0 θ’1b;0 θ’’a;0 θ’’a;0 θ’4d;0 θ’4g;0 Vc Vb Va Va Vd Vg 2 min{θ’’a;0, θ’’a;1} + K θ’’1 + θ’’4 + rest
TRW Message Passing Kolmogorov, 2006 θ’1c;1 θ’1b;1 θ’’a;1 θ’’a;1 θ’4d;1 θ’4g;1 θ’1c;0 θ’1b;0 θ’’a;0 θ’’a;0 θ’4d;0 θ’4g;0 Vc Vb Va Va Vd Vg min{p1, q1} + min{p2, q2} ≤ min{p1 + p2, q1 + q2} 2 min{θ’’a;0, θ’’a;1} + K θ’’1 + θ’’4 + rest
TRW Message Passing Kolmogorov, 2006 θ’1c;1 θ’1b;1 θ’’a;1 θ’’a;1 θ’4d;1 θ’4g;1 θ’1c;0 θ’1b;0 θ’’a;0 θ’’a;0 θ’4d;0 θ’4g;0 Vc Vb Va Va Vd Vg Objective function increases or remains constant 2 min{θ’’a;0, θ’’a;1} + K θ’’1 + θ’’4 + rest
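The inequality behind that conclusion, min{p1, q1} + min{p2, q2} ≤ min{p1 + p2, q1 + q2}, follows because the left side is at most p1 + p2 and at most q1 + q2. A quick randomized spot-check:

```python
# Why node-averaging cannot decrease the bound: for any numbers,
# min{p1, q1} + min{p2, q2} <= min{p1 + p2, q1 + q2}.
import random

random.seed(0)
for _ in range(1000):
    p1, p2, q1, q2 = (random.uniform(-10, 10) for _ in range(4))
    assert min(p1, q1) + min(p2, q2) <= min(p1 + p2, q1 + q2) + 1e-12
print("inequality holds on all samples")
```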
TRW Message Passing Kolmogorov, 2006 • Initialize the θi, taking care of the reparameterization constraint • REPEAT: choose a random variable Va; compute the min-marginals of Va for all trees; node-average the min-marginals (edge-averaging is also possible)
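The loop above can be sketched on a toy problem: a 3-cycle split into two trees (a chain Va-Vb-Vc and the edge Vc-Va). Min-marginals are computed by brute force since the trees are tiny, and node-averaging shifts each tree's unary at the picked variable so that its min-marginal becomes the average, which preserves θ1 + θ2 = θ. All potentials are illustrative; this is a sketch of the averaging idea, not Kolmogorov's sequential TRW-S schedule.

```python
# Toy TRW loop: the dual sum q*(theta1) + q*(theta2) never decreases.
from itertools import product

h = 2
pair = [[0.0, 3.0], [3.0, 0.0]]
trees = [
    {'unary': {0: [5.0, 0.0], 1: [2.0, 0.0], 2: [1.5, 0.5]},
     'edges': [(0, 1), (1, 2)]},                       # chain Va-Vb-Vc
    {'unary': {0: [-4.0, 4.0], 2: [1.5, 0.5]},
     'edges': [(2, 0)]},                               # edge Vc-Va
]

def energies(tree):
    """Energy of every labelling of the three variables under one tree."""
    out = {}
    for lab in product(range(h), repeat=3):
        e = sum(u[lab[v]] for v, u in tree['unary'].items())
        e += sum(pair[lab[a]][lab[b]] for a, b in tree['edges'])
        out[lab] = e
    return out

def min_marginal(tree, v):
    es = energies(tree)
    return [min(e for lab, e in es.items() if lab[v] == i) for i in range(h)]

def dual():
    return sum(min(energies(t).values()) for t in trees)

bound = dual()
for step in range(6):
    v = (0, 2)[step % 2]                       # Va and Vc sit in both trees
    mm = [min_marginal(t, v) for t in trees]
    avg = [sum(m[i] for m in mm) / len(mm) for i in range(h)]
    for t, m in zip(trees, mm):                # node-average the min-marginals
        for i in range(h):
            t['unary'][v][i] += avg[i] - m[i]  # adjustments sum to zero
    new_bound = dual()
    assert new_bound >= bound - 1e-9           # dual never decreases
    bound = new_bound
print(bound)
```

On this instance the bound climbs from -2.0 to 5.0, which is also the minimum energy of the full cycle, so the two trees reach agreement.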
Example 1 Pick variable Va. Reparameterize. (The slide shows two trees over Va, Vb, Vc with their unary and pairwise potential values and per-tree bounds.)
Example 1 Average the min-marginals of Va
Example 1 Pick variable Vb. Reparameterize.
Example 1 Average the min-marginals of Vb
Example 1 Value of dual does not increase
Example 1 Maybe it will increase for Vc NO
Example 1 f1(a) = 0 f1(b) = 0 f2(b) = 0 f2(c) = 0 f3(c) = 0 f3(a) = 0 Strong Tree Agreement Exact MAP Estimate
Example 2 Pick variable Va. Reparameterize. (The slide shows two trees over Va, Vb, Vc with their unary and pairwise potential values and per-tree bounds.)
Example 2 Average the min-marginals of Va
Example 2 Value of dual does not increase
Example 2 Maybe it will increase for Vb or Vc NO
Example 2 f1(a) = 1 f1(b) = 1 f2(b) = 1 f2(c) = 0 f3(c) = 1 f3(a) = 1 f2(b) = 0 f2(c) = 1 Weak Tree Agreement Not Exact MAP Estimate
Example 2 f1(a) = 1 f1(b) = 1 f2(b) = 1 f2(c) = 0 f3(c) = 1 f3(a) = 1 f2(b) = 0 f2(c) = 1 Weak Tree Agreement Convergence point of TRW