
The Factor Graph Approach to Model-Based Signal Processing


Presentation Transcript


  1. The Factor Graph Approach to Model-Based Signal Processing Hans-Andrea Loeliger

  2. Outline • Introduction • Factor graphs • Gaussian message passing in linear models • Beyond Gaussians • Conclusion

  3. Outline • Introduction • Factor graphs • Gaussian message passing in linear models • Beyond Gaussians • Conclusion

  4. Introduction • Engineers like graphical notation • It allows a wealth of nontrivial algorithms to be composed from tabulated “local” computational primitives

  5. Outline • Introduction • Factor graphs • Gaussian message passing in linear models • Beyond Gaussians • Conclusion

  6. Factor Graphs • A factor graph represents the factorization of a function of several variables • Forney-style factor graphs (FFGs) are used throughout

  7. Factor Graphs cont’d • Example:
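
The slide’s example is a figure that is not reproduced in the transcript. As a generic illustration (an assumed example, not the original figure), a function of five variables that factors as

$$ f(u, w, x, y, z) = f_1(u, w, x)\, f_2(x, y, z)\, f_3(z) $$

is drawn as an FFG with one node per factor and one edge (or half-edge) per variable; an edge is attached to a node exactly when the variable appears in that factor.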

  8. Factor Graphs cont’d • (a) Forney-style factor graph (FFG); (b) factor graph as in [3]; (c) Bayesian network; (d) Markov random field (MRF)

  9. Factor Graphs cont’d • Advantages of FFGs: • suited for hierarchical modeling • compatible with standard block diagrams • simplest formulation of the summary-product message update rule • natural setting for Forney’s results on Fourier transforms and duality

  10. Auxiliary Variables • Let Y1 and Y2 be two independent observations of X:
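
The slide’s factorization is not shown in the transcript; for two independent observations Y1 and Y2 of X it is presumably of the form

$$ p(x, y_1, y_2) = p(x)\, p(y_1 \mid x)\, p(y_2 \mid x). $$

Since X then appears in three factors, while an FFG edge can be attached to at most two nodes, auxiliary variables X′ and X″ are introduced via an equality-constraint factor $\delta(x - x')\,\delta(x - x'')$, which leaves the marginals unchanged.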

  11. Modularity and Special Symbols • Let Y1 = X + Z1 and Y2 = X + Z2 with Z1, Z2, and X independent • The “+”-nodes represent the factors δ(y1 − x − z1) and δ(y2 − x − z2)

  12. Outline • Introduction • Factor graphs • Gaussian message passing in linear models • Beyond Gaussians • Conclusion

  13. Computing Marginals • Assume we wish to compute the marginal of a single variable, say x1 • For example, assume that the global function f can be written as a product of factors, each depending only on a subset of the variables
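
The slide’s formulas are not reproduced; in the notation of [1], [2] they are presumably of the form

$$ \bar f(x_1) \;\triangleq\; \sum_{x_2} \cdots \sum_{x_n} f(x_1, \ldots, x_n), $$

and if, for example, $f(x_1, x_2, x_3, x_4) = f_1(x_1, x_2)\, f_2(x_2, x_3, x_4)$, the distributive law allows the sums to be carried out locally:

$$ \bar f(x_1) = \sum_{x_2} f_1(x_1, x_2) \Bigl( \sum_{x_3} \sum_{x_4} f_2(x_2, x_3, x_4) \Bigr). $$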

  14. Computing Marginals cont’d

  15. Message Passing View cont’d

  16. Sum-Product Rule • The message out of a node/factor along some edge is formed as the product of that factor and all incoming messages along all of its other edges, summed over all involved variables except the one on the outgoing edge
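
In formulas, for a generic node $f$ with edges $X, Y_1, \ldots, Y_n$ (cf. [1]):

$$ \overrightarrow{\mu}_X(x) \;=\; \sum_{y_1} \cdots \sum_{y_n} f(x, y_1, \ldots, y_n)\; \overrightarrow{\mu}_{Y_1}(y_1) \cdots \overrightarrow{\mu}_{Y_n}(y_n), $$

with integrals replacing the sums for continuous variables.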

  17. Arrows and Notation for Messages • →μ denotes the message flowing in the direction of the edge’s arrow • ←μ denotes the message flowing in the opposite direction

  18. Marginals and Output Edges

  19. Max-Product Rule • The message out of a node/factor along some edge is formed as the product of that factor and all incoming messages along all of its other edges, maximized over all involved variables except the one on the outgoing edge
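
In formulas, the sums of the sum-product rule are simply replaced by maximizations:

$$ \overrightarrow{\mu}_X(x) \;=\; \max_{y_1} \cdots \max_{y_n} f(x, y_1, \ldots, y_n)\; \overrightarrow{\mu}_{Y_1}(y_1) \cdots \overrightarrow{\mu}_{Y_n}(y_n). $$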

  20. Scalar Gaussian Messages • Messages are scalar Gaussians of the form shown below • Arrow notation: →μ / ←μ is parameterized by mean →m / ←m and variance →σ² / ←σ²
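
The Gaussian form itself:

$$ \mu(x) \;\propto\; \exp\!\left( -\frac{(x - m)^2}{2\sigma^2} \right). $$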

  21. Scalar Gaussian Computation Rules
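
The slide’s table of computation rules is not reproduced in the transcript. The following is a minimal sketch of the two basic scalar rules, assuming messages are represented as (mean, variance) pairs; the function names are illustrative, not from the slides.

```python
def add_node(msg_x, msg_y):
    """Message through a '+' node (Z = X + Y): means add, variances add."""
    (mx, vx), (my, vy) = msg_x, msg_y
    return (mx + my, vx + vy)

def equality_node(msg_x, msg_y):
    """Message through an '=' node: combine in the precision (1/variance) domain."""
    (mx, vx), (my, vy) = msg_x, msg_y
    w = 1.0 / vx + 1.0 / vy                  # precisions add
    m = (mx / vx + my / vy) / w              # precision-weighted mean
    return (m, 1.0 / w)

# Example: fusing two noisy observations of the same quantity via '=':
print(equality_node((1.0, 4.0), (3.0, 4.0)))  # -> (2.0, 2.0)
```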

  22. Vector Gaussian Messages • Messages are multivariate Gaussians of the form shown below • Each message is parameterized • either by the mean vector m and the covariance matrix V = W⁻¹ • or by W and Wm
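
The vector Gaussian form:

$$ \mu(x) \;\propto\; \exp\!\left( -\tfrac{1}{2} (x - m)^{\mathsf T} W (x - m) \right), \qquad W = V^{-1}. $$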

  23. Vector Gaussian Messages cont’d • Arrow notation: →μ is parameterized by →m and →V, or by →W and →W·→m (and analogously for ←μ) • Marginal: the product →μ·←μ is the Gaussian with the mean and covariance matrix given below
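
Combining the two messages on an edge X (a product of two Gaussians):

$$ W_X = \overrightarrow{W}_X + \overleftarrow{W}_X, \qquad W_X\, m_X = \overrightarrow{W}_X\, \overrightarrow{m}_X + \overleftarrow{W}_X\, \overleftarrow{m}_X, \qquad V_X = W_X^{-1}. $$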

  24. Single Edge Quantities

  25. Elementary Nodes
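
The slide’s tables are not reproduced; the standard vector Gaussian rules for the two elementary nodes (cf. the tables in [1]) are:

$$ Z = X + Y: \qquad \overrightarrow{m}_Z = \overrightarrow{m}_X + \overrightarrow{m}_Y, \qquad \overrightarrow{V}_Z = \overrightarrow{V}_X + \overrightarrow{V}_Y, $$

$$ X = Y = Z \ (\text{equality node}): \qquad \overrightarrow{W}_Z = \overrightarrow{W}_X + \overrightarrow{W}_Y, \qquad \overrightarrow{W}_Z\, \overrightarrow{m}_Z = \overrightarrow{W}_X\, \overrightarrow{m}_X + \overrightarrow{W}_Y\, \overrightarrow{m}_Y. $$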

  26. Matrix Multiplication Node
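
For a matrix multiplication node representing $Y = AX$, the standard rules (cf. [1]) are

$$ \overrightarrow{m}_Y = A\, \overrightarrow{m}_X, \qquad \overrightarrow{V}_Y = A\, \overrightarrow{V}_X A^{\mathsf T} $$

in the forward direction and, in the W-parameterization,

$$ \overleftarrow{W}_X = A^{\mathsf T} \overleftarrow{W}_Y A, \qquad \overleftarrow{W}_X\, \overleftarrow{m}_X = A^{\mathsf T} \overleftarrow{W}_Y\, \overleftarrow{m}_Y $$

in the backward direction.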

  27. Composite Blocks

  28. Reversing a Matrix Multiplication

  29. Combinations

  30. General Linear State Space Model

  31. General Linear State Space Model cont’d • If the state transition matrix is nonsingular, one combination of forward and backward message parameterizations is convenient • If it is singular, a different combination of forward and backward parameterizations is used

  32. General Linear State Space Model cont’d • By combining the forward recursion with the backward recursion, we obtain the marginals of the state variables (a sketch of the forward pass follows)
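
As an illustration of what the forward recursion computes, here is a minimal sketch of forward Gaussian message passing through a generic linear state-space model X_k = A X_{k-1} + B U_k, Y_k = C X_k + Z_k with U_k ~ N(0, Q) and Z_k ~ N(0, R); this coincides with the Kalman filter. All matrix and function names are assumed placeholders, not taken from the slides.

```python
import numpy as np

def forward_gaussian_messages(A, B, C, Q, R, m0, V0, ys):
    """Forward Gaussian message passing through the state-space model:
    returns the forward means and covariances after each observation."""
    m, V = m0, V0
    messages = []
    for y in ys:
        # Through the A-node and the '+' node with B*U_k: prediction step.
        m = A @ m
        V = A @ V @ A.T + B @ Q @ B.T
        # Incorporate the observation Y_k = y (equality node + C-node branch):
        S = C @ V @ C.T + R                    # innovation covariance
        K = V @ C.T @ np.linalg.inv(S)         # Kalman gain
        m = m + K @ (y - C @ m)
        V = V - K @ C @ V
        messages.append((m, V))
    return messages
```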

  33. Gaussian to Binary

  34. Outline • Introduction • Factor graphs • Gaussian message passing in linear models • Beyond Gaussians • Conclusion

  35. Message Types • A key issue with all message passing algorithms is the representation of messages for continuous variables • The following message types are widely applicable • Quantization of continuous variables • Function value and gradient • List of samples

  36. Message Types cont’d • All these message types, and many different message computation rules, can coexist in large system models • Steepest descent (SD) and expectation maximization (EM) are two examples of message computation rules beyond the sum-product and max-product rules

  37. LSSM with Unknown Vector C

  38. Steepest Descent as Message Passing • Suppose we wish to find the maximum of some function f(θ) over the parameter θ

  39. Steepest Descent as Message Passing cont’d • Steepest descent: the estimate is updated along the gradient as shown below, where s is a positive step-size parameter
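
The update equation is missing from the transcript; it is presumably of the standard form (ascending $\log f$)

$$ \hat\theta^{(k+1)} = \hat\theta^{(k)} + s\, \nabla_\theta \log f(\theta) \Big|_{\theta = \hat\theta^{(k)}}, $$

which seeks a local maximum of $f$.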

  40. Steepest Descent as Message Passing cont’d • Gradient messages:

  41. Steepest Descent as Message Passing cont’d

  42. Outline • Introduction • Factor graphs • Gaussian message passing in linear models • Beyond Gaussians • Conclusion

  43. Conclusion • The factor graph approach to signal processing involves the following steps: • Choose a factor graph to represent the system model • Choose the message types and suitable message computation rules • Choose a message update schedule

  44. References
  [1] H.-A. Loeliger et al., “The factor graph approach to model-based signal processing,” Proceedings of the IEEE, vol. 95, no. 6, pp. 1295–1322, June 2007.
  [2] H.-A. Loeliger, “An introduction to factor graphs,” IEEE Signal Processing Magazine, Jan. 2004, pp. 28–41.
  [3] F. R. Kschischang, B. J. Frey, and H.-A. Loeliger, “Factor graphs and the sum-product algorithm,” IEEE Trans. Information Theory, vol. 47, pp. 498–519, Feb. 2001.
