Probabilistic Graphical Models: Inference
Message Passing: Loopy BP and Message Decoding
Message Coding & Decoding
A message U1, …, Uk is mapped by an encoder to a codeword X1, …, Xn (rate = k/n), which is transmitted over a noisy channel. The receiver observes Y1, …, Yn, and a decoder produces the reconstruction V1, …, Vk. The bit error rate is the probability that a decoded bit differs from the corresponding message bit, P(Vi ≠ Ui).
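To make the pipeline concrete, here is a minimal simulation sketch (not from the slides): a rate-1/3 repetition code sent over a binary symmetric channel, with the bit error rate measured empirically at the decoder. The encode/channel/decode functions are illustrative stand-ins for a real code.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(u):
    """Rate-1/3 repetition code: each message bit is sent 3 times (n = 3k)."""
    return np.repeat(u, 3)

def channel(x, p=0.1):
    """Binary symmetric channel: flip each bit independently with probability p."""
    return x ^ (rng.random(x.size) < p).astype(int)

def decode(y):
    """Majority vote over each group of 3 received bits."""
    return (y.reshape(-1, 3).sum(axis=1) >= 2).astype(int)

k = 100_000
u = rng.integers(0, 2, k)          # U_1, ..., U_k
x = encode(u)                      # X_1, ..., X_n, rate = k/n = 1/3
v = decode(channel(x))             # V_1, ..., V_k
print("bit error rate:", np.mean(v != u))   # ~ 3p^2(1-p) + p^3 = 0.028
```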
Channel Capacity
[Figure: example noisy channels. Binary symmetric channel: each bit arrives correctly with probability 0.9 and flipped with probability 0.1; capacity = 0.531 bits per use. Binary erasure channel: each bit arrives intact with probability 0.9 and is erased (?) with probability 0.1; capacity = 0.9 bits per use. A channel whose output is independent of its input (e.g., flip probability 0.5) has capacity = 0.]
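The capacities quoted above follow from the standard formulas C = 1 - H(p) for the binary symmetric channel and C = 1 - ε for the binary erasure channel; a quick check:

```python
import numpy as np

def H2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * np.log2(p) - (1 - p) * np.log2(1 - p)

print(1 - H2(0.1))   # binary symmetric channel, flip prob 0.1 -> 0.531
print(1 - 0.1)       # binary erasure channel, erasure prob 0.1 -> 0.9
print(1 - H2(0.5))   # flip prob 0.5: output independent of input -> 0.0
```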
Shannon's Theorem
[Figure (after McEliece): bit error probability vs. rate in multiples of capacity. Below capacity, arbitrarily low error probability is achievable; above capacity, it is bounded away from zero.]
How close to C can we get?
[Figure (after McEliece): log bit error probability (from -1 down to -7) vs. signal-to-noise ratio in dB for practical codes; the Shannon limit for this setting is at -0.79 dB.]
Turbocodes: The Idea
[Figure: the message U1, …, Uk is fed to two different encoders. Encoder 1 produces Y11, …, Y1n and Encoder 2 produces Y21, …, Y2n; both pass through the noisy channel. The goal is to compute P(U | Y1, Y2). Instead, Decoder 1 computes Bel(U | Y1) and passes it to Decoder 2, which computes Bel(U | Y2) and passes it back; the two decoders iterate, each using the other's beliefs as a prior.]
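A sketch of that belief-exchange loop (an illustration, not the actual turbo decoder): here each "decoder" is a toy per-bit Bayes update for a BSC observation of U, standing in for a real soft-output convolutional decoder such as BCJR. With this memoryless toy, one round already yields the exact posterior; real turbo decoders exchange extrinsic beliefs over many iterations.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.1  # BSC flip probability (assumed)

def soft_decode(y, prior):
    """Toy per-bit soft decoder: combine the prior Bel(U_i = 1) with the
    BSC likelihood of the observed bit via Bayes' rule. A real turbo
    decoder would run BCJR on a convolutional code here."""
    like1 = np.where(y == 1, 1 - p, p)     # P(y_i | U_i = 1)
    like0 = np.where(y == 0, 1 - p, p)     # P(y_i | U_i = 0)
    post1 = prior * like1
    return post1 / (post1 + (1 - prior) * like0)

def turbo_decode(y1, y2, n_iter=1):
    bel = np.full(len(y1), 0.5)            # initial Bel(U_i = 1)
    for _ in range(n_iter):
        bel = soft_decode(y1, prior=bel)   # Decoder 1: Bel(U | Y1)
        bel = soft_decode(y2, prior=bel)   # Decoder 2: Bel(U | Y2)
    return (bel > 0.5).astype(int)         # approximates argmax P(U | Y1, Y2)

u = rng.integers(0, 2, 10_000)                   # U_1, ..., U_k
y1 = u ^ (rng.random(u.size) < p).astype(int)    # "Encoder 1" output, after noise
y2 = u ^ (rng.random(u.size) < p).astype(int)    # "Encoder 2" output, after noise
print("bit error rate:", np.mean(turbo_decode(y1, y2) != u))
```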
Iterations of Turbo Decoding
[Figure: log bit error probability (from -1 down to -6) vs. signal-to-noise ratio in dB; the curve improves with each decoding iteration.]
Low-Density Parity-Check Codes
Invented by Robert Gallager in 1962!
[Figure: factor graph of a small LDPC code. X1, …, X4 carry the original message bits U1, …, U4 (the systematic part of the codeword), and X5, X6, X7 are parity bits, each constrained to be the XOR of a small subset of message bits. Each Xi is observed only through its noisy channel output Yi.]
Decoding as Loopy BP
[Figure: the same code drawn as a cluster graph, with one cluster per parity check (the X5, X6, and X7 clusters), each containing the parity bit and the message bits in its check. The clusters share the message bits U1, …, U4 and repeatedly pass beliefs about them; decoding is exactly loopy belief propagation.]
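Below is a minimal sum-product (loopy BP) decoder sketch for a small parity-check matrix, applied to the illustrative (7,4) code from the previous snippet: variable nodes send beliefs to check nodes, check nodes reply via the tanh rule, and after a few iterations a flipped bit is corrected. This is the generic textbook formulation, not code from the course.

```python
import numpy as np

def bp_decode(H, llr, n_iter=20):
    """Sum-product (loopy BP) decoding on the factor graph of H.

    llr[i] = log P(y_i | x_i = 0) / P(y_i | x_i = 1) is the channel evidence.
    Each check node acts like one "X_j cluster": it repeatedly exchanges
    beliefs with the variable nodes it shares with other checks.
    """
    mask = H == 1
    msg_vc = np.where(mask, llr, 0.0)        # variable -> check messages
    for _ in range(n_iter):
        # Check -> variable messages (tanh rule), leaving out the target
        # variable via a leave-one-out product. Assumes messages stay
        # nonzero, which holds for the demo below.
        t = np.where(mask, np.tanh(np.clip(msg_vc, -20, 20) / 2), 1.0)
        ext = t.prod(axis=1, keepdims=True) / t
        msg_cv = 2 * np.arctanh(np.clip(np.where(mask, ext, 0.0),
                                        -0.999999, 0.999999))
        # Variable -> check: channel evidence plus all *other* check messages.
        total = llr + msg_cv.sum(axis=0)
        msg_vc = np.where(mask, total - msg_cv, 0.0)
    belief = llr + msg_cv.sum(axis=0)
    return (belief < 0).astype(int)          # negative LLR  =>  bit = 1

# The illustrative (7,4) code from the encoding sketch above.
H = np.array([[1, 1, 1, 0, 1, 0, 0],
              [0, 1, 1, 1, 0, 1, 0],
              [1, 0, 1, 1, 0, 0, 1]])
p = 0.1                                      # BSC flip probability (assumed)
x = np.array([1, 0, 1, 1, 0, 0, 1])          # a valid codeword
y = x.copy(); y[0] ^= 1                      # one bit flipped in transit
llr = np.where(y == 0, 1.0, -1.0) * np.log((1 - p) / p)
print(bp_decode(H, llr))                     # recovers x despite the flip
```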
Decoding in Action
[Animation from Information Theory, Inference, and Learning Algorithms, David MacKay, showing iterative LDPC decoding.]
Turbocodes & LDPCs
• 3G and 4G mobile telephony standards
• Qualcomm's mobile television system
• Digital video broadcasting
• Satellite communication systems
• New NASA missions (e.g., Mars Orbiter)
• Wireless metropolitan network standards
Summary
• Loopy BP was rediscovered by coding practitioners
• Understanding turbocodes as loopy BP led to the development of many new and better codes
• Current codes are coming closer and closer to the Shannon limit
• The resurgence of interest in BP led to a much deeper understanding of approximate inference in graphical models
• Many new algorithms