SigSag: Iterative Detection Through Soft Message Passing
Arash Saber Tehrani, Alex Dimakis, Mike Neely
University of Southern California
Model • We consider a multiple access problem • N users, each with a packet for an access point (base station). • Each user retransmits until it receives an ack. • Flat fading: the c-th transmission of user i is affected by a fading coefficient h_i^(c). • Worst case: assume the transmissions always collide. • All users transmit their packets for N rounds.
Model • Example: two users, U1 and U2, each transmit their 3-bit packets x and y twice. [Figure: in each round the two packets overlap with a relative offset, producing four superimposed received symbols.]
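The collision model on this slide can be sketched in code. This is a minimal illustration only; the packet length, jitter window, fading values, and noise level below are hypothetical choices, not taken from the slides.

```python
import numpy as np

# Illustrative sketch of the collision model: two users each send a
# B-symbol packet of +/-1 bits; each transmission is scaled by a fading
# coefficient and delayed by a jitter, and the shifted packets add up
# symbol by symbol at the base station.  All parameter values here are
# assumptions for illustration.
rng = np.random.default_rng(0)
B = 3                            # packet length (matches the 3-bit example)
x = rng.choice([-1, 1], size=B)  # user U1's bits
y = rng.choice([-1, 1], size=B)  # user U2's bits

def received(bits1, bits2, h1, h2, d1, d2, noise_std=0.1):
    """One collision round: faded, shifted packets superimpose plus noise."""
    length = max(d1, d2) + B
    r = np.zeros(length)
    r[d1:d1 + B] += h1 * bits1
    r[d2:d2 + B] += h2 * bits2
    return r + noise_std * rng.normal(size=length)

# Two rounds with different fading and a one-symbol relative offset:
# the BS observes four noisy superimposed symbols per round, i.e. noisy
# linear equations in the unknown bits of x and y.
r1 = received(x, y, h1=1.0, h2=0.8, d1=0, d2=1)
r2 = received(x, y, h1=0.9, h2=1.1, d1=1, d2=0)
```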
multiuser detection • The BS gets a set of noisy linear equations in the unknown bits x_i, y_i. • If there were no noise, optimal detection would be solving linear equations. • With noise, we would ideally like to compute the likelihood for each x_i ∈ {±1}^B, y_i ∈ {±1}^B. • This is an integer least-squares problem, NP-hard in general. [Boutros & Caire], [Verdú], [Reynolds, Wang, Poor] and references therein
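The integer least-squares problem mentioned above can be made concrete with a brute-force maximum-likelihood detector. This is a toy sketch (the matrix and bit vector below are made-up values); the exhaustive search is exponential in the number of bits, which is exactly why the problem is hard in general.

```python
import itertools
import numpy as np

# Brute-force ML detection for the noisy linear model r = A b + n with
# b in {+/-1}^n: enumerate all sign vectors and pick the one minimizing
# ||r - A b||^2.  Exponential in n -- this is the integer least-squares
# problem that is NP-hard in general.
def ml_detect(A, r):
    best, best_cost = None, np.inf
    for signs in itertools.product([-1, 1], repeat=A.shape[1]):
        b = np.array(signs)
        cost = np.sum((r - A @ b) ** 2)
        if cost < best_cost:
            best, best_cost = b, cost
    return best

# Sanity check on a noiseless toy system (values are illustrative).
A = np.array([[1.0, 0.0], [0.8, 1.0], [0.0, 0.9]])
b_true = np.array([1, -1])
r = A @ b_true
assert np.array_equal(ml_detect(A, r), b_true)
```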
multiuser detection for wifi • An 802.11 BS gets noisy linear equations formed by repetition only. • At high SNR we can make hard decisions on the bits (ignore the noise) and solve the linear equations. • Gaussian elimination = bring the system into triangular form + back-substitution. • If the equations are already in triangular form, we only need back-substitution.
ZigZag decoding Forward ZigZag is back-substitution. [Gollakota, Katabi], [Zhang, Shin], [Erran, Liu, Tan, Viswanathan, Yang]
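The back-substitution view of forward ZigZag can be sketched as hard-decision back-substitution on a triangular system. This is an illustrative sketch, not the slides' decoder; the matrix and noise values are made up, and the key point is the `np.sign` hard decision at each step.

```python
import numpy as np

# Hard-decision back-substitution on a lower-triangular system, the
# mechanism behind forward ZigZag: each equation involves at most one
# new bit, which is sliced to +/-1 (sign) before being substituted into
# later equations.  At high SNR this recovers the bits; the hard
# decisions are where noise can accumulate.
def zigzag_back_substitute(A, r):
    n = A.shape[1]
    b = np.zeros(n)
    for k in range(n):
        residual = r[k] - A[k, :k] @ b[:k]   # subtract already-decoded bits
        b[k] = np.sign(residual / A[k, k])   # hard decision: ignore the noise
    return b

# High-SNR sanity check on a toy triangular system (illustrative values).
A = np.array([[1.0, 0.0, 0.0],
              [0.7, 1.2, 0.0],
              [0.0, 0.5, 0.9]])
b_true = np.array([1.0, -1.0, 1.0])
r = A @ b_true + 0.05 * np.array([0.3, -0.2, 0.1])  # small noise
assert np.array_equal(zigzag_back_substitute(A, r), b_true)
```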
Shortcomings of ZigZag • ZigZag can fail to decode when back-substitution is not possible. • Noise accumulates as the ZigZag decoder advances through the packet. • Previous hard decisions can push later bits to be incorrect. • How to solve these problems? One heuristic solution is forward-backward ZigZag. • Here we instead view the problem as a graphical model (factor graph).
Optimal decoding as inference • Given observations of the u's, obtain the most likely x_i, y_i.
fun facts about inference in graphical models • Optimal inference in graphical models is NP-hard. • Well-known approximation algorithm: belief propagation (aka message passing, sum-product / max-product). • BP is optimal if the factor graph has no cycles. • Loopy BP is typically a very good approximation algorithm. [Pearl in AI, Gallager for LDPC codes in information theory]
fun facts about inference in graphical models • Observation: ZigZag decoding is BP • (a special case that deals with noise by hard thresholding) • (also equivalent to Luby erasure decoding and the algorithm you use to solve Sudoku) • A heuristic approach to deal with error accumulation: run ZigZag twice (forward-backward) and average the results. • Here we propose a systematic way to keep track of soft information about the bits: SigSag decoding • A natural generalization of ZigZag that maintains soft information (probability distributions).
SigSag = BP that maintains probabilities • SigSag is belief propagation on the collision graph. [Figure: collision factor graph with variable nodes for the bits of users U1 and U2, and factor nodes f1, f2 for the received symbols.]
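The sum-product message passing that SigSag performs can be sketched on a toy cycle-free graph. This is not the slides' collision-graph decoder; it is a minimal chain-structured analogue (each observation couples two neighboring ±1 bits through a Gaussian factor, with a made-up noise level), chosen because on a tree BP marginals are exact and can be checked against enumeration.

```python
import itertools
import numpy as np

# Minimal sum-product sketch on a chain factor graph: each observation
# u[k] couples the +/-1 bits b[k] and b[k+1].  BP keeps probability
# vectors (soft information) for each bit instead of hard decisions.
def factor(u, a, b, sigma=0.5):
    return np.exp(-(u - a - b) ** 2 / (2 * sigma ** 2))

def bp_marginals(u):
    n, vals = len(u) + 1, [-1, 1]
    fwd = [np.ones(2)]                      # forward messages along the chain
    for k in range(len(u)):
        m = np.array([sum(fwd[-1][i] * factor(u[k], vals[i], vals[j])
                          for i in range(2)) for j in range(2)])
        fwd.append(m / m.sum())
    bwd = [np.ones(2)]                      # backward messages
    for k in reversed(range(len(u))):
        m = np.array([sum(bwd[0][j] * factor(u[k], vals[i], vals[j])
                          for j in range(2)) for i in range(2)])
        bwd.insert(0, m / m.sum())
    margs = [fwd[k] * bwd[k] for k in range(n)]
    return [m / m.sum() for m in margs]     # P(b[k] = -1), P(b[k] = +1)

def brute_marginals(u):
    """Exact marginals by enumeration, for checking BP on this tree."""
    n = len(u) + 1
    margs = np.zeros((n, 2))
    for bits in itertools.product([0, 1], repeat=n):
        v = [(-1, 1)[b] for b in bits]
        w = np.prod([factor(u[k], v[k], v[k + 1]) for k in range(len(u))])
        for k, b in enumerate(bits):
            margs[k, b] += w
    return margs / margs.sum(axis=1, keepdims=True)

u = [1.8, -0.1, -2.2]                       # illustrative observations
assert np.allclose(bp_marginals(u), brute_marginals(u))
```

Because the graph is a tree, the BP marginals coincide with the exact posteriors, mirroring the claim that BP is optimal on cycle-free factor graphs.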
results: two users • The factor graph is cycle-free for two users: • Theorem 1: for two users transmitting without permuting their bits, the resulting factor graph is cycle-free with probability at least 1 - 1/w. • Hence SigSag performs maximum-likelihood decoding whp for two users with jitter. • (Max-product minimizes the block error probability; sum-product minimizes the bit error probability.)
results: multiple users • Theorem 2: the left-regular bipartite factor graph G resulting from N packets of length B sent by N users with symbol permutation is locally tree-like with high probability. • Specifically, the probability that the depth-2ℓ neighborhood of a randomly chosen edge e = (v, c) contains a cycle vanishes as B grows, where 2ℓ is a fixed depth and s is a suitable constant that depends on ℓ and N but not on B. • Hence SigSag is near-ML for large packet length B.
Proof ideas • Start from a variable node and grow the random factor graph. • Write a recursion for the probability of forming a loop at step ℓ. • (Similar to high-girth arguments for LDPC codes: Richardson, Urbanke, Luby, Mitzenmacher, et al.)
importance of jitter • Theorem 3: if the users transmit packets consecutively without any jitter delay (W = 0), the matrix A is rank deficient (even with bit permutations). • Without jitter, the ML detector fails even if there is no noise. • A random delay of one symbol suffices to make the matrix A full rank whp (equivalent to a classic result on random matrices by Komlós, 1967).
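The rank argument can be illustrated with a toy example. This sketch assumes unit fading coefficients and a single one-symbol delay (simplifying assumptions for illustration, not the slides' general model): with perfectly aligned rounds the stacked system repeats the same equations and is rank deficient, while one symbol of jitter makes it full rank.

```python
import numpy as np

# Toy rank illustration for the importance of jitter, assuming unit
# fading (an illustration-only assumption).  Each round contributes
# rows of A: received symbol k is the sum of the bits that overlap at
# position k given each user's delay.
def round_matrix(B, d1, d2):
    """Rows for one round: user 1 delayed by d1 symbols, user 2 by d2."""
    length = max(d1, d2) + B
    A = np.zeros((length, 2 * B))
    A[d1:d1 + B, :B] += np.eye(B)          # user 1's bits x
    A[d2:d2 + B, B:] += np.eye(B)          # user 2's bits y
    return A

B = 4
# No jitter: both rounds perfectly aligned -> the same B equations twice.
A_aligned = np.vstack([round_matrix(B, 0, 0), round_matrix(B, 0, 0)])
# One symbol of jitter in the second round breaks the symmetry.
A_jitter = np.vstack([round_matrix(B, 0, 0), round_matrix(B, 0, 1)])

assert np.linalg.matrix_rank(A_aligned) < 2 * B   # rank deficient
assert np.linalg.matrix_rank(A_jitter) == 2 * B   # full rank: solvable
```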
Experimental Results • N = 2 users, packet length B = 100.
Experimental Results • N = 3 users, packet length B = 100.
conclusions • Showed that ZigZag decoding is an instance of belief propagation. • Used this connection to develop a new decoding algorithm that maintains probabilistic information about the symbols. • Showed that SigSag is optimal for two users whp and near-optimal for multiple users. • Empirical performance shows significant gains. • Performance gains increase with the number of users.