Binary Error Correcting Network Codes
Qiwen Wang, Sidharth Jaggi, Shuo-Yen Robert Li
Institute of Network Coding (INC), The Chinese University of Hong Kong
IEEE Information Theory Workshop 2011, Paraty, Brazil, October 19, 2011
Outline
1. Motivation
2. Model
3. Main Results
4. Discussion
5. Conclusion
Motivation: Challenges of NC over Noisy Networks
1. Varying noise level: each link's bit-flip rate lies in (p-ε, p+ε).
2. Errors propagate through mix-and-forward.
3. Coding kernels are unknown a priori.
[Figure: a nine-node example network; each edge carries its local coding kernels, e.g. f1,3, f2,4, f6,8.]
Network Model & Code Model
Alice communicates with Bob over a network with mincut C. A packet is a length-n vector of symbols, each symbol carrying m bits; Alice's batch of C packets forms a Cm×n binary matrix X, and Bob receives a Cm×n binary matrix Y. α denotes Alice's encoder and β Bob's decoder.
Finite Field & Binary Field I
One packet: n symbols over GF(2^m), i.e. mn bits, viewed as an m×n binary matrix.
Finite Field & Binary Field II
T: an m×m binary matrix. s: a symbol over GF(2^m). Multiplication by a fixed element of GF(2^m) corresponds to multiplying the m-bit vector of s by a suitable binary matrix T over the binary field.
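The symbol-to-matrix correspondence above can be sketched in a few lines. This is a minimal illustration, assuming m = 4 with modulus x⁴ + x + 1 (an illustrative field choice, not from the slides); `gf_mul`, `mul_matrix`, and `apply_matrix` are hypothetical helper names.

```python
# Sketch: multiplication by a fixed element of GF(2^m) is GF(2)-linear, so it
# equals an m x m binary matrix acting on the m-bit vector of a symbol.
m = 4
MOD = 0b10011  # x^4 + x + 1, one primitive polynomial for GF(16) (assumption)

def gf_mul(a, b):
    """Carry-less multiply a*b in GF(2^m), reducing modulo MOD as we go."""
    acc = 0
    while b:
        if b & 1:
            acc ^= a
        b >>= 1
        a <<= 1
        if a & (1 << m):
            a ^= MOD
    return acc

def mul_matrix(a):
    """m x m binary matrix for 'multiply by a': column j is the bits of a*x^j."""
    return [[(gf_mul(a, 1 << j) >> i) & 1 for j in range(m)] for i in range(m)]

def apply_matrix(M, s):
    """Binary matrix times the bit-vector of symbol s, arithmetic over GF(2)."""
    bits = [(s >> j) & 1 for j in range(m)]
    out = 0
    for i in range(m):
        if sum(M[i][j] & bits[j] for j in range(m)) % 2:
            out |= 1 << i
    return out
```

The matrix columns are the images of the basis 1, x, x², x³, which is exactly the correspondence the slide describes: field multiplication and binary matrix multiplication agree on every pair of symbols.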
Transfer Matrix
Noiseless network: Y = T·X, where X and Y are the Cm×n transmitted and received matrices and T is the Cm×Cm transfer matrix of the network.
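The noiseless relation Y = T·X is plain binary matrix multiplication. A minimal sketch, with toy sizes (Cm = 3, n = 4) and an arbitrary T chosen for illustration rather than derived from a real network:

```python
# Noiseless end-to-end relation Y = T X over GF(2): T is Cm x Cm, X is Cm x n.
def mat_mul_gf2(A, B):
    """Multiply binary matrices A (r x k) and B (k x c) modulo 2."""
    r, k, c = len(A), len(B), len(B[0])
    return [[sum(A[i][t] & B[t][j] for t in range(k)) % 2 for j in range(c)]
            for i in range(r)]

T = [[1, 0, 1],
     [0, 1, 1],
     [1, 1, 0]]          # toy Cm x Cm transfer matrix
X = [[1, 0, 0, 1],
     [0, 1, 0, 1],
     [1, 1, 1, 0]]       # toy Cm x n batch of Alice's packets
Y = mat_mul_gf2(T, X)    # what Bob receives when the network is noiseless
```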
Noise Model: Worst-case Bit-flip Error
Errors can be arbitrarily distributed over the links, with an upper bound of a fraction p of all transmitted bits. The worst possible damage can happen to the received packets.
[Figure: the same bit-flip budget placed on Link A, on Link B, or split between them.]
Noise Model: Worst-case Bit-flip Error
The worst-case bit-flip error matrix Z is an Em×n binary matrix with no more than pEmn ones, arbitrarily distributed, where E is the number of edges in the network. The first m rows of Z hold the error bits on edge 1, the next m rows those on edge 2, and so on.
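One way to picture the adversary's budget is to place at most pEmn ones anywhere in an Em×n matrix. In this sketch the placement is random purely for illustration; the model allows the adversary to choose any placement. Parameter values are toy assumptions.

```python
# Sample one admissible worst-case error matrix Z: Em x n binary, with at
# most floor(p*E*m*n) ones placed anywhere.
import math
import random

E, m, n, p = 4, 2, 10, 0.1
budget = math.floor(p * E * m * n)   # maximum number of flipped bits

rng = random.Random(0)
cells = [(i, j) for i in range(E * m) for j in range(n)]
Z = [[0] * n for _ in range(E * m)]
for i, j in rng.sample(cells, budget):  # adversary's (here: random) choice
    Z[i][j] = 1
```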
Impulse Response Matrix
Noisy network: Y = T·X + T̂·Z, where Z is the Em×n error matrix and T̂ is the Cm×Em impulse response matrix mapping injected errors to their effect at the receiver.
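The noisy relation can be simulated end to end. A minimal sketch with toy sizes (Cm = 2, Em = 3, n = 4); T, T̂, and the single injected bit flip are illustrative assumptions:

```python
# Simulate Y = T X + T_hat Z over GF(2): T is Cm x Cm, T_hat is Cm x Em,
# X is Cm x n, Z is Em x n.
def mat_mul_gf2(A, B):
    r, k, c = len(A), len(B), len(B[0])
    return [[sum(A[i][t] & B[t][j] for t in range(k)) % 2 for j in range(c)]
            for i in range(r)]

def mat_xor(A, B):
    return [[a ^ b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

T = [[1, 0], [0, 1]]              # toy transfer matrix (identity)
X = [[1, 0, 1, 0], [0, 1, 1, 0]]  # toy transmitted batch
That = [[1, 0, 1], [0, 1, 1]]     # toy Cm x Em impulse response
Z = [[0] * 4 for _ in range(3)]
Z[2][0] = 1                       # one bit flip: third edge-row, first column
Y = mat_xor(mat_mul_gf2(T, X), mat_mul_gf2(That, Z))
```

Note how the single one in Z adds exactly one column of T̂ into the corresponding column of T·X, which is the observation behind the transform metric on the next slides.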
Transform Metric
With errors, Y = T·X + T̂·Z. Each bit flip in a column of Z adds the corresponding column of the impulse-response matrix T̂ to that column of T·X, so the corruption of Y is naturally measured in columns of T̂.
Transform Metric
For the i-th column, d_i is the minimum number of columns of the impulse-response matrix T̂ that need to be added to TX(i) to obtain Y(i). The distance between TX and Y is d(TX, Y) = Σ_i d_i.
Claim: d(·,·) is a distance metric.
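For toy sizes, d_i can be computed by exhaustive search over column subsets of T̂ (exponential in the number of columns, so illustration only). The function name and the example columns are assumptions of this sketch:

```python
# Brute-force the per-column transform distance d_i: the minimum number of
# columns of T_hat whose GF(2) sum equals diff = Y(i) - TX(i).
from itertools import combinations

def column_distance(that_cols, diff):
    """that_cols: list of columns of T_hat (as bit tuples); diff: target bits."""
    if all(b == 0 for b in diff):
        return 0
    k = len(that_cols)
    for size in range(1, k + 1):           # try subsets in increasing size
        for subset in combinations(range(k), size):
            acc = [0] * len(diff)
            for j in subset:
                acc = [a ^ b for a, b in zip(acc, that_cols[j])]
            if acc == list(diff):
                return size
    return None  # diff lies outside the column span of T_hat
```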
Hamming-type Upper Bound
Theorem 1. For all p less than C/(2Em), an upper bound on the achievable rate of any code over the worst-case binary-error network is, asymptotically in n, C − E·H(p), where H(·) is the binary entropy function and rate is measured in packets per network use.
Hamming-type Upper Bound
Proof of Thm 1 (sketch)
• The total number of Cm×n binary matrices (the volume of the whole space) is 2^{Cmn}.
• Lower-bound the volume of the balls of radius pEmn around codewords: consider those Z's where every column has exactly pEm ones in it; distinct such Z result in distinct T̂Z.
• The number of distinct T̂Z is therefore at least (Em choose pEm)^n ≈ 2^{nEmH(p)}.
• The size of any codebook is at most 2^{Cmn} / 2^{nEmH(p)}.
• Asymptotically in n, the Hamming-type upper bound on the rate is C − E·H(p).
[Figure: disjoint balls of radius pEmn around codewords.]
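The bound can be evaluated numerically. This sketch assumes the normalization used in the proof sketch above (rate in packets per network use, bound C − E·H(p)); the helper names and parameter values are illustrative assumptions.

```python
# Evaluate the Hamming-type upper bound C - E*H(p) for sample parameters,
# under this sketch's normalization assumption.
import math

def binary_entropy(q):
    """Binary entropy H(q) in bits."""
    if q in (0.0, 1.0):
        return 0.0
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

def hamming_type_bound(C, E, m, p):
    assert p < C / (2 * E * m), "bound is stated only for p < C/(2Em)"
    return C - E * binary_entropy(p)

bound = hamming_type_bound(4, 5, 8, 0.01)  # toy parameters: C=4, E=5, m=8
```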
Coherent/Non-coherent NC
Coherent NC: the receiver knows the internal coding coefficients, and hence knows T and the impulse-response matrix T̂. However, the random linear coding coefficients are usually chosen on the fly.
Non-coherent NC: the coding coefficients, and hence T and T̂, are unknown in advance; this is the more realistic setting.
GV-type Lower Bound
Theorem 2. Coherent GV-type network codes achieve a rate of at least C − E·H(2p), asymptotically in n.
Theorem 3. Non-coherent GV-type network codes also achieve a rate of at least C − E·H(2p), asymptotically in n.
GV-type Lower Bound
Proof of Thm 2 (sketch)
• We need an upper bound on the volume of the balls of radius 2pEmn around codewords, instead of the lower bound on their volume as in Thm 1 (sphere packing vs. covering).
• The number of different Y, or equivalently different T̂Z, can be bounded above by the number of different Z with at most 2pEmn ones, which equals Σ_{i=0}^{2pEmn} (Emn choose i).
• The summation can be bounded from above by ≈ 2^{EmnH(2p)}.
• A greedy choice of codewords then gives a lower bound on the size of the codebook of 2^{Cmn} / 2^{EmnH(2p)}.
• Asymptotically in n, the rate of coherent GV-type NC is at least C − E·H(2p).
[Figure: balls of radius 2pEmn around the codewords TX(1), TX(2), TX(3).]
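The greedy codeword-selection step has a classical analogue that is easy to run: over plain bit-strings in the Hamming metric (not the network transform metric), keep every word whose distance to all previously chosen codewords exceeds 2d. The function name and toy parameters are assumptions of this sketch.

```python
# Toy Gilbert-Varshamov-style greedy codebook in the classical Hamming metric:
# any maximal code with pairwise distance > 2d has size at least
# 2^length / V(length, 2d), mirroring the covering argument above.
from itertools import product

def greedy_gv_codebook(length, d):
    code = []
    for w in product((0, 1), repeat=length):
        if all(sum(a != b for a, b in zip(w, c)) > 2 * d for c in code):
            code.append(w)
    return code

code = greedy_gv_codebook(7, 1)  # pairwise distance > 2: corrects 1 bit flip
```

The same covering logic drives Theorem 2: because the greedy process only stops when every word is within radius 2pEmn of some codeword, the ball-volume upper bound translates directly into a codebook-size lower bound.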
GV-type Lower Bound
Proof of Thm 3 (sketch)
• Crucial difference from the proof of Thm 2: the process of choosing codewords.
• Consider all possible values of T̂, at most 2^{Cm·Em} of them (and hence of T, since T comprises a specific subset of the columns of T̂).
• The number of potential codewords that can be chosen for the codebook is at least 2^{Cmn} / (2^{Cm·Em} · 2^{EmnH(2p)}).
• Since the factor 2^{Cm·Em} does not grow with n, asymptotically in n this leads to the same rate of C − E·H(2p) as coherent NC in Theorem 2.
Scale of Parameters
Claim. For all sufficiently small p, both the Hamming-type and the GV-type bounds hold.
Proof
• Theorem 1 (Hamming-type upper bound) requires that p < C/(2Em).
• For the GV-type bounds in Thm 2 and Thm 3 to give non-negative rates, we need E·H(2p) ≤ C.
• When p is small, both conditions are satisfied.
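The two parameter conditions can be checked numerically. This sketch assumes the GV-type rate takes the form C − E·H(2p) (a reconstructed reading) alongside Theorem 1's stated condition p < C/(2Em); the function name and sample parameters are illustrative.

```python
# Check both parameter-regime conditions for sample values of (C, E, m, p).
import math

def binary_entropy(q):
    if q in (0.0, 1.0):
        return 0.0
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

def conditions_hold(C, E, m, p):
    """True iff Thm 1's condition and GV-rate non-negativity both hold."""
    return p < C / (2 * E * m) and E * binary_entropy(2 * p) < C

small_p_ok = conditions_hold(4, 5, 8, 0.01)  # small p: both conditions hold
large_p_ok = conditions_hold(4, 5, 8, 0.06)  # p >= C/(2Em) = 0.05: fails
```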
Conclusion
• Worst-case bit-flip error model
• Hamming-type upper bound
• Coherent/non-coherent GV-type lower bounds
• GV-type codes: end-to-end nature; complexity polynomial in block length
Future Directions
• Efficient coding schemes
• Other binary noise models
• Combining link-by-link codes with our end-to-end codes
Thank you!