List-Decoding Reed-Muller Codes over Small Fields — Parikshit Gopalan (Microsoft), Adam R. Klivans (UT Austin), David Zuckerman (UT Austin)
Error-Correcting Codes • Communication over a Noisy Channel: Adversary corrupts 10% of the bits. • Problem: Recover the (entire) message. • Solution: Introduce redundancy.
Error-Correcting Codes • Deep-space communication • Internet • Cellphones • Satellite broadcast • Audio CDs • Bar-codes
Codes from Polynomials • Encoding: Alice wants to send (a,b). Let L(x) = ax + b. Send L(1), L(2), …, L(7).
Codes from Polynomials Adversary: Corrupts two values. Decoding: Find the (unique) line that passes through 5 points.
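A minimal sketch of this toy line code: the encoder evaluates L(x) = ax + b at x = 1, …, 7, and the decoder keeps the line that agrees with at least 5 of the 7 received values. The integer message range and the brute-force search are illustrative simplifications, not part of the slides; a real Reed-Solomon decoder works over a finite field and is far more efficient.

```python
# Toy line code: encode (a, b) as L(1), ..., L(7) with L(x) = a*x + b,
# then recover (a, b) even after an adversary corrupts 2 of the 7 values.

def encode(a, b):
    return [a * x + b for x in range(1, 8)]

def decode(received, msg_range=range(-10, 11)):
    # Two distinct lines agree on at most 1 of the 7 evaluation points, so at
    # most one line can agree with the received word in >= 5 positions.
    for a in msg_range:
        for b in msg_range:
            agree = sum(1 for x, y in zip(range(1, 8), received) if a * x + b == y)
            if agree >= 5:
                return (a, b)
    return None

codeword = encode(3, -2)      # [1, 4, 7, 10, 13, 16, 19]
corrupted = codeword[:]
corrupted[1] = 100            # adversary corrupts two positions
corrupted[5] = -7
print(decode(corrupted))      # (3, -2)
```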
Codes from Polynomials • Low-degree polynomials differ in many places. • Relative distance: Hamming distance / length. • Minimum distance: min {Δ(C,C') | codewords C ≠ C'}. • Reed-Solomon codes: univariate polynomials. • Reed-Muller codes: multivariate polynomials.
Reed-Muller Codes [Muller’54, Reed’54] • Messages: Polynomials of degree r in m variables over {0,1}. • Q(X1,X2,X3) = X1X2 + X3 • Encoding: Truth table, e.g. 01010110. • Minimum distance: δ = 2^-r. • Hadamard codes: r = 1.
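A small sketch of the truth-table encoding for the example above; the monomial representation (a list of tuples of 0-based variable indices) is an illustrative choice, not notation from the slides.

```python
# Truth-table encoding of Q(X1, X2, X3) = X1*X2 + X3 over GF(2),
# evaluated at all 2^3 points in the order (X1, X2, X3) = 000, 001, ..., 111.

from itertools import product

def rm_encode(poly, m):
    """poly: list of monomials, each a tuple of variable indices (0-based)."""
    table = []
    for x in product([0, 1], repeat=m):
        value = 0
        for monomial in poly:
            term = 1
            for i in monomial:
                term &= x[i]
            value ^= term
        table.append(value)
    return table

Q = [(0, 1), (2,)]             # X1*X2 + X3
print(rm_encode(Q, 3))         # [0, 1, 0, 1, 0, 1, 1, 0], i.e. the codeword 01010110
```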
Decoding ≡ Polynomial Reconstruction • Problem: Given data points, find a low-degree polynomial that fits best. • Well-studied problem, numerous applications.
The Decoding Problem • Received word R: {0,1}^m → {0,1}. • Unique Decoding: Find C such that Δ(R,C) < δ/2. • List Decoding: [Elias’57, Wozencraft’58] Find all C such that Δ(R,C) < η. Few such C. • Johnson bound: List is small up to radius J(δ), where J(δ) = (1 - √(1 - 2δ))/2 = δ/2 + δ²/4 + … < δ.
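For reference, here is the Johnson bound as a small function, evaluated at the three values of δ that recur in later slides; the numbers follow directly from the formula above.

```python
# Johnson bound J(delta) = (1 - sqrt(1 - 2*delta)) / 2.
from math import sqrt

def johnson(delta):
    return (1 - sqrt(1 - 2 * delta)) / 2

print(johnson(0.25))    # ~0.146: Johnson radius for RM(2,k), minimum distance 1/4
print(johnson(0.375))   # 0.25:  used later for the rank >= 2 part of RM(2,k)
print(johnson(0.5))     # 0.5:   the radius J(2 * 2^-r) for r = 2 in the global result
```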
The Computational Model • Global Decoding: Given R as input. Run time poly in n = 2^m. • Local Decoding: Given an oracle for R (query x, receive R(x)). Run time poly in m = log n.
Decoding Reed-Muller codes • Unique Decoding: Majority Logic Decoder. [Reed’54] • Local List Decoding: Hadamard codes (r = 1). [Goldreich-Levin’89] Alternate algorithms: [Levin, Rackoff, Kushilevitz-Mansour, …] • No algorithms known for r ≥ 2. • Good algorithms for large fields (r < |F|). [Goldreich-Rubinfeld-Sudan, Arora-Sudan, Sudan-Trevisan-Vadhan]
Our Results • Main Result: Local list-decoding of RM codes for r ≥ 2. Works up to the minimum distance: radius 2^-r - ε. • Returns a list of size ε^-O(r) in time poly(m^r, ε^-r). • Improves Majority Logic Decoding for r ≥ 2. • Generalizes [Goldreich-Levin’89]. • Beats the Johnson bound: for r = 2, radius 0.25 versus 0.146. • List-size becomes exponential at 2^-r.
Our Results • Global List-Decoding: Deterministic algorithm for r ≥ 2. • Works up to distance J(2·2^-r) - ε: beyond the minimum distance. For r = 2, radius ½ - ε versus ¼ - ε. • Returns a list of size ε^-O(m) in time poly(ε^-O(m)). • Brute force needs time 2^O(m^r). • New combinatorial bound.
Local List-Decoding • {0,1}^m labeled by the received word R. • Fix a codeword Q so that Δ(Q,R) < δ - ε.
A Self-Corrector [Goldreich-Levin] • Goal: Find Q(b) whp. • Pick a small subspace A randomly. • Assume we know Q on A. • Error on b + A is < δ (very likely). • Error on the combined subspace is < δ/2. • Unique Decode!
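Below is a hedged, runnable sketch of this self-corrector for small parameters. It assumes the advice (Q restricted to A) is simply read off the true polynomial, uses a brute-force unique decoder on the combined subspace, and picks illustrative parameters (m = 8, r = 2, dim(A) = 5, 10% random corruption); none of these choices come from the paper.

```python
import random
from itertools import product, combinations

def eval_poly(poly, x):
    """Evaluate a multilinear GF(2) polynomial (list of monomials, each a tuple
    of variable indices) at a 0/1 point x."""
    value = 0
    for mono in poly:
        term = 1
        for i in mono:
            term &= x[i]
        value ^= term
    return value

def self_correct(R, true_Q, m, r, b, k=5):
    """Guess Q(b) from the corrupted word R. The advice (Q on the subspace A)
    is read off true_Q here purely for illustration."""
    # Random k-dim subspace A: k random basis vectors (whp linearly independent).
    basis = [tuple(random.randint(0, 1) for _ in range(m)) for _ in range(k)]

    def point(y, t):  # the point A*y (t = 0) or b + A*y (t = 1)
        return tuple((t * b[i] + sum(basis[j][i] * y[j] for j in range(k))) % 2
                     for i in range(m))

    ys = list(product([0, 1], repeat=k))
    advice = {y: eval_poly(true_Q, point(y, 0)) for y in ys}   # assumed known
    coset = {y: R(point(y, 1)) for y in ys}                    # noisy values

    # On span(A, b), a degree-r polynomial agreeing with the advice on A looks
    # like advice(y) + t*h(y) with deg(h) <= r-1.  The error on b + A is whp
    # below 2^-r, i.e. below half the minimum distance 2^-(r-1) of degree-(r-1)
    # polynomials, so the closest h is the correct one (unique decoding).
    monos = [mono for d in range(r) for mono in combinations(range(k), d)]
    best_h, best_dist = None, len(ys) + 1
    for bits in product([0, 1], repeat=len(monos)):
        h = [mono for mono, bit in zip(monos, bits) if bit]
        dist = sum(coset[y] != (advice[y] ^ eval_poly(h, y)) for y in ys)
        if dist < best_dist:
            best_h, best_dist = h, dist

    zero = (0,) * k           # b corresponds to y = 0, t = 1
    return advice[zero] ^ eval_poly(best_h, zero)

# Toy experiment: Q = X1*X2 + X3*X4 in m = 8 variables, ~10% random corruption.
m, r = 8, 2
Q = [(0, 1), (2, 3)]
table = {x: eval_poly(Q, x) for x in product([0, 1], repeat=m)}
for x in random.sample(list(table), len(table) // 10):
    table[x] ^= 1
b = tuple(random.randint(0, 1) for _ in range(m))
print(self_correct(table.get, Q, m, r, b), eval_poly(Q, b))   # equal whp
```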
Interpolating Sets • Q of degree r is efficiently computable from Q(b), b ∈ B = B(r). • r = 1: 0, e1, e2, …, em. • General r: all b of weight ≤ r. • Pick one random A; use it to self-correct every b in the interpolating set B. • Union bound: whp correct on all of B. • Can improve via Noisy Interpolating Sets [Dvir, Shpilka].
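A sketch of the interpolation step for general r, using the points of weight ≤ r: over GF(2), the coefficient of the monomial on a set S is the XOR of Q over the indicator vectors of the subsets of S (Möbius inversion), so these values determine every coefficient of a degree-r polynomial. The example polynomial and parameters are illustrative.

```python
# Interpolating a degree-r multilinear GF(2) polynomial from its values on
# the points of Hamming weight <= r.

from itertools import combinations, product

def eval_poly(poly, x):
    value = 0
    for mono in poly:            # poly = list of monomials (tuples of indices)
        term = 1
        for i in mono:
            term &= x[i]
        value ^= term
    return value

def interpolate(values, m, r):
    """values: dict mapping points of weight <= r to Q at those points."""
    poly = []
    for d in range(r + 1):
        for S in combinations(range(m), d):
            coeff = 0
            for t in range(d + 1):
                for T in combinations(S, t):       # all subsets of S
                    point = tuple(1 if i in T else 0 for i in range(m))
                    coeff ^= values[point]
            if coeff:
                poly.append(S)
    return poly

m, r = 5, 2
Q = [(0, 1), (2,), (3, 4)]                     # X1*X2 + X3 + X4*X5
B = [x for x in product([0, 1], repeat=m) if sum(x) <= r]
values = {b: eval_poly(Q, b) for b in B}       # would come from the self-corrector
print(sorted(interpolate(values, m, r)))       # [(0, 1), (2,), (3, 4)]
```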
Overall Algorithm: R: {0,1}^m → {0,1} → Self-Corrector (uses advice) → Interpolator.
Generating our own Advice • Advice: Q restricted to A. • dim(A) = log(1/ε), and ε = 1/poly(m), so A could have dimension log m. • Only m choices for r = 1. • Too many choices when r ≥ 2.
Generating our own Advice • Advice: Q restricted to A. • A could have dimension k = log m. • Error on A is < δ, whp. • Decode on A in time poly(2^k).
Global List-Decoding: Case r=2 • Problem: Given R: {0,1}^k → {0,1}, find all Q of degree 2 so that Δ(Q,R) < ¼. • Run time polynomial in the block-length 2^k.
Global List-Decoding: Case r=2 • Problem: Given R: {0,1}^k → {0,1}, find all Q of degree 2 so that Δ(Q,R) < η. • ℓ(η): worst-case list-size. • Algorithm runs in time poly(2^k, ℓ(η)). • Works for all η. • Does not imply bounds on list-size.
Global List-Decoding: Case r=2 • Problem: Given R: {0,1}^k → {0,1}, find all Q so that Δ(Q,R) < η. • Write Q = Q0(X1,…,Xk-1) + Xk·L(X1,…,Xk-1). Let η0, η1 be the error rates on the halves Xk = 0 and Xk = 1, so η = ½(η0 + η1). Assume η0 ≤ η1; then η0 ≤ η and η1 ≤ 2η. • Recover Q0 from Xk = 0 (degree 2, error ≤ η). • Recover L from Xk = 1 (degree 1, error ≤ 2η).
Global List-Decoding: Case r=2 • Problem: Given R: {0,1}^k → {0,1}, find all Q so that Δ(Q,R) < η, with η = ½(η0 + η1). • We don't know whether η0 ≤ η1: try both possibilities at each level of the recursion. Overhead is 2^k.
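A small check of the structural fact this recursion exploits: splitting the truth table along Xk gives Q0 on the Xk = 0 half and Q0 + L on the Xk = 1 half. The example polynomial is an arbitrary illustrative choice.

```python
# Over GF(2), any degree-2 polynomial Q in X1..Xk splits as
# Q0(X1..Xk-1) + Xk*L(X1..Xk-1) with deg(Q0) <= 2 and deg(L) <= 1.

from itertools import product

def eval_poly(poly, x):
    value = 0
    for mono in poly:
        term = 1
        for i in mono:
            term &= x[i]
        value ^= term
    return value

k = 4
Q  = [(0, 1), (1, 3), (2,)]     # X1*X2 + X2*X4 + X3
Q0 = [(0, 1), (2,)]             # monomials of Q not containing Xk = X4
L  = [(1,)]                     # X2: what multiplies X4 in Q

points = list(product([0, 1], repeat=k - 1))
half0 = [eval_poly(Q, x + (0,)) for x in points]   # Xk = 0 half of the codeword
half1 = [eval_poly(Q, x + (1,)) for x in points]   # Xk = 1 half

print(half0 == [eval_poly(Q0, x) for x in points])                     # True
print(half1 == [eval_poly(Q0, x) ^ eval_poly(L, x) for x in points])   # True
```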
Bounds on List-Size • Problem: Given R: {0,1}^k → {0,1}, bound the number of quadratic polys. Q s.t. Δ(Q,R) < ¼. • Goal: Bound of 2^O(k). • Johnson bound: 2^O(k) up to distance J(¼) ≈ 0.146. • Can we improve the distance of RM(2,k)?
Analogy: Inter-Star Distance • Proxima Centauri: 4.2 light-years.
Inter-Star Distance? • Within 100,000 light-years ⊆ the Milky Way.
Intergalactic Distance Andromeda: 2.5 million light years away.
Inter-Star Distance? Local Group of Galaxies, Local Supercluster, …
Bounds on List-Size • Problem: Given R: {0,1}^k → {0,1}, bound the number of quadratic polys. Q s.t. Δ(Q,R) < ¼. • Goal: Bound of 2^O(k). • Johnson bound: 2^O(k) up to distance J(¼) ≈ 0.146. • Can we improve the distance of RM(2,k)? Yes, for a 2^-O(k)-dense subset of RM(2,k). • Thm: Every quadratic form can be written as Q = L1L2 + … + L2t-1·L2t + L0, where the Li are linearly independent and 1 ≤ t ≤ k/2; t is called the rank of Q.
Rank versus Weight • Rank-2 forms: weight 0.375, and J(0.375) = ¼. • Rank-1 forms: only 2^(2k) of them. • Thm: List-size is 2^O(k) at distance ¼.
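A quick numerical check of the rank/weight statement; the example forms (X1*X2 and X1*X2 + X3*X4) are illustrative.

```python
# Relative weight (fraction of 1s in the truth table) of small quadratic forms.
from itertools import product

def relative_weight(poly, m):
    count = 0
    for x in product([0, 1], repeat=m):
        value = 0
        for mono in poly:
            term = 1
            for i in mono:
                term &= x[i]
            value ^= term
        count += value
    return count / 2 ** m

print(relative_weight([(0, 1)], 4))            # 0.25:  rank-1 form X1*X2
print(relative_weight([(0, 1), (2, 3)], 4))    # 0.375: rank-2 form X1*X2 + X3*X4
```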
Bounding the List-Size • Each remaining pair of codewords near R (difference of rank ≥ 2) is at distance ≥ 0.375. • List-size ≤ 2^k by the Johnson bound.
Bounding the List-Size • 2^k balls by the Johnson bound. • Each ball contains at most 2^(2k) codewords. • Overall at most 2^(3k) codewords at radius ¼. • We need k = O(log m) for local decoding.
Overall Local List-Decoder: R: {0,1}^m → {0,1} → Self-Corrector (uses advice) → Interpolator.
Overall Local List-Decoder: R: {0,1}^m → {0,1} → Self-Corrector → Interpolator, with the advice supplied by the Global List-Decoder run on a small subspace.
Extension to Higher Degree • No analogue of rank. • [Kasami-Tokura]: Characterizes codewords with weight ≤ 2^(1-r). • List-decoding up to radius 2^-r - ε in poly(m, ε^-1).