Polynomials in Complexity Theory Swastik Kopparty (Rutgers)
Plan • Three amazing applications of polynomials • Secrets + Secure multiparty computation • Lower bounds on constant depth circuits • Checking proofs • Along the way • Some morals
Polynomial refresher • Field F • This talk: F is finite • Univariate polynomials over F • Unique factorization of polynomials • P(a) = 0 iff (X − a) divides P(X) • A nonzero polynomial of degree d has at most d roots • 2 distinct polynomials of degree d agree at at most d points • Interpolation: • There exists a degree d polynomial with any desired values at d+1 given points (a small sketch follows below)
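The interpolation fact in action: a minimal Python sketch (not from the talk; the field size 101 and the sample points are illustrative) that recovers the value of the unique degree ≤ d polynomial through d+1 given points.

```python
# Minimal sketch: Lagrange interpolation over a small prime field,
# illustrating that d+1 point-value pairs determine a unique
# polynomial of degree <= d.

PRIME = 101  # a small prime; the field F is Z/101Z

def interpolate(points, x):
    """Evaluate at x the unique degree <= d polynomial through the
    given d+1 (a_i, v_i) pairs, working mod PRIME."""
    total = 0
    for i, (a_i, v_i) in enumerate(points):
        num, den = 1, 1
        for j, (a_j, _) in enumerate(points):
            if j != i:
                num = num * (x - a_j) % PRIME
                den = den * (a_i - a_j) % PRIME
        total = (total + v_i * num * pow(den, -1, PRIME)) % PRIME
    return total

# Three points determine a degree-2 polynomial; recover its value elsewhere.
pts = [(1, 4), (2, 9), (3, 16)]          # values of (X+1)^2 at 1, 2, 3
print(interpolate(pts, 5))               # 36, i.e. (5+1)^2 mod 101
```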
Sharing a secret • I have a secret x in {0,1}. • I want to “distribute” it amongst 10 friends, so that no single friend learns anything about the secret • But any 2 of them together can learn the secret. • Concretely • Want random variables y1, …, y10 s.t. • Distribution of each yi does not depend on x • yi, yj together determine x
Sharing a secret • Let F be a finite field • Let P(X) be a uniformly random degree d poly over F • Fact: For distinct a1, …, ad+1: • P(a1), P(a2), …, P(ad+1) are independent uniform random variables • [Shamir]: Let secret = x ∈ F • Pick P(X) uniform of degree 1, conditioned on P(0) = x • Shares: P(a1), P(a2), …, P(a10), for distinct nonzero a1, …, a10 • Secrecy: each P(ai) on its own is uniformly distributed. • Recoverability: any two shares P(ai), P(aj) jointly determine P, and hence P(0). (A sketch follows below.)
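A minimal sketch of the 2-out-of-10 scheme described above, assuming an illustrative small prime field; the function names are hypothetical.

```python
# Shamir sharing for the degree-1 case in the slides (2-out-of-10).
import random

PRIME = 101                   # prime field F_p, p > number of shares

def share(secret, n=10):
    """Pick a uniformly random degree-1 polynomial with value `secret`
    at 0 and hand out its evaluations at 1..n."""
    slope = random.randrange(PRIME)                  # uniform coefficient
    poly = lambda a: (secret + slope * a) % PRIME
    return [(a, poly(a)) for a in range(1, n + 1)]

def recover(share1, share2):
    """Any two shares determine the line, hence its value at 0."""
    (a, ya), (b, yb) = share1, share2
    slope = (ya - yb) * pow(a - b, -1, PRIME) % PRIME
    return (ya - slope * a) % PRIME

shares = share(secret=42)
print(recover(shares[3], shares[7]))   # 42, from any two shares
# A single share (a, P(a)) is uniform over F, so it reveals nothing.
```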
Secure Multiparty Computation • 3 parties A, B, C • 3 private inputs x, y, z in {0,1}^n. • Want to jointly compute f(x, y, z). • Without revealing anything to the parties • except f(x, y, z) itself! • Assume all parties follow the protocol honestly • Fun example: f(x, y, z) = x + y + z
Secure Multiparty Computation • Which functions f can be computed securely? • Thm [Ben-Or, Goldwasser, Wigderson 88] + [Chaum, Crepeau, Damgard 88]: All functions! • Main idea: • Try to end up with f(x, y, z) “secret shared” amongst the parties • Work towards this by using an arithmetic circuit computing f • Basic underlying fact: • every f has an arithmetic circuit.
Secure Multiparty Computation • Start by distributing the bits of x, y, z to all parties • Via secret sharing • Compute secret shares of each wire of the circuit bottom-up • Operations: • Addition: • P, Q are uniformly random degree 1 polynomials s.t. P(0) = b, Q(0) = c • Then R = P + Q is a uniformly random degree 1 polynomial s.t. R(0) = b + c • Each party can compute evaluations of R from its own evaluations of P, Q (see the sketch below) • Multiplication: • Bit more delicate • R = PQ is NOT a uniformly random degree 1 polynomial with R(0) = bc. • R has degree 2; so it is still determined by 3 evaluations • Need re-randomization + degree reduction
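A small sketch of the addition step only, over the same toy field as before: each party adds the two field elements it already holds, and any two of the resulting shares reconstruct b + c.

```python
# If each party holds P(a_i) and Q(a_i) for secrets b = P(0), c = Q(0),
# then locally adding its two shares gives a share of R = P + Q,
# whose secret is b + c. (Field size and secrets are illustrative.)
import random

PRIME = 101

def random_linear(secret):
    slope = random.randrange(PRIME)
    return lambda a: (secret + slope * a) % PRIME

b, c = 3, 5
Pf, Qf = random_linear(b), random_linear(c)

# Each party i just adds the two field elements it already holds.
r_shares = [(a, (Pf(a) + Qf(a)) % PRIME) for a in (1, 2, 3)]

# Reconstructing from any two shares of R yields b + c.
(a1, y1), (a2, y2) = r_shares[0], r_shares[1]
slope = (y1 - y2) * pow(a1 - a2, -1, PRIME) % PRIME
print((y1 - slope * a1) % PRIME)   # 8 = b + c
```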
General Philosophy I • Evaluations of a random degree d polynomial • (d+1)-wise independent • gives secrecy • Can represent polynomials by their evaluations: • Can add and multiply polynomials in this representation! • Addition and multiplication capture all computation • Polynomials™: they add and multiply
Lower bounds for constant depth circuits • AC0. • Constant depth circuits, polynomial size • AND, OR gates of unbounded fan-in • Furst-Saxe-Sipser: PARITY not in AC0. • AC0(PARITY) • Constant depth circuits, polynomial size • AND, OR, PARITY gates of unbounded fan-in • Razborov: • MAJORITY not in AC0(PARITY) • Beautiful approach based on polynomials • Smolensky: • Elegant simplification + generalizations
Razborov’s approximation • Key connection: Every function with small AC0(PARITY) circuits is computed by a “randomized polynomial” • If C has size s, depth d, there exists a distribution of polynomials P over F2 of degree (log(s/ε))^d s.t. for all x ∈ {0,1}^n, Pr[ C(x) ≠ P(x) ] ≤ ε.
Razborov’s approximation • If C has size s, depth d, there exists a distribution of polynomials P of degree (log(s/ε))^d s.t. for all x ∈ {0,1}^n, Pr[ C(x) ≠ P(x) ] ≤ ε. • Suffices to find a distribution of polynomials computing each single gate, with: • error ε/s • degree log(s/ε) = log(1/(ε/s)) • For PARITY, degree = 1 • For OR(x1, …, xn): • Take d = log(s/ε) random sets S1, S2, …, Sd ⊆ [n] • P(x) = 1 − ∏_{i=1}^{d} (1 − Σ_{j ∈ S_i} x_j), computed over F2 (a sketch follows below)
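A sketch of the OR construction above over F2, with illustrative parameters n and d; it estimates the error probability of a fresh random polynomial empirically.

```python
# Randomized degree-d polynomial for OR over F_2:
#   P(x) = 1 - prod_i (1 - sum_{j in S_i} x_j)   (all arithmetic mod 2).
# If OR(x) = 0 every inner sum is 0, so P(x) = 0.
# If OR(x) = 1 each inner sum is nonzero with prob 1/2, so the product
# survives (and P errs) with probability <= 2^{-d}.
import random

def sample_or_polynomial(n, d):
    """Pick d uniformly random subsets S_1..S_d of [n]; return x -> P(x)."""
    sets = [[j for j in range(n) if random.random() < 0.5] for _ in range(d)]
    def P(x):
        prod = 1
        for S in sets:
            prod = prod * ((1 - sum(x[j] for j in S)) % 2) % 2
        return (1 - prod) % 2
    return P

n, d = 20, 5
x = [0] * n; x[7] = 1                      # OR(x) = 1
errors = sum(sample_or_polynomial(n, d)(x) != 1 for _ in range(1000))
print(errors / 1000)                       # roughly 2^{-5} ~ 0.03
```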
Smolensky’s unapproximation • Majority cannot be computed by a distribution of low degree polynomials. • If it could, then there would be a low degree polynomial P s.t. Pr_x[ P(x) = MAJORITY(x) ] > 1 − ε • Versatility of MAJORITY: Every function f : {0,1}^n → {0,1} can be written as f(x) = g(x)·MAJORITY(x) + h(x), where g, h have degree about n/2. • Then P can be used in place of MAJORITY. • Contradiction, because then every function would get an approximation of too low a degree.
General Philosophy II • Polynomials are mini-programs • Not totally trivial; can express some interesting complexity classes • Nevertheless, easier to understand their power • Because they are a “normal form” (like CNFs) • Leads to lower bounds for interesting classes. • Polynomials™: They Can’t Do Much
Verifying proofs • The Proof Checking Revolution [early 90s] • IP = PSPACE • MIP = NEXP • The PCP Theorem • Polynomials played a starring role • In addition to universality of polynomials: • The evaluation table of a polynomial is robust. • Can check properties of a polynomial by just peeking at the evaluation table.
Testing equality of polynomials • Given evaluation tables of polynomials P, Q • (m variate) • Test if P = Q. • Algorithm (Schwartz-Zippel-DeMillo-Lipton): • Take a random x in F^m. • Check if P(x) = Q(x) • Basic lemma [Ore ’22, …, Schwartz, Zippel, DeMillo, Lipton ’70s, …]: • If P ≠ Q, and P, Q are degree d, then Pr[ P(x) = Q(x) ] ≤ d/|F| (a sketch follows below)
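A sketch of the random-evaluation test, treating P and Q as black-box evaluation oracles over an illustrative prime field; the names and parameters are hypothetical.

```python
# Randomized identity testing: compare P and Q at random points of F^m.
import random

PRIME = 2**31 - 1                       # a large prime field

def probably_equal(P, Q, m, trials=20):
    """Each trial catches P != Q except with probability <= d/|F|;
    repeating drives the error down exponentially."""
    for _ in range(trials):
        x = [random.randrange(PRIME) for _ in range(m)]
        if P(x) != Q(x):
            return False                # a witness: definitely unequal
    return True                         # equal with high probability

# (X1 + X2)^2 versus X1^2 + 2*X1*X2 + X2^2 over F_p: they always agree.
P = lambda x: pow(x[0] + x[1], 2, PRIME)
Q = lambda x: (x[0]*x[0] + 2*x[0]*x[1] + x[1]*x[1]) % PRIME
print(probably_equal(P, Q, m=2))        # True
```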
The PCP theorem • Any language L in NP has a Probabilistically Checkable Proof • There is a randomized poly time verifier V(x, y) s.t. • V makes only O(1) queries to the “proof” y • If x in L, then there exists y s.t.Pr[V(x,y) accepts] = 1 • If x not in L, then for all y:Pr[V(x,y) accepts] < 0.1
Algebraic proofs of the PCP theorem • Reduce to an algebraically structured NP-complete Constraint Satisfaction Problem (CSP) • We will use: “Grid-CSP” • Use polynomials to give a PCP for this CSP
An example: Grid-CSP • A CSP on the grid [A] x [A]. • Given a collection of constraints: Cij(b1, b2, b3, b4, b5) for each i, j in [A] • Want: • A function f: [A] x [A] → {0,1} s.t. Cij(f(i,j), f(i−1, j), f(i,j−1), f(i+1, j), f(i, j+1)) = 0 for each i, j in [A] • Standard NP proof: write all the values of f.
A PCP for Grid-CSP • Pick a prime p ≥ 100A • Let F be the field Fp (integers mod p) • We will use polynomials over F.
A PCP for Grid-CSP • The PCP: • Consider [A] ⊆ F • Consider {0,1} ⊆ F • We have f: [A]^2 → {0,1} • Low degree extension: • There is a 2-variable polynomial of degree < A in each variable: • P(X,Y) • s.t. P(i,j) = f(i,j) for each i, j in [A]. • PCP: Write down P(x, y) for each (x, y) in F^2. (A sketch of the extension follows below.) • Now: • To verify that P is the low degree extension of a satisfying assignment…
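A toy sketch of the low degree extension for a small grid, assuming illustrative values of A and p; it builds P from Lagrange basis polynomials on [A] = {1, …, A}.

```python
# Extend f: [A]^2 -> {0,1} to P: F^2 -> F, degree < A in each variable.
A, PRIME = 4, 401                           # grid side and a prime >= 100*A

def delta(i, x):
    """Lagrange basis: degree-(A-1) polynomial that is 1 at i, 0 on [A]\\{i}."""
    val = 1
    for k in range(1, A + 1):
        if k != i:
            val = val * (x - k) * pow(i - k, -1, PRIME) % PRIME
    return val

f = {(i, j): (i + j) % 2 for i in range(1, A + 1) for j in range(1, A + 1)}

def P(x, y):
    return sum(f[i, j] * delta(i, x) * delta(j, y)
               for i in range(1, A + 1) for j in range(1, A + 1)) % PRIME

print(P(2, 3), f[2, 3])      # the extension agrees with f on the grid
print(P(57, 200))            # ...and is defined at "illegal" points of F^2
```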
A PCP for Grid-CSP • Need to verify: • Given table is indeed close to a low degree polynomial P • P(i,j) in {0,1} for each i,j in [A] • Cij(P(i,j), P(i−1, j), P(i,j−1), P(i+1, j), P(i, j+1)) = 0, for each i,j in [A] • Express these in terms of polynomials: • Define B(X,Y) = P(X,Y) · (P(X,Y) − 1) • want B(i,j) = 0 for each i,j in [A] • Define constraint interpolating polynomial Q(X, Y, Z1, Z2, Z3, Z4, Z5) s.t.: Q(i, j, b1, b2, b3, b4, b5) = Cij(b1, b2, b3, b4, b5) • Define C(X,Y) = Q(X, Y, P(X, Y), P(X−1, Y), P(X, Y−1), P(X+1, Y), P(X, Y+1)) • want C(i,j) = 0 for each i, j in [A] • Both B(X,Y) and C(X,Y) are low degree polynomials • Want to verify that they are zero on the set [A] x [A]
A PCP for Grid-CSP • Basic problems we need to solve: • Low degree testing for polynomials: Given a function, verify that it is close to a low degree polynomial • Zero testing for polynomials: Given a function close to some polynomial Q(X,Y), verify that Q(i,j) = 0 for each i,j in [A].
Low degree testing for polynomials • Given access to f: F^m → F, want to test if it is close to degree d • Idea: • restrictions of low-degree multivariate polynomials to lines/planes/… give low-degree univariate/bivariate/… polynomials • Simple, natural test: • Pick a random line L in F^m. • Verify that f|L is degree d • Serious Thm [Rubinfeld-Sudan, …]: • If this test passes w.p. 0.99 • Then f agrees with some polynomial on 0.99 fraction of F^m. • Query complexity = n^{1/m} • In our example: O(A). (A sketch of the line test follows below.)
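A sketch of restricting to a random line and checking low degree, with toy parameters; here the check simply interpolates f|L from d+1 points and spot-checks agreement at a few more, rather than running the full analysis of the test.

```python
# Line restriction: x(t) = a + t*b; for a degree-d m-variate polynomial,
# t -> f(a + t*b) is a degree <= d univariate polynomial.
import random

PRIME, d, m = 401, 3, 2

def restrict_to_line(f, a, b):
    """Return t -> f(a + t*b), a univariate function on F."""
    return lambda t: f([(a[k] + t * b[k]) % PRIME for k in range(m)])

def looks_low_degree(g, d):
    """Interpolate g from points 0..d and check agreement at a few more."""
    pts = [(t, g(t)) for t in range(d + 1)]
    def interp(x):
        total = 0
        for i, (ti, vi) in enumerate(pts):
            num = den = 1
            for j, (tj, _) in enumerate(pts):
                if j != i:
                    num = num * (x - tj) % PRIME
                    den = den * (ti - tj) % PRIME
            total = (total + vi * num * pow(den, -1, PRIME)) % PRIME
        return total
    return all(interp(t) == g(t) for t in range(d + 1, d + 20))

f = lambda x: (x[0]**2 * x[1] + 5) % PRIME          # a degree-3 polynomial
a = [random.randrange(PRIME) for _ in range(m)]
b = [random.randrange(PRIME) for _ in range(m)]
print(looks_low_degree(restrict_to_line(f, a, b), d))   # True
```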
Zero testing for polynomials • “Combinatorial Nullstellensatz” [Alon, Ben-Sasson–Sudan] • Q(X,Y) vanishes on S x S if and only if: • ∃ U(X,Y), V(X,Y) of low degree s.t. Q(X,Y) = W(X) U(X,Y) + W(Y) V(X,Y), where W(T) = ∏_{a ∈ S} (T − a) • PCP: write down the evaluation tables of U, V • Verification: • Check that U, V are close to low degree • Pick a random point (x,y) in F^2. • Check that Q(x,y) = W(x) U(x, y) + W(y) V(x, y) (a toy sketch follows below)
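A toy sketch of the verifier's spot-check: the honest Q below is built to vanish on S x S (so U, V exist by construction), while the cheating Q is shifted by 1 on the grid, cannot be written in the required form, and fails the identity. All concrete polynomials here are illustrative choices, not the talk's.

```python
# Honest Q satisfies Q = W(X)*U + W(Y)*V for low-degree U, V; any such Q
# vanishes on S x S. A Q that is nonzero somewhere on S x S cannot, and a
# random-point check against the prover's tables catches it whp.
import random

PRIME = 401
S = range(1, 5)                               # S = [A] with A = 4

def W(t):                                     # vanishing polynomial of S
    val = 1
    for a in S:
        val = val * (t - a) % PRIME
    return val

# Honest case: Q vanishes on S x S, with prover tables U, V.
U = lambda x, y: (y + 1) % PRIME
V = lambda x, y: (x * x + 3) % PRIME
Q_honest = lambda x, y: (W(x) * U(x, y) + W(y) * V(x, y)) % PRIME

# Cheating case: Q_honest + 1 equals 1 at every grid point, so no valid
# U, V exist; against these tables the identity fails everywhere.
Q_cheat = lambda x, y: (Q_honest(x, y) + 1) % PRIME

x, y = random.randrange(PRIME), random.randrange(PRIME)
print(Q_honest(x, y) == (W(x)*U(x, y) + W(y)*V(x, y)) % PRIME)  # True
print(Q_cheat(x, y) == (W(x)*U(x, y) + W(y)*V(x, y)) % PRIME)   # False
print(all(Q_honest(i, j) == 0 for i in S for j in S))           # True
```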
Overall PCP • Proof length: ≈ A^2 log A bits • Verification with O(A) queries • Changing 2 variables to polylog(n) variables: • Proof: poly(A) bits long • Verification: polylog(n) queries • Full PCP Theorem needs two further ideas: • Proof composition • Constant query PCP (of arbitrary length)
General Philosophy III • Polynomials extend the domain of functions • They can be evaluated at “illegal” inputs • These extended values have a lot of local structure • Inconsistencies propagate everywhere • Enable local testing/decoding • Verifying a proof is about checking local consistency • Polynomials™: They’re local
Some things we didn’t see • The Permanent • Celebrity polynomial • #P complete function • Featured in many influential complexity results: • [Lipton]: Hardness amplification, Local decoding • [LFKN]: Interactive proofs for #P • Applications to coding theory and pseudorandomness • elegant code/pseudorandom object constructions • deadly applications to complexity theory • Extreme hardness amplification, pseudorandom generators
Some things we didn’t see • Applications to boolean functions • Fourier analysis, hypercontractivity, etc… • Approximate real degree vs. decision tree complexity • Quantum lower bounds • Polynomial analogues • Arithmetic circuits • Powerful arithmetic circuit lower bound techniques
Some things we didn’t see • (Just) beyond polynomials • Algebraic functions on algebraic curves (Algebraic-Geometric codes) • Often improve upon polynomial-based applications • Error-correcting codes better than random codes • Linear length PCPs • Epsilon-biased sets • Multiplicities • Often it helps to consider polynomials + their derivatives • Count not just #zeros, but also multiplicity of zeros • Randomness extractors, locally decodable codes, …