
CS151 Complexity Theory

Presentation Transcript


1. CS151 Complexity Theory, Lecture 11 (May 7, 2013)

2. Min-entropy
• General model of a physical source with k < n bits of hidden randomness: a set of 2^k strings in {0,1}^n, and the source samples a string uniformly from this set.
• Definition: a random variable X on {0,1}^n has min-entropy min_x -log(Pr[X = x]).
• Min-entropy k implies no string has weight (probability) more than 2^-k.
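To make the definition concrete, here is a minimal illustrative sketch (not from the lecture) that computes the min-entropy of an explicitly given distribution:

    import math

    def min_entropy(probs):
        """Min-entropy of a finite distribution: min over outcomes of -log2(p),
        which equals -log2(max probability)."""
        return -math.log2(max(p for p in probs if p > 0))

    # A source uniform over 4 of the 8 strings in {0,1}^3 has min-entropy 2:
    # no string has weight more than 2^-2.
    probs = [0.25, 0.25, 0.25, 0.25, 0, 0, 0, 0]
    print(min_entropy(probs))  # 2.0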

3. Extractor
• Extractor: universal procedure for "purifying" an imperfect source:
• E is efficiently computable
• truly random seed acts as a "catalyst"
[diagram: E takes a source string (sampled from a set of 2^k strings in {0,1}^n) plus a t-bit seed, and outputs m near-uniform bits]

4. Extractor
• "(k, ε)-extractor": for all X with min-entropy k:
• output fools all circuits C: |Pr_z[C(z) = 1] - Pr_{y, x←X}[C(E(x, y)) = 1]| ≤ ε
• distributions E(X, U_t), U_m are "ε-close" (L1 distance ≤ 2ε)
• Notice the similarity to PRGs:
• output of a PRG fools all efficient tests
• output of an extractor fools all tests
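As a side note (a small sketch, not on the slide): the maximum distinguishing advantage over all tests equals half the L1 distance, which is why advantage at most ε corresponds to L1 distance at most 2ε.

    def l1_distance(p, q):
        """L1 distance between two distributions given as dicts outcome -> probability."""
        support = set(p) | set(q)
        return sum(abs(p.get(s, 0) - q.get(s, 0)) for s in support)

    def max_test_advantage(p, q):
        """Best possible advantage |Pr_p[T accepts] - Pr_q[T accepts]| over all tests T,
        achieved by accepting exactly the outcomes where p exceeds q; equals l1_distance/2."""
        support = set(p) | set(q)
        return sum(max(p.get(s, 0) - q.get(s, 0), 0) for s in support)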

5. Extractors
• Goals:
                 good           best
  short seed     O(log n)       log n + O(1)
  long output    m = k^Ω(1)     m = k + t - O(1)
  many k's       k = n^Ω(1)     any k = k(n)
[diagram: E maps a source string (from 2^k strings in {0,1}^n) and a t-bit seed to m near-uniform bits]

6. Extractors
• a random function for E achieves the best parameters!
• but we need explicit constructions
• many known; often complex and technical
• optimal extractors still open
• Trevisan Extractor:
• insight: use the NW generator with the source string in place of the hard function
• this works (!!)
• proof slightly different from NW's, and easier

7. Trevisan Extractor
• Ingredients (δ > 0 and m are parameters):
• error-correcting code C: {0,1}^n → {0,1}^n' with distance (½ - ¼m^-4)n' and blocklength n' = poly(n)
• (log n', a = δ log n / 3)-design: S_1, S_2, …, S_m ⊆ {1…t = O(log n')}
• E(x, y) = C(x)[y|S_1] ◦ C(x)[y|S_2] ◦ … ◦ C(x)[y|S_m]
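As a concrete reading of the construction, here is an illustrative sketch in which the codeword C(x) and the design S_1, …, S_m are supplied as black-box inputs (their actual constructions are the ingredients above):

    def trevisan_extract(codeword, design, seed):
        """Output E(x, y): the i-th output bit is the bit of C(x) indexed by the
        restriction of the seed y to the i-th design set S_i.
        codeword : list of 0/1 bits, the codeword C(x) of length n'
        design   : list of sets S_1..S_m of seed positions (0-indexed here)
        seed     : list of 0/1 bits of length t = O(log n')"""
        out = []
        for S in design:
            index = 0
            for pos in sorted(S):              # y restricted to S, read as an integer
                index = (index << 1) | seed[pos]
            out.append(codeword[index % len(codeword)])  # wraparound only as a guard
        return out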

8. Trevisan Extractor
E(x, y) = C(x)[y|S_1] ◦ C(x)[y|S_2] ◦ … ◦ C(x)[y|S_m]
Theorem (T): E is an extractor for min-entropy k = n^δ, with
• output length m = k^(1/3)
• seed length t = O(log n)
• error ε ≤ 1/m
[diagram: the seed y, restricted to each design set, selects positions of the codeword C(x) to output]

9. Trevisan Extractor
• Proof:
• given X ⊆ {0,1}^n of size 2^k
• assume E fails to ε-pass a statistical test C: |Pr_z[C(z) = 1] - Pr_{x←X, y}[C(E(x, y)) = 1]| > ε
• distinguisher C ⇒ predictor P: Pr_{x←X, y}[P(E(x, y)_{1…i-1}) = E(x, y)_i] > ½ + ε/m

10. Trevisan Extractor
• Proof (continued):
• for at least ε/2 of the x ∈ X we have: Pr_y[P(E(x, y)_{1…i-1}) = E(x, y)_i] > ½ + ε/(2m)
• fix the seed bits outside of S_i to preserve the advantage: Pr_y'[P(E(x; y')_{1…i-1}) = C(x)[y']] > ½ + ε/(2m)
• as y' varies, for j ≠ i the j-th bit of E(x; y') varies over only 2^a values
• (m-1) tables of 2^a values supply E(x; y')_{1…i-1}

11. Trevisan Extractor
[diagram: the predictor P, given y' ∈ {0,1}^(log n') and the (m-1) tables, outputs C(x)[y'] with probability ½ + ε/(2m)]

12. Trevisan Extractor
• Proof (continued):
• the (m-1) tables of size 2^a constitute a description of a string that has ½ + ε/(2m) agreement with C(x)
• # of strings x with such a description?
• exp((m-1)2^a) = exp(n^(2δ/3)) = exp(k^(2/3)) strings
• Johnson Bound: each such string accounts for at most O(m^4) x's
• total #: O(m^4)·exp(k^(2/3)) << 2^k·(ε/2)
• contradiction
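For reference, the parameter arithmetic behind this count, using k = n^δ, m = k^(1/3) and a = δ log n / 3 from the construction:

    \[
    (m-1)\,2^{a} \;<\; m \cdot 2^{a} \;=\; n^{\delta/3} \cdot 2^{(\delta \log n)/3}
    \;=\; n^{\delta/3} \cdot n^{\delta/3} \;=\; n^{2\delta/3} \;=\; k^{2/3},
    \]
    \[
    \#\{x\} \;\le\; O(m^{4}) \cdot 2^{(m-1)2^{a}} \;\le\; O(m^{4}) \cdot 2^{k^{2/3}}
    \;\ll\; 2^{k} \cdot \tfrac{\varepsilon}{2} \quad \text{for large } n.
    \]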

13. Extractors
Trevisan: k = n^δ, t = O(log n), m = k^(1/3), ε = 1/m
• (k, ε)-extractor:
• E is efficiently computable
• ∀ X with min-entropy k, E fools all circuits C: |Pr_z[C(z) = 1] - Pr_{y, x←X}[C(E(x, y)) = 1]| ≤ ε
[diagram: E maps a source string (from 2^k strings in {0,1}^n) and a t-bit seed to m near-uniform bits]

14. Strong error reduction
• L ∈ BPP if there is a p.p.t. TM M:
  x ∈ L ⇒ Pr_y[M(x,y) accepts] ≥ 2/3
  x ∉ L ⇒ Pr_y[M(x,y) rejects] ≥ 2/3
• Want:
  x ∈ L ⇒ Pr_y[M(x,y) accepts] ≥ 1 - 2^-k
  x ∉ L ⇒ Pr_y[M(x,y) rejects] ≥ 1 - 2^-k
• We saw: repeat O(k) times
• n = O(k)·|y| random bits; 2^(n-k) bad strings
• Want: spend n = poly(|y|) random bits and achieve << 2^n/3 bad strings

15. Strong error reduction
• Better:
• E an extractor for min-entropy k = |y|^3 = n^δ, ε < 1/6
• pick random w ∈ {0,1}^n, run M(x, E(w, z)) for all z ∈ {0,1}^t, take the majority answer
• call w "bad" if maj_z M(x, E(w, z)) is incorrect; then |Pr_z[M(x, E(w, z)) = b] - Pr_y[M(x, y) = b]| ≥ 1/6
• extractor property: at most 2^k bad w
• n random bits; 2^(n^δ) bad strings
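A minimal sketch of this amplification, treating the BPP machine M and the extractor E as assumed black boxes:

    def amplify(M, E, x, w, t):
        """Extractor-based error reduction: run M(x, E(w, z)) for every t-bit
        seed z and return the majority answer.  Since t = O(log n), the loop
        makes only poly(n) calls."""
        accepts = 0
        for z in range(2 ** t):
            seed = [(z >> i) & 1 for i in range(t)]   # z written as t bits
            if M(x, E(w, seed)):
                accepts += 1
        return 2 * accepts > 2 ** t                    # majority vote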

16. RL
• Recall: probabilistic Turing Machine
• deterministic TM with an extra tape for "coin flips"
• RL (Random Logspace)
• L ∈ RL if there is a probabilistic logspace TM M:
  x ∈ L ⇒ Pr_y[M(x,y) accepts] ≥ ½
  x ∉ L ⇒ Pr_y[M(x,y) rejects] = 1
• important detail #1: only one-way access to the coin-flip tape is allowed
• important detail #2: the machine is explicitly required to run in polynomial time

17. RL
• L ⊆ RL ⊆ NL ⊆ SPACE(log^2 n)
• Theorem (Saks-Zhou): RL ⊆ SPACE(log^(3/2) n)
• Belief: L = RL (open problem)

18. RL
L ⊆ RL ⊆ NL
• Natural problem, Undirected STCONN: given an undirected graph G = (V, E) and nodes s, t, is there a path from s to t?
• Theorem: USTCONN ∈ RL. (Recall: STCONN is NL-complete.)

19. Undirected STCONN
• Proof sketch (in Papadimitriou):
• add a self-loop to each vertex (for technical reasons)
• start at s, take a random walk of 2|V||E| steps, accept if t is ever visited
• Lemma: the expected return time for any node i is 2|E|/d_i
• suppose s = v_1, v_2, …, v_n = t is a path
• expected time from v_i to v_{i+1} is (d_i/2)(2|E|/d_i) = |E|
• expected time to reach v_n ≤ |V||E|
• by Markov, Pr[fail to reach t in 2|V||E| steps] ≤ ½
• Reingold 2005: USTCONN ∈ L
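An illustrative sketch of the walk (assuming the graph is given as an adjacency map with the self-loops already added):

    import random

    def ustconn_random_walk(adj, s, t):
        """Accept if a random walk of 2|V||E| steps from s ever visits t.
        Accepting is always correct (one-sided error), matching the RL definition."""
        num_edges = sum(len(nbrs) for nbrs in adj.values()) // 2
        steps = 2 * len(adj) * num_edges
        v = s
        for _ in range(steps):
            if v == t:
                return True
            v = random.choice(adj[v])   # move to a uniformly random neighbour
        return v == t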

20. A motivating question
• Central problem in logic synthesis: given a Boolean circuit C and an integer k, is there a circuit C' of size at most k that computes the same function C does?
• Complexity of this problem?
• NP-hard? in NP? in coNP? in PSPACE?
• complete for any of these classes?
[diagram: a circuit over inputs x_1, x_2, x_3, …, x_n]

21. Oracle Turing Machines
• Oracle Turing Machine (OTM):
• multitape TM M with a special "query" tape
• special states q_?, q_yes, q_no
• on input x, with oracle language A:
• M^A runs as usual, except…
• when M^A enters state q_?:
  y = contents of the query tape
  y ∈ A ⇒ transition to q_yes
  y ∉ A ⇒ transition to q_no

22. Oracle Turing Machines
• Nondeterministic OTM:
• defined in the same way (transition relation, rather than function)
• the oracle is like a subroutine, or a function in your favorite programming language, but each call counts as a single step
• e.g.: given φ_1, φ_2, …, φ_n, are an even number of them satisfiable?
• a poly-time OTM solves this with a SAT oracle
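A sketch of that example, with the SAT oracle as an assumed black box charged one step per call:

    def even_number_satisfiable(formulas, sat_oracle):
        """Decide whether an even number of the given formulas are satisfiable,
        using n calls to the (unit-cost) SAT oracle."""
        count = sum(1 for phi in formulas if sat_oracle(phi))
        return count % 2 == 0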

23. Oracle Turing Machines
Shorthand #1: applying oracles to entire complexity classes:
• complexity class C, language A
• C^A = {L : L is decided by an OTM M with oracle A, with M "in" C}
• example: P^SAT

24. Oracle Turing Machines
Shorthand #2: using complexity classes as oracles:
• OTM M, complexity class C
• M^C decides language L if for some language A ∈ C, M^A decides L
• Both together: C^D = languages decided by an OTM "in" C with an oracle language from D
• exercise: show P^SAT = P^NP

25. The Polynomial-Time Hierarchy
• can define lots of complexity classes using oracles
• the classes on the next slide stand out:
• they have natural complete problems
• they have a natural interpretation in terms of alternating quantifiers
• they help us state certain consequences and containments (more later)

26. The Polynomial-Time Hierarchy
Σ_0 = Π_0 = P
Δ_1 = P^P          Σ_1 = NP           Π_1 = coNP
Δ_2 = P^NP         Σ_2 = NP^NP        Π_2 = coNP^NP
Δ_{i+1} = P^Σ_i    Σ_{i+1} = NP^Σ_i   Π_{i+1} = coNP^Σ_i
Polynomial Hierarchy: PH = ∪_i Σ_i

27. The Polynomial-Time Hierarchy
Σ_0 = Π_0 = P;  Δ_{i+1} = P^Σ_i,  Σ_{i+1} = NP^Σ_i,  Π_{i+1} = coNP^Σ_i
• Example:
• MIN CIRCUIT: given a Boolean circuit C and an integer k, is there a circuit C' of size at most k that computes the same function C does?
• MIN CIRCUIT ∈ Σ_2
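Written out with the alternating-quantifier characterization from later in the lecture, the Σ_2 form of MIN CIRCUIT is:

    \[
    \text{MIN CIRCUIT} \;=\; \{\, (C, k) \;:\; \exists\, C' \text{ of size at most } k
    \;\; \forall\, x \;\; C'(x) = C(x) \,\}
    \]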

28. The Polynomial-Time Hierarchy
Σ_0 = Π_0 = P;  Δ_{i+1} = P^Σ_i,  Σ_{i+1} = NP^Σ_i,  Π_{i+1} = coNP^Σ_i
• Example:
• EXACT TSP: given a weighted graph G and an integer k, is the k-th bit of the length of the shortest TSP tour in G a 1?
• EXACT TSP ∈ Δ_2
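One standard way to see this (not spelled out on the slide): binary-search the optimal tour length using an NP oracle for "is there a tour of length at most L?", then read off the k-th bit. A sketch, with the oracle as an assumed black box:

    def exact_tsp_bit(tour_at_most, max_len, k):
        """P^NP sketch for EXACT TSP.  tour_at_most(L) answers "does G have a
        TSP tour of length <= L?" (an NP oracle call).  Binary search uses
        O(log max_len) oracle calls, i.e. polynomially many."""
        lo, hi = 0, max_len
        while lo < hi:
            mid = (lo + hi) // 2
            if tour_at_most(mid):
                hi = mid
            else:
                lo = mid + 1
        opt = lo                        # length of the shortest tour
        return bool((opt >> k) & 1)     # k-th bit (k = 0 taken as least significant)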

29. The PH
[diagram: P ⊆ NP, coNP ⊆ Δ_2 ⊆ Σ_2, Π_2 ⊆ Δ_3 ⊆ Σ_3, Π_3 ⊆ … ⊆ PH ⊆ PSPACE ⊆ EXP]
• PSPACE: generalized geography, 2-person games…
• 3rd level: VC dimension…
• 2nd level: MIN CIRCUIT, BPP…
• 1st level: SAT, UNSAT, factoring, etc…

30. Useful characterization
• Recall: L ∈ NP iff expressible as L = { x | ∃ y, |y| ≤ |x|^k, (x, y) ∈ R } where R ∈ P.
• Corollary: L ∈ coNP iff expressible as L = { x | ∀ y, |y| ≤ |x|^k, (x, y) ∈ R } where R ∈ P.
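A standard concrete instance of the NP template (not on the slide): SAT, with the witness being a satisfying assignment:

    \[
    \mathrm{SAT} \;=\; \{\, \varphi \;|\; \exists\, a,\ |a| \le |\varphi|,\ (\varphi, a) \in R \,\},
    \qquad R \;=\; \{ (\varphi, a) : a \text{ satisfies } \varphi \} \in \mathrm{P}.
    \]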

31. Useful characterization
• Theorem: L ∈ Σ_i iff expressible as L = { x | ∃ y, |y| ≤ |x|^k, (x, y) ∈ R } where R ∈ Π_{i-1}.
• Corollary: L ∈ Π_i iff expressible as L = { x | ∀ y, |y| ≤ |x|^k, (x, y) ∈ R } where R ∈ Σ_{i-1}.

32. Useful characterization
Theorem: L ∈ Σ_i iff expressible as L = { x | ∃ y, |y| ≤ |x|^k, (x, y) ∈ R }, where R ∈ Π_{i-1}.
• Proof of Theorem:
• induction on i
• base case (i = 1) on the previous slide
• (⇐) we know Σ_i = NP^Σ_{i-1} = NP^Π_{i-1}
• guess y, ask the oracle whether (x, y) ∈ R

33. Useful characterization
Theorem: L ∈ Σ_i iff expressible as L = { x | ∃ y, |y| ≤ |x|^k, (x, y) ∈ R }, where R ∈ Π_{i-1}.
(⇒)
• given L ∈ Σ_i = NP^Σ_{i-1}, decided by an ONTM M running in time n^k
• try: R = { (x, y) : y describes a valid path of M's computation leading to q_accept }
• but how do we recognize a valid computation path when it depends on the results of oracle queries?

34. Useful characterization
Theorem: L ∈ Σ_i iff expressible as L = { x | ∃ y, |y| ≤ |x|^k, (x, y) ∈ R }, where R ∈ Π_{i-1}.
• try: R = { (x, y) : y describes a valid path of M's computation leading to q_accept }
• valid path = step-by-step description including the correct yes/no answer for each A-oracle query z_j (A ∈ Σ_{i-1})
• verify the "no" queries in Π_{i-1}: e.g. z_1 ∉ A ∧ z_3 ∉ A ∧ … ∧ z_8 ∉ A
• for each "yes" query z_j: ∃ w_j, |w_j| ≤ |z_j|^k with (z_j, w_j) ∈ R' for some R' ∈ Π_{i-2}, by induction
• for each "yes" query z_j, put w_j in the description of the path y

35. Useful characterization
Theorem: L ∈ Σ_i iff expressible as L = { x | ∃ y, |y| ≤ |x|^k, (x, y) ∈ R }, where R ∈ Π_{i-1}.
• a single language R in Π_{i-1}: (x, y) ∈ R ⇔ all "no" z_j are not in A, and all "yes" z_j have (z_j, w_j) ∈ R', and y is a path leading to q_accept
• Note: the AND of polynomially many Π_{i-1} predicates is in Π_{i-1}

36. Alternating quantifiers
Nicer, more usable version:
• L ∈ Σ_i iff expressible as L = { x | ∃y_1 ∀y_2 ∃y_3 … Qy_i (x, y_1, y_2, …, y_i) ∈ R } where Q = ∀/∃ if i is even/odd, and R ∈ P
• L ∈ Π_i iff expressible as L = { x | ∀y_1 ∃y_2 ∀y_3 … Qy_i (x, y_1, y_2, …, y_i) ∈ R } where Q = ∃/∀ if i is even/odd, and R ∈ P

37. Alternating quantifiers
• Proof:
• (⇒) induction on i
• base case: true for Σ_1 = NP and Π_1 = coNP
• consider L ∈ Σ_i:
  L = { x | ∃y_1 (x, y_1) ∈ R' }, for R' ∈ Π_{i-1}
  L = { x | ∃y_1 ∀y_2 ∃y_3 … Qy_i ((x, y_1), y_2, …, y_i) ∈ R }
  L = { x | ∃y_1 ∀y_2 ∃y_3 … Qy_i (x, y_1, y_2, …, y_i) ∈ R }
• same argument for L ∈ Π_i
• (⇐) exercise

38. Alternating quantifiers
Pleasing viewpoint: "∃∀∃∀∃∀∃…"
[diagram: P ⊆ NP ("∃"), coNP ("∀") ⊆ Δ_2 ⊆ Σ_2 ("∃∀"), Π_2 ("∀∃") ⊆ Δ_3 ⊆ Σ_3 ("∃∀∃"), Π_3 ("∀∃∀") ⊆ … ⊆ Σ_i ("∃∀∃…"), Π_i ("∀∃∀…") ⊆ PH (constant # of alternations) ⊆ PSPACE (poly(n) alternations)]
