Pseudorandomness from Shrinkage




Presentation Transcript


  1. Pseudorandomness from Shrinkage David Zuckerman University of Texas at Austin Joint with Russell Impagliazzo and Raghu Meka

  2. Randomness and Computing • Randomness extremely useful in computing. • Randomized algorithms • Monte Carlo simulations • Cryptography • Distributed computing • Problem: high-quality randomness expensive.

  3. What is minimal randomness requirement? • Can we eliminate randomness completely? • If not: • Can we minimize quantity of randomness? • Can we minimize quality of randomness? • What does this mean?

  4. What is minimal randomness requirement? • Can we eliminate randomness completely? • If not: • Can we minimize quantity of randomness? • Pseudorandom generator • Can we minimize quality of randomness? • Randomness extractor

  5. Pseudorandom Numbers • Computers rely on pseudorandom generators (PRGs): a PRG stretches a short random string into a long "random-enough" string. • What does "random enough" mean?

  6. Modern Approach to PRGs [Blum-Micali 1982, Yao 1982] • Require the PRG to "fool" all efficient algorithms: Alg(random) and Alg(pseudorandom) exhibit ≈ the same behavior.

  7. Which efficient algorithms? • Most functions fool all polynomial-time circuits. • Construct explicitly? • Poly-time PRG fooling all polynomial-time circuits implies NP≠P. • So either: • Make unproven assumption. • Try to fool interesting subclasses of algorithms.

  8. Two Major Challenges • Prove circuit lower bounds. • EXP does not have poly-size circuits. • Derandomize algorithms. • Hardness vs. Randomness paradigm • (1) implies (2) [Nisan-Wigderson, BFNW,…] • Almost equivalent [Kabanets-Impagliazzo …]

  9. Pseudorandom Generators • A PRG stretches a d-bit random seed into an n-bit pseudorandom string. • PRG fools class F of functions if |Pr[f(U_n)=1] - Pr[f(PRG(U_d))=1]| ≤ ε. • Cryptography: e.g., F = BPTIME(n^{log n}). • Equivalent to one-way functions [HILL]. • Derandomizing BPP: F = n^c-size circuits. • Need unproven lower bound assumptions. • What F, d without unproven assumptions?

  10. Pseudorandom Generators • PRG fools class F of functions if |Pr[f(U_n)=1] - Pr[f(PRG(U_d))=1]| ≤ ε. • A PRG fooling {f | size_M(f) ≤ s} with seed length s^{1/c} implies g in NP with size_M(g) ≥ ≈ n^c. • Can we achieve the converse: does g in P with size_M(g) ≥ n^c imply a PRG with seed length ≈ s^{1/c}? • Previous work gives nothing in this case.

  11. New Results • Construct such near-optimal PRGs if the lower bound is proved via "shrinkage." • Obtain the following seed lengths to fool size s, error = 1/poly: • Formulas over {∨,∧,NOT}: s^{1/3+o(1)} • Formulas over arbitrary basis: s^{1/2+o(1)} • Read-once formulas over {∨,∧,NOT}: s^{0.234…} • Branching programs: s^{1/2+o(1)}

  12. Previous Work • Seed length (1-α)n fooling read-once formulas and read-once branching programs of width 2^{αn}, for α > 0 a small enough constant [Bogdanov, Papakonstantinou, Wan]. • For ROBPs reading bits in a known order, seed length O(log^2 n) [Nisan, …].

  13. Random Restrictions • Choose a random restriction ρ, fraction p unset. • E[size(f|ρ)] ≤ p·size(f), where size(formula) = # leaves. • Whp size(f|ρ) ≤ 2p·size(f). • Holds even if ρ is chosen k-wise independently.
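
To make the restriction operation concrete, here is a minimal executable sketch (not from the talk): it applies a truly random restriction to a toy {∨,∧,NOT}-formula and reports the shrunken size. The tuple-based formula encoding, the `restrict` simplifier, and the balanced-tree example are illustrative choices only.

```python
import random

# A formula over {AND, OR, NOT} as a nested tuple:
#   ('var', i) | ('not', f) | ('and', f, g) | ('or', f, g)

def size(f):
    """Formula size = number of leaves (variable occurrences)."""
    return 1 if f[0] == 'var' else sum(size(sub) for sub in f[1:])

def restrict(f, rho):
    """Apply restriction rho (dict: var -> 0, 1, or '*') and simplify.
    Returns a simplified formula, or the constant 0/1 if it collapses."""
    if f[0] == 'var':
        v = rho.get(f[1], '*')
        return f if v == '*' else v
    if f[0] == 'not':
        g = restrict(f[1], rho)
        return (1 - g) if g in (0, 1) else ('not', g)
    a, b = restrict(f[1], rho), restrict(f[2], rho)
    if f[0] == 'and':
        if 0 in (a, b): return 0
        if a == 1: return b
        if b == 1: return a
        return ('and', a, b)
    if 1 in (a, b): return 1            # remaining case: 'or'
    if a == 0: return b
    if b == 0: return a
    return ('or', a, b)

def random_restriction(n, p):
    """Leave each variable unset ('*') with prob p, else fix it to a random bit."""
    return {i: ('*' if random.random() < p else random.randint(0, 1)) for i in range(n)}

if __name__ == '__main__':
    # Balanced read-once AND/OR tree on n = 2**10 variables.
    def tree(lo, hi, depth=0):
        if hi - lo == 1:
            return ('var', lo)
        mid = (lo + hi) // 2
        op = 'and' if depth % 2 == 0 else 'or'
        return (op, tree(lo, mid, depth + 1), tree(mid, hi, depth + 1))
    n, p = 2 ** 10, 0.1
    f = tree(0, n)
    restricted = [restrict(f, random_restriction(n, p)) for _ in range(200)]
    avg = sum(size(g) if g not in (0, 1) else 0 for g in restricted) / len(restricted)
    print(f"size(f) = {size(f)}, p = {p}, avg size(f|rho) = {avg:.1f}")
```

On a read-once AND/OR tree the average restricted size typically comes out well below p·size(f); quantifying that extra shrinkage is exactly the point of the next slide.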

  14. Shrinkage Exponent • Random ρ, fraction p unset. Shrinkage Γ: E[size(f|ρ)] = O(p^Γ s). • Example: formulas. • Formulas over arbitrary basis: Γ = 1. • Formulas over DM = {∨,∧,NOT}: Γ = 2 [Subbotovskaya '61, …, Hastad '93] • Read-once formulas over DM: Γ = 3.27… [Paterson-Zwick '91, Hastad-Razborov-Yao '95] • General circuits: Γ = 0.

  15. Branching Programs • Layered, ordered, read-once BPs (width w, n+1 layers, ending in acc/rej nodes) are what is needed for PRGs for Space. • Size = # edges ≤ 2wn. • Γ = 1: the size of the shrunken BP is proportional to |{unfixed variables}|. • |{layered, ordered ROBPs}| ≤ w^{2wn}. • We consider arbitrary BPs, reading bits in arbitrary order.

  16. PRGs from Shrinkage • Random ρ, fraction p unset. Shrinkage Γ: E[size(f|ρ)] = O(p^Γ s). • Shrinkage Γ ⇒ n^{Γ+1}/polylog(n) lower bounds [Andreev]. • Main theorem: high-probability shrinkage Γ w.r.t. pseudorandom restrictions gives a PRG with seed length s^{1/(Γ+1)+o(1)}. • Showing shrinkage w.r.t. pseudorandom restrictions is nontrivial when Γ ≠ 1.
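
A back-of-the-envelope check of the "shrinkage ⇒ lower bounds" arrow, using parity instead of Andreev's function (so the exponent comes out as Γ rather than Γ+1); this is a standard argument, not taken from the slides.

```latex
Take $f = \mathrm{PARITY}_n$ and a random restriction $\rho$ with $p = c/n$: with
constant probability some variable stays free, so $f|_\rho$ is non-constant and
$L(f|_\rho)\ge 1$.  Shrinkage then gives
\[
  \Omega(1) \;\le\; \mathbb{E}\bigl[L(f|_\rho)\bigr]
  \;=\; O\!\bigl(p^{\Gamma}L(f)\bigr)
  \;=\; O\!\bigl(L(f)/n^{\Gamma}\bigr)
  \quad\Longrightarrow\quad L(f) \;=\; \Omega(n^{\Gamma}).
\]
Andreev's function improves the exponent to $\Gamma+1-o(1)$.  The matching seed
lengths $s^{1/(\Gamma+1)+o(1)}$ of slide 11: $\Gamma=2 \Rightarrow s^{1/3+o(1)}$,
$\Gamma=1 \Rightarrow s^{1/2+o(1)}$, $\Gamma=3.27\ldots \Rightarrow s^{0.234\ldots}$.
```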

  17. Outline • Background on Randomness Extractors • New Theorem about Old PRG • New PRG • Correctness Proof • Pseudorandom Restrictions • Conclusions

  18. Weak Random Source […, CG '85, Z '90] • Random variable X on {0,1}^r. • General model: min-entropy H∞(X) ≥ k. • Flat source: uniform on a set A ⊆ {0,1}^r with |A| ≥ 2^k.
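
For reference, the standard min-entropy definition the slide alludes to:

```latex
\[
  H_\infty(X) \;=\; \min_{x}\ \log_2\!\frac{1}{\Pr[X=x]},
  \qquad\text{so } H_\infty(X)\ge k \iff \Pr[X=x]\le 2^{-k}\ \text{for every } x.
\]
```

A flat source uniform on A with |A| ≥ 2^k has H∞(X) = log_2|A| ≥ k, and (for integer k) every source of min-entropy ≥ k is a convex combination of flat sources, which is why the flat case suffices.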

  19. How Weak Sources Arise in PRGs • Condition on information revealed so far, e.g., a TM configuration. • Uniform X in {0,1}^r, f: {0,1}^r → {0,1}^b. • f regular: H∞(X | f(X)=a) = r - b. • Any f: Pr_{a=f(X')}[H∞(X | f(X)=a) ≥ r - b - Δ] ≥ 1 - 2^{-Δ}.
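
The last bullet is stated without proof on the slide; the standard argument is short:

```latex
Call $a \in \{0,1\}^b$ \emph{bad} if $\Pr[f(X)=a] < 2^{-b-\Delta}$; the bad
outcomes have total mass $< 2^{b}\cdot 2^{-b-\Delta} = 2^{-\Delta}$.  For a good
$a$ and any $x$ with $f(x)=a$,
\[
  \Pr[X=x \mid f(X)=a] \;=\; \frac{2^{-r}}{\Pr[f(X)=a]} \;\le\; 2^{-(r-b-\Delta)},
\]
so $H_\infty(X \mid f(X)=a) \ge r-b-\Delta$ for all but a $2^{-\Delta}$ fraction
(in mass) of the outcomes $a$.
```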

  20. Goal: Extract Randomness • Want Ext: {0,1}^r → {0,1}^m such that for every source X with H∞(X) ≥ k, Ext(X) is within statistical error ε of uniform. • Problem: impossible, even for k = r-1, m = 1, ε < 1/2.

  21. Impossibility Proof • Suppose f: {0,1}^r → {0,1} satisfies: for all sources X with H∞(X) ≥ r-1, f(X) ≈ U. • Take X uniform on f^{-1}(0) (or on f^{-1}(1), whichever is larger).
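
The arithmetic behind the picture (a standard completion of the slide's sketch):

```latex
WLOG $|f^{-1}(0)| \ge 2^{r-1}$ (otherwise use $f^{-1}(1)$).  Let $X$ be uniform
on $f^{-1}(0)$.  Then $H_\infty(X) \ge r-1$, yet $f(X)\equiv 0$, so
\[
  \bigl|\Pr[f(X)=1] - \Pr[U_1=1]\bigr| \;=\; \Bigl|\,0-\tfrac12\,\Bigr| \;=\; \tfrac12 ,
\]
i.e.\ no deterministic $f$ achieves error $\varepsilon<1/2$ on all sources of
min-entropy $r-1$.
```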

  22. Randomness Extractor: short seed [Nisan-Z '93, …, Guruswami-Umans-Vadhan '07] • Ext: {0,1}^r × {0,1}^d → {0,1}^m, with a d = O(log(r/ε))-bit uniformly random seed Y. • For every source X with H∞(X) ≥ k: Ext(X, Y) is within statistical error ε of uniform, with output length m = .99k.

  23. Extractor-Based PRG for Read-Once Branching Programs [Nisan-Z '93] • Basic PRG: G(x, y_1,…,y_t) = Ext(x,y_1)…Ext(x,y_t). • Parameters: r = |x| = 2√n, d = |y_i| = O(log n), t = m = |Ext(x,y_i)| = √n.
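
A minimal executable sketch of the shape of this generator (not the paper's implementation). The extractor below is a toy random-linear-map extractor, justified by the Leftover Hash Lemma but with a seed of m·r bits rather than the O(log n) seeds assumed on the slide [GUV '07]; all names and parameters are illustrative.

```python
import random

def hash_ext(x: int, y: random.Random, r: int, m: int) -> int:
    """Toy extractor: a random GF(2)-linear map of the r-bit input x to m bits
    (a universal hash family, hence an extractor by the Leftover Hash Lemma,
    but with an m*r-bit seed instead of the O(log(r/eps)) bits assumed above)."""
    out = 0
    for i in range(m):
        row = y.getrandbits(r)                          # random row of the matrix
        out |= (bin(row & x).count('1') & 1) << i       # inner product mod 2
    return out

def nz_prg(n: int, seed: int = 0) -> int:
    """Nisan-Zuckerman-shaped generator: Ext(x,y_1)...Ext(x,y_t) with
    t = m = sqrt(n) blocks and |x| = 2*sqrt(n), as on slide 23.
    Assumes n is a perfect square for simplicity."""
    m = t = int(round(n ** 0.5))
    r = 2 * m
    master = random.Random(seed)
    x = master.getrandbits(r)                           # the weak-source input
    out = 0
    for _ in range(t):
        y_i = random.Random(master.getrandbits(64))     # fresh extractor seed
        out = (out << m) | hash_ext(x, y_i, r, m)       # append next m-bit block
    return out

if __name__ == '__main__':
    print(format(nz_prg(64), '064b'))
```

The point is only the structure G(x, y_1,…,y_t) = Ext(x,y_1)…Ext(x,y_t): one weak-source string x reused across t independently seeded extractor calls.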

  24. PRG for Ordered Read-Once BPs • G(x, y_1,…,y_t) = Ext(x,y_1)…Ext(x,y_t). • Condition on the node v (in a width-w, (n+1)-layer BP) reached after reading up through Ext(X,Y_{i-1}). • Whp H∞(X | reach v) ≥ |x| − log w − Δ. • Hence (Ext(X,Y_i) | reach v) ≈ uniform.

  25. New: Same PRG works if bits read in any order • z_1, z_2, …, z_m can appear anywhere. • Still, after fixing all z_i, i > m, the restricted function is a ROBP on z_1, z_2, …, z_m read in the same order as in the original ROBP.

  26. New: Same PRG works if bits read in any order • Still, after fixing all z_i, i > m, the restricted function is a ROBP on z_1, z_2, …, z_m read in the same order as in the original ROBP. • Information = lg(# restricted functions) = lg(w^{2wm}) = 2wm·lg w.

  27. New: Works if bits read in any order • PRG: G(x, y_1,…,y_t) = Ext(x,y_1)…Ext(x,y_t) = z_1…z_n. • The BP could read in order z_{12}, z_7, z_8, … • D = distribution of PRG output, U = Unif({0,1}^n). • Suppose |Pr[f(D)=1] − Pr[f(U)=1]| > δ. • Let Z_i = Ext(X,Y_i), U_i = Unif({0,1}^m). • Z_1 = z_1 z_2 … z_m, Z_2 = z_{m+1} … z_{2m}, … • Bits in Z_i can appear anywhere.

  28. New: Works if bits read in any order • PRG: G(x, y_1,…,y_t) = Ext(x,y_1)…Ext(x,y_t). • D = distribution of PRG output, U = Unif({0,1}^n). • Suppose |Pr[f(D)=1] − Pr[f(U)=1]| > δ. • Let Z_i = Ext(X,Y_i), U_i = Unif({0,1}^m). • Hybrid argument. • Let D_i = (U_1,…,U_i,Z_{i+1},…,Z_t). D_0 = D, D_t = U. • Exists i: |Pr[f(D_i)=1] − Pr[f(D_{i-1})=1]| > δ/t. • Changing Z_i = Ext(X,Y_i) to U_i changes Pr[accept].
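
The "exists i" step spelled out (a standard telescoping / triangle-inequality calculation):

```latex
\[
  \delta \;<\; \bigl|\Pr[f(D_0)=1]-\Pr[f(D_t)=1]\bigr|
  \;\le\; \sum_{i=1}^{t}\bigl|\Pr[f(D_{i-1})=1]-\Pr[f(D_i)=1]\bigr| ,
\]
so some index $i$ contributes more than $\delta/t$.  Since $D_{i-1}$ and $D_i$
differ only in block $i$ ($Z_i=\mathrm{Ext}(X,Y_i)$ versus $U_i$), that block is
where the distinguishing advantage lives.
```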

  29. New: Works if bits read in any order • Exists i: |Pr[f(D_i)=1] − Pr[f(D_{i-1})=1]| > δ/t. • Changing Z_i = Ext(X,Y_i) to U_i changes Pr[accept]. • Consider ρ = (Z_1,…,Z_{i-1}, **…*, U_{i+1},…,U_t). • Then g = f|ρ is a ROBP on m bits. • f(D_i) = g(Z_i), f(D_{i-1}) = g(U_i). Goal: whp g(Z_i) ≈ g(U_i). • Only w^{2wm} possibilities for g. • Whp, H∞(X | G=g) ≥ r − 2mw·log w − Δ. • Whp, conditioned on G=g, Ext(X,Y_i) ≈ U_i.

  30. General Branching Programs • Even a PRG for unordered ROBPs is new. • Our seed length is O(√(wn)·log n). • Previous was (1-α)n [Bogdanov, Papakonstantinou, Wan]. • Known order: O(log^2 n) [Nisan, …]. • What if not read-once? • Some variables could be read many times. • Pseudorandomly permute variables before the construction. • Gives seed length size(f)^{1/2+o(1)}. • What about formulas? A general reduction?

  31. General PRG Construction • Assume we have pseudorandom restrictions which give shrinkage Γ whp. ρ_1 = 0 1 * 1 1 0 1 1 * 0 0 1 0 * 0 1 0 0 1 1 1 ρ_2 = 0 0 1 0 1 0 * 0 1 1 0 1 * 0 1 1 0 * * 1 0 … ρ_t = * 0 1 0 1 1 * 1 * 0 0 1 0 0 0 1 * 0 1 1 1 • Set t = c(log n)/p so that whp every column has a *.

  32. General PRG Construction ρ_1 = 0 1 * 1 1 0 1 1 * 0 0 1 0 * 0 1 0 0 1 1 1 ρ_2 = 0 0 1 0 1 0 * 0 1 1 0 1 * 0 1 1 0 * * 1 0 … ρ_t = * 0 1 0 1 1 * 1 * 0 0 1 0 0 0 1 * 0 1 1 1 • Choose X, Y_1,…,Y_t randomly. • Replace the *'s in the i-th row with Ext(X,Y_i). • PRG output = XOR of the resulting strings.
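
A structural sketch of slides 31-32 (illustration only): the restrictions below are sampled truly at random, whereas the actual construction requires pseudorandom (limited-independence) restrictions as on slide 36, and `toy_ext` is the same kind of stand-in hash extractor used earlier, not a short-seed extractor.

```python
import random

def toy_ext(x: int, y: random.Random, r: int, m: int) -> int:
    """Stand-in extractor: random GF(2)-linear map of the r-bit input x to m bits."""
    return sum((bin(y.getrandbits(r) & x).count('1') & 1) << i for i in range(m))

def shrinkage_prg(n: int, p: float, r: int, seed: int = 0) -> list:
    """PRG of slides 31-32: t ~ c log(n)/p restriction rows; the *'s in row i are
    filled with Ext(x, y_i); the output is the coordinate-wise XOR of the rows.
    Restrictions are sampled truly at random here purely for illustration --
    the real construction needs them pseudorandom (slide 36)."""
    master = random.Random(seed)
    t = max(1, int(3 * n.bit_length() / p))             # ~ c log(n) / p rows
    x = master.getrandbits(r)                           # shared weak-source input
    rows = []
    for _ in range(t):
        rho = ['*' if master.random() < p else master.randint(0, 1) for _ in range(n)]
        stars = [j for j, v in enumerate(rho) if v == '*']
        fill = toy_ext(x, random.Random(master.getrandbits(64)), r, len(stars))
        for idx, j in enumerate(stars):                 # replace *'s with Ext bits
            rho[j] = (fill >> idx) & 1
        rows.append(rho)
    return [sum(row[j] for row in rows) % 2 for j in range(n)]  # XOR of rows

if __name__ == '__main__':
    print(''.join(map(str, shrinkage_prg(n=40, p=0.2, r=64))))
```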

  33. Correctness Proof • D = distribution of PRG output, U = uniform. • Suppose |Pr[f(D)=1] − Pr[f(U)=1]| > δ. • Let Z_i = Ext(X,Y_i). Hybrid argument. • Change Z_1,…,Z_i to U_1,…,U_i to get D_i. • D_t ≈ U: whp the *'s cover all columns. • Exists i: |Pr[f(D_i)=1] − Pr[f(D_{i-1})=1]| > δ/t. • Changing Z_i to U_i changes Pr[f accepts].

  34. Correctness Proof • Exists i: changing Z_i = Ext(X,Y_i) to U_i changes Pr[f accepts]. • Fix everything but ρ = ρ_i, Z_i, U_i. Let v = the i-th row. • Let f_i(v) = f(v + w), where w = XOR of all rows except the i-th. • Let g = f_i|ρ, so g(v|_A) = f_i(v), where A = the *'s of ρ. • f(D_i) = g(Z_i), f(D_{i-1}) = g(U_i). Goal: whp g(Z_i) ≈ g(U_i). • E = event that size(g) ≤ s = c·p^Γ·size(f_i). Pr[E] ≥ 1 − ε. • Conditioned on E, g is describable by b ≈ s·log s bits. • Whp, H∞(X | E, G=g) ≥ r − b − Δ. • Whp, conditioned on E and G=g, Ext(X,Y_i) ≈ U_i.
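
How shrinkage enters the analysis, in one line (restating the slide's accounting via the conditioning lemma of slide 19):

```latex
\[
  \text{On } E:\quad \mathrm{size}(g)\ \le\ s = c\,p^{\Gamma}\,\mathrm{size}(f_i)
  \;\Longrightarrow\; g \text{ is describable by } b \approx s\log s \text{ bits},
\]
so, except with probability $2^{-\Delta}$ over the description,
\[
  H_\infty(X \mid E,\ G=g)\ \ge\ r - b - \Delta ,
\]
and whenever $r - b - \Delta \ge k$ the extractor guarantee applies:
$\mathrm{Ext}(X,Y_i) \approx U_i$ given $E$ and $G=g$, hence $g(Z_i)\approx g(U_i)$.
```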

  35. Improving the PRG • To get nearly optimal output length for Γ > 1, replace the *'s with G_{k-wise}(Ext(X,Y_i)), i.e., stretch the extractor output with a k-wise independent generator.

  36. Pseudorandom Restrictions • Need pseudorandom restrictions that yield shrinkage. • BPs and formulas over an arbitrary basis: c·log n-wise independence suffices; deal with heavy variables separately. • Formulas over {∧,∨,NOT}, incl. read-once: more work. • Use Hastad and Hastad-Razborov-Yao as black boxes. • They only guarantee shrinkage in expectation for truly random restrictions.
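
A sketch of one standard way to sample a limited-independence restriction, via evaluations of a random degree-<k polynomial over a prime field. This is an assumed construction for illustration (the slide only asserts that c·log n-wise independence suffices); the star-probability is matched only up to ~1/q, and the heavy-variable preprocessing mentioned above is not shown.

```python
import random

def kwise_restriction(n, k, p, q=(1 << 61) - 1, seed=None):
    """Sample a restriction in {0, 1, '*'}^n whose coordinates are k-wise
    independent, each left unset ('*') with probability ~p (up to ~1/q error).

    Construction: evaluate a uniformly random polynomial of degree < k over the
    prime field F_q at the points 1..n; evaluations at distinct points are
    k-wise independent and uniform over F_q, and any per-coordinate mapping of
    them (here: to '*' / 0 / 1 by range) preserves k-wise independence."""
    rng = random.Random(seed)
    coeffs = [rng.randrange(q) for _ in range(k)]       # random degree < k polynomial

    def poly(z):
        acc = 0
        for c in coeffs:                                # Horner evaluation mod q
            acc = (acc * z + c) % q
        return acc

    star_cut = int(p * q)                               # ~p of the mass -> '*'
    half_cut = star_cut + (q - star_cut) // 2           # rest split evenly into 0 / 1
    rho = []
    for i in range(1, n + 1):
        v = poly(i)
        rho.append('*' if v < star_cut else (0 if v < half_cut else 1))
    return rho

if __name__ == '__main__':
    print(kwise_restriction(n=24, k=8, p=0.25, seed=1))
```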

  37. Proof Idea • Decompose the formula: O(n/k) subformulas of size ≤ k = n^{o(1)}. • Use k^2-wise independence. • Goal: p ≈ n^{-1/(Γ+1)}. Too small here. • Instead, shrink by q ≈ k^{-.1} and iterate.

  38. Unrestrictable Inputs • Many subformulas have inputs that must be left as *. • Does shrinkage for random restrictions imply shrinkage when some inputs must be *? • Further decomposition: each subformula has ≤ 2 such inputs. • h such inputs increase the size by a factor of ≤ 2^h. • For each setting of those variables we have a subformula; combine them with a selector formula.

  39. Read-Once Formulas • Need a different trick for read-once formulas. • A subformula g may be small, but it is unlikely to shrink to nothing.

  40. Dependencies • Read-once case: k-wise independence. • Read-t case: Consider independent sets in dependency graph on subformulas. • General case: tricky dependencies.

  41. Conclusions • New, extractor-based PRG based on shrinkage. • Without improving lower bounds, essentially the best possible PRGs for: • Formulas over {∨,∧,NOT}: s^{1/3+o(1)} seed length. • Formulas over arbitrary basis: s^{1/2+o(1)}. • Read-once formulas over {∨,∧,NOT}: s^{0.234…}. • Branching programs: s^{1/2+o(1)}.

  42. Open Questions • Better PRGs for unordered ROBPs? • Can we recurse somehow? • Subsequent work: Reingold-Steinke-Vadhan give an O(log^2 n) seed for unordered permutation ROBPs. • PRGs from other lower bound techniques? • Subsequent work: Trevisan-Xue on PRGs for AC^0. • Improve lower bounds? • Our PRG gives an alternate function f with formula-size(f) ≥ n^{3-o(1)}, matching Hastad/Andreev. • Subsequent: average-case lower bound of n^{3-o(1)} [Komargodski-Raz-Tal] (improving [Komargodski-Raz]).

  43. Thank you!
