Better Pseudorandom Generators from Milder Pseudorandom Restrictions Raghu Meka (IAS) Parikshit Gopalan, Omer Reingold (MSR-SVC) Luca Trevisan (Stanford), Salil Vadhan (Harvard)
Pseudorandom Generators • Stretch a few truly random seed bits into many bits that fool a class of “test functions” F
Can we generate random bits? • Complexity theory, algorithms, streaming • Strong positive evidence: hardness vs randomness – NW94, IW97, … • Unconditionally? Duh.
Can we generate random bits? • Restricted models: bounded-depth circuits (AC0): Nis91, Bazzi09, B10, …; bounded-space algorithms: Nis90, NZ93, INW94, …
PRGs for AC0 For polynomially small error best was even for read-once CNFs.
PRGs for Small-space For polynomially small error best was even for comb. rectangles.
This Work PRGs with polynomially small error
Why Small Error? • Because we “should” be able to • Symptomatic: const. error for large depth implies poly. error for smaller depth • Applications: algorithmic derandomizations, complexity lower bounds
This Work
1. PRG for comb. rectangles with seed .
2. PRG for read-once CNFs with seed .
3. HSG for width-3 branching programs with seed .
Generic new technique: iterative application of mild random restrictions.
Combinatorial Rectangles Applications: Number theory, analysis, integration, hardness amplification
PRGs for Comb. Rectangles Small set preserving volume Volume of rectangle ~ Fraction of positive PRG points
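In code, a combinatorial rectangle is the AND of one arbitrary predicate per disjoint block of coordinates, and its volume is its acceptance probability under a uniform input; a PRG for rectangles is a small set whose positive fraction matches that volume. A minimal brute-force sketch of the test class itself (function names and the toy predicates are my own):

```python
from itertools import product

def make_rectangle(block_preds):
    """A combinatorial rectangle: the AND of one arbitrary predicate
    per disjoint block of coordinates."""
    def f(blocks):
        return all(pred(b) for pred, b in zip(block_preds, blocks))
    return f

# Toy instance: m = 3 blocks of d = 2 bits; each predicate accepts
# an arbitrary subset of {0,1}^2.
preds = [
    lambda b: b != (0, 0),   # block 1: reject 00
    lambda b: b[0] == 1,     # block 2: first bit set
    lambda b: True,          # block 3: accept everything
]
rect = make_rectangle(preds)

# Volume = fraction of accepted points; computable exactly at toy sizes.
points = list(product(product([0, 1], repeat=2), repeat=3))
vol = sum(rect(p) for p in points) / len(points)
print(vol)  # (3/4) * (1/2) * 1 = 0.375
```

A PRG fools this class when the empirical volume over its output set is within the error bound of `vol` for every choice of predicates.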
PRGs for Combinatorial Rectangles Thm: PRG for comb. rectangles with seed .
Read-Once CNFs • Read-once: each variable appears at most once. • Thm: PRG for read-once CNFs with seed .
This Talk Thm: PRG for read-once CNFs with seed . Comb. Rectangles similar but different
Outline Main generator: mild (pseudo)random restrictions. Interlude: Small-bias spaces, Tribes Analysis: variance dampening, approximating symmetric functions. The “real” stuff happens here.
Random Restrictions • Switching lemma – Ajt83, FSS84, Has86 • (Illustration: a random restriction fixes some variables to 0/1 and leaves the rest free, marked *.)
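A random restriction in the switching-lemma sense leaves each variable free with some probability p and otherwise fixes it to a uniform bit; applying it to a CNF simplifies clauses. A sketch (the signed-literal representation of clauses is my own choice, not the talk's):

```python
import random

def random_restriction(n, p, rng=random):
    """Each variable stays free ('*') w.p. p, else is fixed to a uniform bit."""
    return ['*' if rng.random() < p else rng.randint(0, 1)
            for _ in range(n)]

def restrict_cnf(clauses, rho):
    """Apply restriction rho to a CNF. Clauses are lists of signed
    literals: +i means x_i, -i means NOT x_i (variables 1-indexed).
    Returns the simplified clause list, or True/False if decided."""
    new_clauses = []
    for clause in clauses:
        surviving = []
        satisfied = False
        for lit in clause:
            v = rho[abs(lit) - 1]
            if v == '*':
                surviving.append(lit)        # literal still undetermined
            elif (v == 1) == (lit > 0):      # literal made true
                satisfied = True
                break
        if satisfied:
            continue                         # clause disappears
        if not surviving:
            return False                     # clause falsified: CNF is false
        new_clauses.append(surviving)
    return True if not new_clauses else new_clauses

# (x1 OR NOT x2) AND (x2 OR x3), under rho fixing x2 = 0, leaving x1, x3 free
print(restrict_cnf([[1, -2], [2, 3]], ['*', 0, '*']))  # [[3]]
```

The switching lemma says that under such a restriction, a small-width CNF collapses to a shallow decision tree with high probability.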
PRGs from Random Restrictions • AW85: Use “pseudorandom restrictions”. • Problem: No strong derandomized switching lemmas.
Mild Pseudorandom Restrictions • Restrict half the bits (pseudorandomly). • (Illustration: half the positions fixed to bits, the other half left free as *.) • “Simplification”: Can be fooled by small-bias spaces.
Full Generator Construction • Repeat: pick half the remaining positions using an almost k-wise independent distribution, and fill them with bits from a small-bias space. • Thm: PRG for read-once CNFs with seed .
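The repeat loop of the construction can be sketched as follows. As a hedge: the almost k-wise chooser and the small-bias filler are replaced here by placeholder truly random samplers, which shows the control flow of the iterated restriction but none of the seed savings; all names are my own:

```python
import random

def iterated_restriction_prg(n, rounds, pick_half, fill_bits, rng):
    """Skeleton of the iterated-restriction generator: repeatedly choose
    half of the still-free positions (via an almost k-wise independent
    distribution in the real construction) and fill them with bits from
    a small-bias space; finally fill whatever positions remain."""
    x = [None] * n
    free = list(range(n))
    for _ in range(rounds):
        chosen = pick_half(free, rng)       # half of the free positions
        bits = fill_bits(len(chosen), rng)  # small-bias bits in reality
        for pos, b in zip(chosen, bits):
            x[pos] = b
        free = [i for i in free if x[i] is None]
    for i, b in zip(free, fill_bits(len(free), rng)):
        x[i] = b
    return x

# Placeholder samplers: truly random. The point of the construction is
# that both can be derandomized with short seeds.
def pick_half(free, rng):
    return rng.sample(free, len(free) // 2)

def fill_bits(k, rng):
    return [rng.randint(0, 1) for _ in range(k)]

out = iterated_restriction_prg(16, rounds=3, pick_half=pick_half,
                               fill_bits=fill_bits, rng=random.Random(0))
print(out)  # 16 bits, every position filled
```

Since the free set halves each round, logarithmically many rounds suffice, which is why the seed cost of the per-round samplers dominates the final seed length.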
Outline Main generator: mild (pseudo)-random restrictions. Interlude: Small-bias spaces, Tribes Analysis: variance dampening, approximating symmetric functions.
Toy example: Tribes Read-once CNF and a Comb. Rectangle
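Tribes in its CNF form is an AND of ORs over disjoint blocks, so a single function is simultaneously a read-once CNF and a combinatorial rectangle (each clause is a predicate on its own block). A small sketch, with toy parameters chosen only so the acceptance probability can be computed exactly:

```python
from itertools import product

def tribes_cnf(x, w):
    """Tribes in CNF form: the AND over disjoint width-w blocks of the
    OR of that block's bits. Read-once (each bit appears once) and a
    combinatorial rectangle (each clause reads only its own block)."""
    blocks = [x[i:i + w] for i in range(0, len(x), w)]
    return all(any(block) for block in blocks)

# With block width w ~ log n, Pr[Tribes = 1] is bounded away from 0 and 1.
w, s = 2, 3                  # 3 clauses of width 2, so n = 6
n = w * s
acc = sum(tribes_cnf(x, w) for x in product([0, 1], repeat=n))
print(acc / 2 ** n)          # (1 - 2^-w)^s = (3/4)^3 = 0.421875
```

This doubly-structured function is the natural toy case: any PRG for read-once CNFs or rectangles must in particular get Tribes' acceptance probability right.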
Small-bias Spaces • Fundamental objects in pseudorandomness • NN93, AGHP92: can sample with bits
Small-bias Spaces • PRG with seed • Tight: need bias
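A set S in {0,1}^n is epsilon-biased if every nonempty parity test is epsilon-close to unbiased over a uniform sample from S. A brute-force check of that definition, feasible only at toy sizes (the function name is my own):

```python
from itertools import product

def bias(S, n):
    """Max over nonempty parities T of |E_{x in S}[(-1)^{<x,T>}]|.
    S is epsilon-biased iff this value is at most epsilon."""
    worst = 0.0
    for T in product([0, 1], repeat=n):
        if not any(T):
            continue  # skip the empty parity
        avg = sum((-1) ** sum(a & b for a, b in zip(x, T)) for x in S) / len(S)
        worst = max(worst, abs(avg))
    return worst

# Sanity checks: the full cube has bias 0; a single point has bias 1.
cube = list(product([0, 1], repeat=3))
print(bias(cube, 3))          # 0.0
print(bias([(0, 0, 0)], 3))   # 1.0
```

The point of NN93/AGHP92 is that sets with tiny bias exist that are exponentially smaller than the cube, and are samplable with a short seed.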
Outline Main generator: mild (pseudo)-random restrictions. Interlude: Small-bias spaces, Tribes Analysis: variance dampening, approximating symmetric functions. The “real” stuff happens here.
Analysis Sketch • Hybrid argument: pick half the positions using almost k-wise independence, and compare filling the chosen half with small-bias bits vs. uniform bits. • Error per step is small; the formula size reduces each step.
Analysis for Tribes • First try: fix the uniform bits (averaging argument). Problem: the restricted function is still Tribes. • Instead: pick half using almost k-wise, choosing exactly half the variables from each clause. • Main idea: average over the uniform bits to study the “bias function”. • (Illustration: white = small-bias positions, yellow = uniform positions.)
Fooling Bias Functions • Fix a read-once CNF f. Want: • Define bias function: False if we fixed X!
Fooling Bias Functions • Let
Fooling Bias Functions (Without “dampening”) “Variance dampening”: makes things work.
Fooling Bias Functions • F’s fooled by small-bias • ’s decrease geometrically under uniform • No such decrease for small-bias • Conditional decrease: decrease conditioned on a high-probability event (cancellations happen) ( : ’th symmetric polynomial)
An Inequality for Symmetric Polynomials Lem: Comes from variance dampening. Proof uses Newton-Girard identities. Ex: If then
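The Newton-Girard identities invoked on this slide relate the elementary symmetric polynomials e_k to the power sums; the standard statement, for variables x_1, …, x_n:

```latex
% Newton-Girard identities: e_k is the k-th elementary symmetric
% polynomial in x_1,\dots,x_n (with e_0 = 1), and
% p_j = \sum_{i=1}^{n} x_i^j is the j-th power sum.
k\, e_k \;=\; \sum_{j=1}^{k} (-1)^{j-1}\, e_{k-j}\, p_j ,
\qquad 1 \le k \le n .
```

For example, k = 1 gives e_1 = p_1 and k = 2 gives 2e_2 = e_1 p_1 - p_2; these let one rewrite symmetric-polynomial bounds in terms of power sums, which is the standard route to inequalities of the kind stated in the lemma.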
Summary Main generator: mild (pseudo)-random restrictions. Small-bias spaces and Tribes Analysis: variance dampening, approximating sym. functions. Combinatorial rectangles similar but different PRG for RCNFs
Open Problems Q: Use techniques for other classes? Small-space?
Thank you • “The best throw of the die is to throw it away”