Randomness Extractors & their Many Guises
Salil Vadhan, Harvard University
Slides to be posted at http://eecs.harvard.edu/~salil
Original Motivation [SV84,Vaz85,VV85,CG85,Vaz87,CW89,Zuc90,Zuc91]
• Randomization is pervasive in CS
  • Algorithm design, cryptography, distributed computing, ...
• Typically we assume a perfect random source: unbiased, independent random bits.
  • Unrealistic?
• Can we use a "weak" random source?
  • A source of biased & correlated bits.
  • A more realistic model of physical sources.
• (Randomness) Extractors: convert a weak random source into an almost-perfect random source.
Applications of Extractors
• Derandomization of BPP [Sip88,GZ97,MV99,STV99]
• Derandomization of logspace algorithms [NZ93,INW94,RR99,GW02]
• Distributed & Network Algorithms [WZ95,Zuc97,RZ98,Ind02]
• Hardness of Approximation [Zuc93,Uma99,MU01]
• Cryptography [CDHKS00,MW00,Lu02]
• Data Structures [Ta02]
The Unifying Role of Extractors
Extractors can be viewed as types of:
• Hash functions
• Expander graphs
• Samplers
• Pseudorandom generators
• Error-correcting codes
They help unify the theory of pseudorandomness.
This Tutorial
• Is framed around connections between extractors & other objects. We'll use these to:
  • Gain intuition for the definition.
  • Describe a few applications.
  • Hint at the constructions.
• Many omissions. For further reading:
  • N. Nisan and A. Ta-Shma. Extracting randomness: a survey and new constructions. Journal of Computer & System Sciences, 58(1):148-173, 1999.
  • R. Shaltiel. Recent developments in explicit constructions of extractors. Bulletin of the EATCS, 77:67-95, June 2002.
  • S. Vadhan. Course notes for CS225: Pseudorandomness. http://eecs.harvard.edu/~salil
Outline
I. Motivation
II. Extractors as extractors
III. Extractors as hash functions
IV. Extractors as expander graphs
V. Extractors as pseudorandom generators
VI. Extractors as error-correcting codes
VII. Concluding remarks & open problems
Weak Random Sources
• What is a source of biased & correlated bits?
  • A probability distribution X on {0,1}^n.
  • Must contain some "randomness".
  • Want: no independence assumptions ⇒ we get only one sample from X.
• Measure of "randomness"?
  • Shannon entropy H(X) = Σ_x Pr[X=x]·log(1/Pr[X=x])? No good: a source can have Shannon entropy Ω(n) yet output a fixed string with probability 0.99.
  • Better [Zuckerman `90]: min-entropy.
Min-entropy
• Def: X is a k-source if H∞(X) ≥ k, i.e. Pr[X=x] ≤ 2^-k for all x, where H∞(X) = min_x log(1/Pr[X=x]).
• Examples:
  • Unpredictable source [SV84]: ∀ i ∈ [n] and b1, ..., bi-1 ∈ {0,1}, δ ≤ Pr[Xi = 1 | X1=b1, ..., Xi-1=bi-1] ≤ 1−δ.
  • Bit-fixing [CGH+85,BL85,LLS87,CW89]: some k coordinates of X are uniform, the rest are fixed (or even depend arbitrarily on the others).
  • Flat k-source: uniform over a set S ⊆ {0,1}^n with |S| = 2^k.
• Fact [CG85]: every k-source is a convex combination of flat k-sources.
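As a quick illustration (not from the talk), the Python snippet below computes both entropy measures for a source that outputs a fixed string half the time; the particular distribution is just an example, chosen to show why Shannon entropy overestimates the extractable randomness.

```python
# Shannon entropy vs. min-entropy of a distribution over {0,1}^n.
# Example source: outputs 0^n with prob. ~1/2, otherwise a uniform n-bit string.
import math

def shannon_entropy(probs):
    """H(X) = sum_x Pr[X=x] * log2(1/Pr[X=x])."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

def min_entropy(probs):
    """H_inf(X) = min_x log2(1/Pr[X=x]) = log2(1/max_x Pr[X=x])."""
    return math.log2(1 / max(probs))

n = 10
probs = [0.5 + 0.5 / 2**n] + [0.5 / 2**n] * (2**n - 1)   # heavy atom at 0^n, rest uniform
print(f"Shannon entropy ~ {shannon_entropy(probs):.2f} bits")   # about n/2 + 1
print(f"Min-entropy     ~ {min_entropy(probs):.2f} bits")       # about 1 bit
```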
Extractors: 1st attempt
• A function Ext : {0,1}^n → {0,1}^m such that for every k-source X, Ext(X) is "close" to uniform.
• Diagram: k-source of length n → EXT → m almost-uniform bits.
• Impossible! There exists a set of 2^(n-1) inputs x on which the first bit of Ext(x) is constant ⇒ a flat (n−1)-source X that is bad for Ext.
Extractors [Nisan & Zuckerman `93]
• Def: A (k,ε)-extractor is Ext : {0,1}^n × {0,1}^d → {0,1}^m such that for every k-source X, Ext(X, U_d) is ε-close to U_m.
• Diagram: k-source of length n, plus a "seed" of d random bits → EXT → m almost-uniform bits.
• Key point: the seed can be much shorter than the output.
• Goals: minimize seed length, maximize output length.
Definitional Details
• U_t = uniform distribution on {0,1}^t.
• Measure of closeness: statistical difference (a.k.a. variation distance) Δ(X,Y) = max_T | Pr[X ∈ T] − Pr[Y ∈ T] |.
  • T = "statistical test" or "distinguisher".
  • Δ is a metric, takes values in [0,1], and is very well-behaved.
• Def: X, Y are ε-close if Δ(X,Y) ≤ ε.
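A small Python sketch (illustrative, not from the talk) of the statistical difference, computed via the equivalent half-L1 formula Δ(X,Y) = ½·Σ_x |Pr[X=x] − Pr[Y=x]|; the two toy distributions are arbitrary.

```python
# Statistical difference (variation distance) between two distributions,
# given as dictionaries mapping outcomes to probabilities.
def stat_diff(X, Y):
    """Delta(X,Y) = (1/2) sum_x |Pr[X=x] - Pr[Y=x]| = max_T |Pr[X in T] - Pr[Y in T]|."""
    support = set(X) | set(Y)
    return 0.5 * sum(abs(X.get(x, 0.0) - Y.get(x, 0.0)) for x in support)

uniform = {"00": 0.25, "01": 0.25, "10": 0.25, "11": 0.25}
biased  = {"00": 0.28, "01": 0.24, "10": 0.24, "11": 0.24}
eps = 0.05
print(stat_diff(uniform, biased), stat_diff(uniform, biased) <= eps)  # eps-close?
```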
The Parameters
• The min-entropy k:
  • High min-entropy: k = n − a, a = o(n).
  • Constant entropy rate: k = Ω(n).
  • Middle (hardest) range: k = n^α, 0 < α < 1.
  • Low min-entropy: k = n^o(1).
• The error ε:
  • In this talk: ε = .01 (for simplicity).
  • Very small ε is sometimes important.
• The output length m:
  • Certainly m ≤ k + d.
  • Can this be achieved?
The Optimal Extractor
Thm [Sip88,RT97]: For every k ≤ n, there exists a (k,ε)-extractor with
• Seed length d = log(n−k) + O(1)
• Output length m = k + d − O(1)
"Extract almost all the min-entropy with a logarithmic-length seed" (≈ log n, except for high min-entropy).
Proof sketch: probabilistic method. Show that for a random Ext, Pr[Ext is not a (k,ε)-extractor] < 1.
• Use capital letters: N = 2^n, M = 2^m, K = 2^k, D = 2^d, ...
• For a fixed flat k-source X and test T ⊆ {0,1}^m, the probability (over a random Ext) that Ext(X,U_d) hits T with frequency off by more than ε is exponentially small (Chernoff).
• The number of choices of X and T is at most (N choose K)·2^M, so a union bound finishes the proof.
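For concreteness, here is a LaTeX sketch of the union-bound calculation behind the proof; the exact form of the Chernoff bound and the constants are my reconstruction of the standard argument, not transcribed from the slides.

```latex
% Probabilistic-method bound for a random Ext (sketch).
\[
  \Pr_{\mathrm{Ext}}\!\Big[\big|\Pr[\mathrm{Ext}(X,U_d)\in T]-\tfrac{|T|}{M}\big|>\varepsilon\Big]
  \;\le\; 2e^{-2\varepsilon^{2}KD}
  \qquad\text{(Chernoff, over the $KD$ i.i.d.\ uniform values $\mathrm{Ext}(x,y)$)},
\]
\[
  \binom{N}{K}\cdot 2^{M}\cdot 2e^{-2\varepsilon^{2}KD} \;<\; 1
  \quad\text{as soon as}\quad
  \varepsilon^{2}KD \;\ge\; c\big(K(n-k)+M\big),
\]
which holds with $d=\log(n-k)+O(1)$ and $m=k+d-O(1)$ for constant $\varepsilon$,
using $\log\binom{N}{K}\le K\log\tfrac{eN}{K}\approx K(n-k)$.
```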
The Optimal Extractor
• Thm: For every k ≤ n, there exists a (k,ε)-extractor with
  • Seed length d = log(n−k) + O(1)
  • Output length m = k + d − O(1)
• Thm [NZ93,RT97]: The above is tight up to additive constants.
• For applications, we need explicit extractors:
  • Ext(x,y) computable in time poly(n).
  • A random extractor requires space ≥ 2^n even to store!
• A long line of research has sought to approach the above bounds with explicit constructions.
Application: BPP with a Weak Source [Zuckerman `90,`91]
• Diagram: k-source → EXT (d-bit seed) → m almost-uniform bits → randomized algorithm on input x → accept/reject; the algorithm errs w.p. ≤ δ on truly uniform bits.
• Run the algorithm using all 2^d seeds & output the majority answer.
  • The majority answer errs with probability at most ≈ 2(δ + ε) over the source.
• Only polynomial slowdown, provided d = O(log n) and Ext is explicit.
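The seed-enumeration step might look like the following Python sketch; `algo` (a two-sided-error randomized algorithm) and `ext` (an explicit extractor) are placeholder names of mine, not objects specified in the talk.

```python
# Simulate a BPP algorithm on input x using one sample from a weak source:
# run it with Ext(sample, y) for every seed y and take the majority answer.
from itertools import product

def bpp_with_weak_source(algo, ext, x, weak_sample, d):
    """Majority vote of algo(x, Ext(weak_sample, y)) over all 2^d seeds y."""
    votes = 0
    for y in product([0, 1], repeat=d):      # all 2^d seeds; poly(n) if d = O(log n)
        votes += 1 if algo(x, ext(weak_sample, y)) else 0
    return votes > 2 ** (d - 1)              # accept iff a majority of runs accept
```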
Strong Extractors
• Output looks random even after seeing the seed.
• Def: Ext is a (k,ε) strong extractor if Ext′(x,y) = y ∘ Ext(x,y) is a (k,ε) extractor.
  • i.e. for every k-source X, for a 1−ε′ fraction of y ∈ {0,1}^d, Ext(X,y) is ε′-close to U_m.
• Optimal: d = log(n−k) + O(1), m = k − O(1).
• Strongness can be obtained explicitly at little cost [RSW00].
Extractors as Hash Functions
• Diagram: a family of hash functions h_y from {0,1}^n to {0,1}^m, applied to a flat k-source, i.e. a set of size 2^k ≫ 2^m.
• For most y, h_y maps sets of size K almost uniformly onto the range.
Extractors from Hash Functions
• Leftover Hash Lemma [ILL89]: universal (i.e. pairwise independent) hash functions yield strong extractors.
  • Output length: m = k − O(1).
  • Seed length: d = O(n).
  • Example: Ext(x,(a,b)) = first m bits of a·x + b in GF(2^n).
• Almost pairwise independence [SZ94,GW94]:
  • Seed length: d = O(log n + k).
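To make the GF(2^n) example concrete, here is a minimal Python sketch; fixing n = 8 and this particular irreducible polynomial is my choice purely for illustration, and the Leftover Hash Lemma guarantee of course requires the seed (a, b) to be uniformly random.

```python
# Toy instantiation of Ext(x, (a, b)) = first m bits of a*x + b over GF(2^n),
# with n = 8 and reduction modulo the irreducible polynomial x^8+x^4+x^3+x+1.
N_BITS = 8
IRRED = 0x11B   # x^8 + x^4 + x^3 + x + 1

def gf_mult(a, b):
    """Carry-less product of a and b, reduced modulo IRRED (i.e. in GF(2^8))."""
    prod = 0
    for i in range(N_BITS):
        if (b >> i) & 1:
            prod ^= a << i
    for i in range(2 * N_BITS - 2, N_BITS - 1, -1):   # reduce the high bits
        if (prod >> i) & 1:
            prod ^= IRRED << (i - N_BITS)
    return prod

def ext(x, seed, m):
    """Pairwise-independent hash extractor: top m bits of a*x + b in GF(2^n)."""
    a, b = seed
    return (gf_mult(a, x) ^ b) >> (N_BITS - m)

print(ext(0b10110101, (0x57, 0x83), 3))   # 3 output bits for this input/seed
```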
Composing Extractors
• We have some nontrivial basic extractors.
• Idea: compose them to get better extractors.
• This was the original approach of [NZ93] & is still in use.
Increasing the Output [WZ93]
• Diagram: the same k-source is fed to EXT1 (d1-bit seed, m1-bit output) and to EXT2 (d2-bit seed, m2-bit output).
• Intuition: if m1 ≪ k, the source still has a lot of randomness left after applying EXT1.
Increasing the Output Length [WZ93]
• Prop: If Ext1 is a (k,ε)-extractor & Ext2 is a (k−m1−O(1),ε)-extractor, then Ext(x,(y1,y2)) = Ext1(x,y1) ∘ Ext2(x,y2) is a (k,3ε)-extractor.
• Key Lemma: Let (X,Z) be (correlated) random variables, where X is a k-source and Z has length s bits. Then w.p. ≥ 1−ε over z ← Z, X|Z=z is a (k − s − log(1/ε))-source.
• Compare with Shannon entropy: H(X|Z) ≥ H(X) − s, with no ε or log(1/ε) loss (but only on average).
Proof of Key Lemma
• Key Lemma (restated): if X is a k-source and Z has length s bits, then w.p. ≥ 1−ε over z ← Z, X|Z=z is a (k − s − log(1/ε))-source.
• Proof: Let BAD = { z : Pr[Z=z] ≤ ε·2^-s }. Then:
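The slide's calculation breaks off here; the following LaTeX sketch reconstructs the standard completion of the argument (my wording, not transcribed from the talk).

```latex
\begin{align*}
\textbf{Step 1: }&
  \Pr[Z \in \mathrm{BAD}]
  \;\le\; \sum_{z \in \mathrm{BAD}} \Pr[Z=z]
  \;\le\; 2^{s}\cdot \varepsilon\,2^{-s} \;=\; \varepsilon. \\[4pt]
\textbf{Step 2: }&
  \text{for } z \notin \mathrm{BAD} \text{ and any } x:\quad
  \Pr[X=x \mid Z=z]
  \;=\; \frac{\Pr[X=x \wedge Z=z]}{\Pr[Z=z]}
  \;\le\; \frac{2^{-k}}{\varepsilon\,2^{-s}}
  \;=\; 2^{-(k-s-\log(1/\varepsilon))}. \\[4pt]
&\text{Hence, w.p.\ } \ge 1-\varepsilon \text{ over } z \leftarrow Z,\;
  X|_{Z=z} \text{ is a } (k-s-\log(1/\varepsilon))\text{-source.} \qquad\blacksquare
\end{align*}
```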
Increasing the Output [WZ93]
• Diagram: k-source X → EXT1 (d1-bit seed) → m1-bit output Z1; X → EXT2 (d2-bit seed) → m2-bit output Z2.
• Proof of Prop:
  • Z1 is ε-close to uniform (because Ext1 is an extractor).
  • W.p. ≥ 1−ε over z ← Z1, X|Z1=z is a (k − m1 − O(1))-source (by the Key Lemma).
  • For such z, Z2|Z1=z is ε-close to uniform (because Ext2 is an extractor).
  • ⇒ (Z1,Z2) is 3ε-close to uniform.
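In code, the composition in the proposition is just seed-splitting and output concatenation; the sketch below is illustrative only, with `ext1` and `ext2` standing in for extractors with the stated parameters (the names are mine, not from the talk).

```python
# Wiring of the [WZ93] composition: Ext(x, (y1, y2)) = Ext1(x, y1) || Ext2(x, y2).
# ext1 and ext2 are assumed to be extractor functions on bit strings, with
# ext2 an extractor for min-entropy k - m1 - O(1), as in the proposition.
def wz_compose(ext1, ext2):
    """Return the composed extractor; its seed is the pair (y1, y2)."""
    def ext(x, seed):
        y1, y2 = seed
        return ext1(x, y1) + ext2(x, y2)   # concatenate both outputs
    return ext
```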
An Application [NZ93]: Pseudorandom Bits vs. Small Space
• Diagram: a length-n string of truly random bits is read by a small-space (space-s) observer; EXT is then applied to the string with a fresh seed.
• Start with a source of truly random bits.
• Conditioned on the observer's state, we still have a (k − s − O(1))-source w.h.p. (by the Key Lemma).
• ⇒ The extractor's output looks uniform to the observer.
• Applications:
  • Derandomizing logspace algorithms [NZ93].
  • Cryptography in the bounded-storage model [Lu02].
Shortening the Seed
• Idea: use the output of one extractor as the seed to another.
• Diagram: k-source → EXT2 (d2-bit seed), whose output is used as the d1-bit seed of EXT1 on the same source, giving an m1-bit output.
• EXT2 may have a shorter seed (due to its shorter output).
• Problem: Ext1 is only guaranteed to work when its seed is independent of the source.
Block Sources [CG85]
• Q: When does this work?
• Diagram: X2 → EXT2 (d2-bit seed), whose output seeds EXT1 on X1, giving an m1-bit output.
• Def: (X1,X2) is a (k1,k2) block source if
  • X1 is a k1-source, and
  • for every x1, X2|X1=x1 is a k2-source.
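In the same spirit as the previous sketch, the block-source composition can be written as follows; `ext1` and `ext2` are again hypothetical extractor arguments with the appropriate parameters.

```python
# Wiring of block-source extraction: Ext(x1, x2, y) = Ext1(x1, Ext2(x2, y)).
def block_extract(ext1, ext2):
    """Compose two extractors for a (k1, k2) block source (x1, x2)."""
    def ext(x1, x2, seed):
        inner_seed = ext2(x2, seed)   # close to uniform even conditioned on x1
        return ext1(x1, inner_seed)   # so it acts as a fresh seed for block x1
    return ext
```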
The [NZ93] Paradigm
An approach to constructing extractors:
1. Given a general source X.
2. Convert it to a block source (X1,X2):
  • can use part of the seed for this;
  • may want many blocks (X1, X2, X3, ...).
3. Apply block-source extraction (using known extractors, e.g. almost pairwise independence).
• Still useful today: it "improves" extractors, e.g. [RSW00].
• How to do Step 2?
  • Get a block by randomly sampling bits from the source...
  • Harder as the min-entropy gets lower.
Outline
I. Motivation
II. Extractors as extractors
III. Extractors as hash functions
IV. Extractors as expander graphs
V. Extractors as pseudorandom generators
VI. Extractors as error-correcting codes
VII. Concluding remarks & open problems
Expander Graphs
• Diagram: a vertex set S and its neighborhood Neighbors(S).
• Informally: sparse graphs with very strong connectivity.
• Measures of expansion:
  • Vertex expansion: every subset S of size ≤ αn has at least β·|S| neighbors, for constants α > 0, β > 1.
  • Eigenvalues: the 2nd largest eigenvalue of the random walk on G is at most λ, for a constant λ < 1.
  • (The two are equivalent for constant-degree graphs [Tan84,AM85,Alo86].)
• Goals:
  • Minimize the degree.
  • Maximize the expansion.
• Random graphs of degree 3 are expanders [Pin73], but explicit constructions of constant-degree expanders are much harder [Mar73,...,LPS86,Mar88].
Extractors & Expansion [NZ93]
• Diagram: a bipartite graph with left side [N] ≡ {0,1}^n (the n-bit k-source), right side [M] ≡ {0,1}^m (the m almost-uniform bits), and left-degree D = 2^d.
• Connect x ∈ {0,1}^n and y ∈ {0,1}^m if Ext(x,r) = y for some r ∈ {0,1}^d.
• Short seed ⇒ low degree.
• Extraction ⇒ expansion: every set of size ≥ K = 2^k has at least (1−ε)M neighbors.
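As a sanity check of the bipartite-graph view, here is a small Python sketch of the construction; the toy function used as `Ext` is NOT a real extractor and the parameters are arbitrary, the point is only to show how neighborhoods in this graph are formed.

```python
# Build the neighborhood of a left set S in the bipartite graph where
# x in {0,1}^n is connected to Ext(x, r) for every seed r in {0,1}^d.
from itertools import product

def neighbors(ext, S, d):
    """All right-hand vertices reachable from the left set S."""
    return {ext(x, r) for x in S for r in product([0, 1], repeat=d)}

n, d, m = 6, 2, 3
toy_ext = lambda x, r: tuple(x[i] ^ r[i % d] for i in range(m))   # toy only

S = set(list(product([0, 1], repeat=n))[:16])   # a flat 4-source: |S| = K = 16
covered = neighbors(toy_ext, S, d)
print(f"{len(covered)} of {2**m} right-hand vertices covered")
```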
Extractors vs. Expander Graphs
Main differences:
• Extractors are unbalanced, bipartite graphs.
• Different expansion measures (extraction vs. eigenvalue).
  • Extractors ⇒ graphs which "beat the eigenvalue bound" [NZ93,WZ95].
• Extractors have polylog degree; expanders have constant degree.
• Extractors: expansion for sets of a fixed size. Expanders: expansion for all sets up to some size.
Expansion Measures: Extraction
• Diagram: the same bipartite-graph view ([N] ≡ {0,1}^n on the left, [M] ≡ {0,1}^m on the right, left-degree D; sets of size K map onto ≥ (1−ε)M of the right side).
• Extractors: start with min-entropy k, end ε-close to min-entropy m ⇒ measures how much min-entropy increases (or is not lost).
• Eigenvalue: similar, but for "2-entropy" (and without the ε-closeness).
Expansion Measures: The Eigenvalue
• Let G be a D-regular, N-vertex graph and A the transition matrix of the random walk on G, i.e. A = (adjacency matrix)/D.
• Fact: A has 2nd largest eigenvalue ≤ λ iff for every probability distribution X, ||AX − U_N||_2 ≤ λ·||X − U_N||_2.
• Fact: ||X − U_N||_2^2 = CP(X) − 1/N, where CP(X) = Σ_x Pr[X=x]^2 = 2^(−H_2(X)) is the collision probability.
• ⇒ The eigenvalue measures how much a random step increases the 2-entropy.
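A small numerical check of the first fact, using numpy and an example graph of my choosing (not from the talk): for the complete graph K_4, the random-walk matrix contracts distance to the uniform distribution by λ = 1/3.

```python
# Verify ||A x - u||_2 <= lambda * ||x - u||_2 for the random walk on K_4.
import numpy as np

adj = np.ones((4, 4)) - np.eye(4)      # adjacency matrix of K_4 (3-regular)
A = adj / 3.0                          # transition matrix of the random walk
N = A.shape[0]
u = np.ones(N) / N                     # the uniform distribution U_N

eigs = np.sort(np.abs(np.linalg.eigvalsh(A)))[::-1]
lam = eigs[1]                          # second-largest eigenvalue in absolute value

x = np.array([0.7, 0.1, 0.1, 0.1])     # an arbitrary probability distribution
lhs = np.linalg.norm(A @ x - u)
rhs = lam * np.linalg.norm(x - u)
print(f"lambda = {lam:.3f}: ||Ax - u||_2 = {lhs:.3f} <= {rhs:.3f}")
```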
The Degree
• Constant-degree expanders are viewed as "difficult"; extractors typically have nonconstant degree and are "elementary".
  • Optimal: d ≈ log(n−k) truly random bits.
  • Typically: k = δn or k = n^α ⇒ d = Θ(log n).
  • Lower min-entropies are viewed as the hardest.
• Contradictory views?
  • The easiest case for extractors is the highest min-entropy: k = n − O(1) ⇒ d = O(1) ⇒ constant degree.
• Resolved in [RVW01]: high min-entropy extractors & constant-degree expanders from the same, simple "zig-zag product" construction.
High Min-Entropy Extractors [GW94]
• Diagram: a length-n (n−a)-source split into two blocks of lengths n1 and n2; EXT2 (d2-bit seed) extracts from the second block, and its output seeds EXT1 on the first block, giving an m1-bit output.
• Observe: if we break the source into two blocks, we get (close to) an (n1−a, n2−a−O(1))-block source! (by the Key Lemma)
• So do block-source extraction!
Zig-Zag Product [RVW00]
• Diagram: a length-n source of min-entropy n−a, split into blocks of lengths n1 and n2; EXT1, EXT2, EXT3 (with seeds d1, d2, d3) are composed, keeping small a-bit "buffers" alongside the m1-bit output ("a zig-zag product").
• Problem: block-source extraction as above loses ≈ a bits of min-entropy.
• Solution:
  • Collect "buffers" which retain unextracted min-entropy [RR99].
  • Extract from the buffers at the end.
Randomness Conductors [CRVW02]
• Diagram: n-bit input, d-bit seed → CON → m-bit output.
• Six parameters: n, m, d, k, ε, Δ.
• For every k′ ≤ k and every input k′-source, the output is ε-close to a (k′+Δ)-source.
• m = k+Δ: an extractor, with a guarantee also for smaller sets.
• Δ = d: "lossless expander" [TUZ01].
  • Equivalent to graphs with vertex expansion (1−ε)·degree!
• Explicitly: the very unbalanced case with polylog degree [TUZ01]; the nearly balanced case with constant degree [CRVW02].
Pseudorandom Generators [BM82,Y82,NW88]
• Generate many bits that "look random" from a short random seed.
• Diagram: d-bit seed → PRG → m bits indistinguishable from uniform.
• Distributions X, Y are computationally indistinguishable if for all efficient T (circuits of size m), | Pr[T(X)=1] − Pr[T(Y)=1] | is small.
Hardness vs. Randomness
• Any function of high circuit complexity ⇒ PRGs [BM82,Y82,NW88,BFNW93,IW97,...].
• Thm [IW97]: If E = DTIME(2^O(ℓ)) requires circuits of size 2^Ω(ℓ), then P = BPP.
• Current state-of-the-art [SU01,Uma02]: f : {0,1}^ℓ → {0,1} of circuit complexity ≥ k ⇒ PRG_f : {0,1}^O(ℓ) → {0,1}^m with m = k^Ω(1).
Extractors & PRGs Thm [Trevisan `99]: Any “sufficiently general” construction of PRGs from hard functions is also an extractor.
Extractors & PRGs
• Diagram comparison:
  • PRG_f: d-bit seed (together with a hard function f) → output computationally indistinguishable from U_m.
  • EXT: n bits with min-entropy k, plus a d-bit seed → output statistically close to U_m.