Condensers, Expander Graphs, Universal Hash Functions, Randomness Extractors, ... Randomness Conductors
Randomness Conductors: Meta-Definition. A bipartite graph (N left vertices, M right vertices, left degree D), with a prob. dist. X on the left inducing a prob. dist. X’ on the right, is an R-conductor if for every (k, k’) ∈ R: X has k bits of “entropy” ⇒ X’ has k’ bits of “entropy”.
Plan. Definitions & Applications: • The balanced case (M = N): vertex expansion; 2nd-eigenvalue expansion. • The unbalanced case (M ≪ N): extractors, dispersers, condensers; conductors; universal hash functions. Constructions: • Zigzag product & lossless expanders.
(Bipartite) Expander Graphs. N left vertices, N right vertices, left degree D. ∀S, |S| ≤ K: |Γ(S)| ≥ A·|S| (A > 1). Important: every (not too large) set expands.
(Bipartite) Expander Graphs. ∀S, |S| ≤ K: |Γ(S)| ≥ A·|S| (A > 1). • Main goal: minimize D (i.e., constant D). • Degree-3 random graphs are expanders! [Pin73]
(Bipartite) Expander Graphs. ∀S, |S| ≤ K: |Γ(S)| ≥ A·|S| (A > 1). Also: maximize A. • Trivial upper bound: A ≤ D. • Even A ≲ D − 1. • Random graphs: A ≈ D − 1.
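The vertex-expansion condition above can be verified by brute force on small graphs. A minimal sketch, assuming toy parameters (N = 12, D = 3, K = 3) chosen only for illustration:

```python
import itertools
import random

def random_bipartite(n, d, rng):
    """Each left vertex picks d distinct random right neighbors
    (a random left-d-regular bipartite graph)."""
    return [rng.sample(range(n), d) for _ in range(n)]

def expansion(graph, subset):
    """|Gamma(S)| / |S| for a set S of left vertices."""
    nbrs = set()
    for v in subset:
        nbrs.update(graph[v])
    return len(nbrs) / len(subset)

rng = random.Random(0)
n, d, k = 12, 3, 3          # toy values, not from the talk
g = random_bipartite(n, d, rng)
# worst-case expansion A over all sets of size <= k
worst = min(expansion(g, s)
            for size in range(1, k + 1)
            for s in itertools.combinations(range(n), size))
print(f"worst expansion over sets of size <= {k}: {worst:.2f}")
```

The brute-force minimum is feasible only for tiny N; it is meant to make the quantifier "∀S, |S| ≤ K" concrete, not to certify real expanders.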
Applications of Expanders These “innocent” objects are intimately related to various fundamental problems: • Network design (fault tolerance), • Sorting networks, • Complexity and proof theory, • Derandomization, • Error correcting codes, • Cryptography, • Ramsey theory • And more ...
Non-blocking Network with On-line Path Selection [ALM]. N (Inputs), N (Outputs). Depth O(log N), size O(N log N), bounded degree. Allows connections between input nodes and output nodes via vertex-disjoint paths.
Non-blocking Network with On-line Path Selection [ALM]. • Every request for connection (or disconnection) is satisfied in O(log N) bit-steps. • On-line: handles many requests in parallel.
The Network. N (Inputs), N (Outputs); each layer is a “lossless” expander.
Slightly Unbalanced, “Lossless” Expanders. N left vertices, M = εN right vertices, left degree D. ∀S, |S| ≤ K: |Γ(S)| ≥ 0.9·D·|S|. 0 < ε ≤ 1 is an arbitrary constant; D is constant and K = Ω(M/D) = Ω(εN/D). [CRVW 02]: such expanders exist (with D = polylog(1/ε)).
Property 1: A Very Strong Unique-Neighbor Property. ∀S, |S| ≤ K: |Γ(S)| ≥ 0.9·D·|S| ⇒ S has ≥ 0.8·D·|S| unique neighbors! (Counting: with u unique and r non-unique neighbors, u + r ≥ 0.9·D·|S| while u + 2r ≤ D·|S|, so r ≤ 0.1·D·|S| and u ≥ 0.8·D·|S|.)
Using Unique Neighbors for Distributed Routing. Task: match S to its neighbors (|S| ≤ K). Step I: match each vertex of S to one of its unique neighbors. Continue recursively with the set S’ of unmatched vertices.
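The two-step idea above (match to unique neighbors, recurse on the leftovers) can be sketched greedily; the loop structure and the tiny toy graph below are illustrative assumptions, not the paper's algorithm:

```python
def unique_neighbor_match(graph, S):
    """Match each vertex of S to a neighbor: repeatedly match vertices
    that have a neighbor no other unmatched vertex touches, then recurse
    (here: iterate) on the unmatched rest."""
    matching = {}
    unmatched = set(S)
    while unmatched:
        used = set(matching.values())
        # how many unmatched left vertices touch each still-free right vertex
        count = {}
        for v in unmatched:
            for w in graph[v]:
                if w not in used:
                    count[w] = count.get(w, 0) + 1
        progress = False
        for v in list(unmatched):
            unique = [w for w in graph[v] if count.get(w) == 1]
            if unique:
                matching[v] = unique[0]   # claim a unique neighbor
                unmatched.remove(v)
                progress = True
        if not progress:                  # cannot happen on a lossless expander
            break
    return matching

g = {0: [0, 1], 1: [1, 2], 2: [2, 3]}    # toy bipartite adjacency lists
m = unique_neighbor_match(g, {0, 1, 2})
```

On a lossless expander, a constant fraction of S is matched per round, so the recursion terminates in O(log |S|) rounds; the toy graph just shows the mechanics.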
Reminder: The Network. Adding new paths: think of the vertices used by previous paths as faulty.
Property 2: Incredibly Fault Tolerant. ∀S, |S| ≤ K: |Γ(S)| ≥ 0.9·D·|S|. Remains a lossless expander even if an adversary removes 0.7·D of the edges at each vertex.
Simple Expander Codes [G63, Z71, ZP76, T81, SS96]. N (Variables), M = εN (Parity Checks). Linear code; rate = 1 − M/N = 1 − ε. Minimum distance ≥ K; relative distance ≥ K/N = Ω(ε/D) = ε/polylog(1/ε). For small ε this beats the Zyablov bound and is quite close to the Gilbert–Varshamov bound of ε/log(1/ε).
Simple Decoding Algorithm in Linear Time (& log n parallel phases) [SS 96]. N (Variables), M = εN (Constraints). • Algorithm: at each phase, flip every variable that “sees” a majority of 1’s (i.e., unsatisfied constraints). • Analysis: for an error set B with |B| ≤ K/2, |Γ(B)| > 0.9·D·|B| and |Γ(B) ∩ Sat| < 0.2·D·|B| give |Flip \ B| ≤ |B|/4 and |B \ Flip| ≤ |B|/4, hence |B_new| ≤ |B|/2.
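The flip-a-majority rule can be written generically. A sketch in the spirit of [SS 96]; the check layout, round cap, and the toy repetition-code example are assumptions for illustration, not the paper's construction:

```python
def flip_decode(checks, word, max_rounds=20):
    """Bit-flip decoding: each round, flip every variable for which a
    strict majority of its parity checks is unsatisfied.
    `checks` lists, per parity check, the variable indices it constrains."""
    w = list(word)
    checks_of = {}                         # variable -> checks touching it
    for c, row in enumerate(checks):
        for v in row:
            checks_of.setdefault(v, []).append(c)
    for _ in range(max_rounds):
        unsat = {c for c, row in enumerate(checks)
                 if sum(w[v] for v in row) % 2 == 1}
        if not unsat:
            break                          # reached a codeword
        flip = [v for v, cs in checks_of.items()
                if 2 * sum(c in unsat for c in cs) > len(cs)]
        if not flip:
            break                          # stuck: graph not expanding enough
        for v in flip:
            w[v] ^= 1
    return w

# toy example: 3-bit repetition code (checks x0=x1, x1=x2), one bit flipped
H = [[0, 1], [1, 2]]
decoded = flip_decode(H, [1, 0, 0])
```

The expansion argument on the slide is what guarantees |B_new| ≤ |B|/2 per phase, i.e., O(log n) phases; on a non-expanding check graph the loop may stall, which the `if not flip` guard makes explicit.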
Random Walk on Expanders [AKS 87]. x_0, x_1, x_2, ...: x_i converges to uniform fast (for arbitrary x_0). For a random x_0: the sequence x_0, x_1, x_2, ... has interesting “random-like” properties.
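The point of such walks is randomness efficiency: each step costs only log2(D) fresh random bits instead of log2(N) for an independent sample. A toy sketch, with K4 as the graph and the seed chosen arbitrarily:

```python
import random

def random_walk(neighbors, x0, steps, rng):
    """Take `steps` steps of the natural random walk on a D-regular
    graph given as an adjacency list; each step consumes only
    log2(D) random bits' worth of choice."""
    x = x0
    path = [x]
    for _ in range(steps):
        x = rng.choice(neighbors[x])
        path.append(x)
    return path

# toy example: the 3-regular complete graph K4
K4 = {v: [w for w in range(4) if w != v] for v in range(4)}
rng = random.Random(1)
path = random_walk(K4, 0, 8, rng)
print(path)
```

On a genuine expander family the distribution of x_i approaches uniform at a rate governed by the second eigenvalue, which is what the next slide quantifies.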
Expanders Add Entropy. A prob. dist. X on the N left vertices induces a dist. X’ on the M right vertices (left degree D). • Definition we gave: |Support(X’)| ≥ A·|Support(X)|. • Applications of the random walk rely on “less naïve” measures of entropy. • Almost all explicit constructions directly give “2nd-eigenvalue expansion”. • Can be interpreted in terms of Rényi entropy.
2nd-Eigenvalue Expansion. G: undirected D-regular graph on N vertices. • P = (P_{i,j}): symmetric transition-probability matrix, P_{i,j} = (# edges between i and j in G) / D. • Goal: if π ∈ [0,1]^N is a (non-uniform) distribution on the vertices of G, then πP is “closer to uniform”.
2nd-Eigenvalue Expansion. • λ_0 ≥ λ_1 ≥ ... ≥ λ_{N−1}: eigenvalues of P. • λ_0 = 1, with corresponding eigenvector v_0 = the uniform distribution: P(Uniform) = Uniform. • Second eigenvalue (in absolute value): λ = λ(G) = max{|λ_1|, |λ_{N−1}|}. • G connected and non-bipartite ⇔ λ < 1. • λ is a good measure of the expansion of G [Tan84, AM84, Alo86]. Qualitatively: G is an expander ⇔ λ(G) < β < 1.
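λ(G) can be estimated numerically by power iteration restricted to the subspace orthogonal to the uniform eigenvector v_0. A pure-Python sketch; K4 is used as the test graph because its λ is exactly 1/3 (eigenvalues 1, −1/3, −1/3, −1/3):

```python
def second_eigenvalue(P, iters=100):
    """Estimate lambda(G) = max(|lambda_1|, |lambda_{N-1}|) for a
    symmetric transition matrix P: power-iterate while projecting out
    the all-ones (uniform) eigenvector of eigenvalue 1."""
    n = len(P)
    v = [1.0] + [0.0] * (n - 1)              # arbitrary start vector
    for _ in range(iters):
        mean = sum(v) / n
        v = [x - mean for x in v]            # project out the uniform direction
        norm = sum(x * x for x in v) ** 0.5
        if norm == 0:
            return 0.0
        v = [x / norm for x in v]
        v = [sum(P[i][j] * v[j] for j in range(n)) for i in range(n)]
    return sum(x * x for x in v) ** 0.5      # ||Pv|| for unit v -> |lambda|

# transition matrix of K4: P[i][j] = 1/3 off the diagonal
K4 = [[0.0 if i == j else 1 / 3 for j in range(4)] for i in range(4)]
lam = second_eigenvalue(K4)
print(lam)  # ≈ 1/3
```

Measuring ||Pv|| (rather than a Rayleigh quotient) keeps the estimate correct even when the extreme eigenvalue is negative, as for λ_{N−1} here.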
Randomness Conductors. Expanders, extractors, condensers & hash functions are all functions f : [N] × [D] → [M] that transform: X “of entropy” k ⇒ X’ = f(X, Uniform) “of entropy” k’. • Many flavors: measure of entropy; balanced vs. unbalanced; lossless vs. lossy; lower vs. upper bound on k; is X’ close to uniform? ... • Randomness conductors: as in extractors, but allowing the entire spectrum.
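As a concrete instance of the hash-function flavor, the classic Carter–Wegman universal family h_{a,b}(x) = ((a·x + b) mod p) mod m; the prime p = 101, range m = 8, inputs, and trial count below are toy choices for illustration:

```python
import random

P = 101  # a prime at least the universe size (toy assumption)

def carter_wegman(a, b, m):
    """One member of the universal family h_{a,b}(x) = ((a*x + b) mod p) mod m."""
    return lambda x: ((a * x + b) % P) % m

# sanity check: over a random choice of (a, b), a fixed pair x != y
# collides with probability roughly 1/m
rng = random.Random(0)
m, trials, collisions = 8, 2000, 0
for _ in range(trials):
    h = carter_wegman(rng.randrange(1, P), rng.randrange(P), m)
    if h(3) == h(77):
        collisions += 1
rate = collisions / trials
print(rate)  # ≈ 1/m
```

In conductor language: the seed (a, b) plays the role of the [D] input, and universality is a guarantee on the entropy of X’ = h(X) for any X of sufficient min-entropy.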