The Power of Randomness in Computation
呂及人 (Chi-Jen Lu), 中研院資訊所 (Institute of Information Science, Academia Sinica)
PART I: Randomization
Random Sampling • Population: 20 million, voting yellow or red • Random sample: 3,000 • Polling: with probability > 99%, the % in the population = the % in the sample ± 5%, independent of the population size
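A minimal simulation of this polling bound (the 47% "true" fraction below is synthetic; the sample size and error bar follow the slide):

```python
import random

# Simulate polling: estimate the fraction of "yellow" voters from a random
# sample of 3,000. Drawing from a population of 20 million (essentially
# sampling with replacement) is modeled by independent draws.
true_fraction = 0.47                      # unknown in practice; synthetic here
sample = [1 if random.random() < true_fraction else 0 for _ in range(3000)]
estimate = sum(sample) / len(sample)

# By a Chernoff/Hoeffding bound, |estimate - true_fraction| < 0.05 with
# probability far above 99%, independent of the population size.
print(f"estimate = {estimate:.3f}, true fraction = {true_fraction}")
```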
Lesson • A small set of random samples gives a good picture of the whole population. • Allows sub-linear-time algorithms! • More applications: volume estimation, clustering, machine learning, ...
Problem • Alice holds x ∈ {0,1}^n, Bob holds y ∈ {0,1}^n • Goal: decide whether x = y • Measure: communication complexity
First Attempt • Alice picks a random index i ∈ {1,...,n} and sends (i, x_i) • Bob checks whether x_i = y_i • Works only when the Hamming distance Δ(x, y) is large
Solution • C: an error-correcting code {0,1}^n → {0,1}^m with relative distance close to 1 • Alice encodes x → C(x), Bob encodes y → C(y) • Alice picks a random i ∈ {1,...,m} and sends (i, C(x)_i); Bob checks whether C(x)_i = C(y)_i • x = y: Prob_i[C(x)_i ≠ C(y)_i] = 0 • x ≠ y: Prob_i[C(x)_i = C(y)_i] ≈ 0 • Can repeat several times to reduce the error
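One standard way to instantiate this idea (not necessarily the code used in the talk) is a Reed-Solomon-style polynomial fingerprint over a large prime field; a minimal sketch:

```python
import random

P = (1 << 61) - 1   # a large prime; the "alphabet" of the code

def codeword_symbol(bits, i):
    """The i-th symbol of the encoding of x: the polynomial
    sum_j x_j * i^j evaluated mod P."""
    acc, power = 0, 1
    for b in bits:
        acc = (acc + b * power) % P
        power = (power * i) % P
    return acc

def probably_equal(x_bits, y_bits, repetitions=3):
    """Alice sends (i, C(x)_i) for random positions i; Bob compares with
    C(y)_i. Distinct n-bit strings slip through a single round with
    probability at most n/P."""
    for _ in range(repetitions):
        i = random.randrange(1, P)
        if codeword_symbol(x_bits, i) != codeword_symbol(y_bits, i):
            return False                 # a differing position was caught
    return True                          # equal, except with tiny probability

print(probably_equal([1, 0, 1, 1], [1, 0, 1, 1]))   # True
print(probably_equal([1, 0, 1, 1], [1, 1, 1, 1]))   # almost surely False
```

Only the index i and one field element cross the channel per round, so the communication is O(log P) bits instead of n.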
Lesson • Transform the data before random sampling!
Dimensionality Reduction • Raw data A ⊆ {0,1}^n, for very large n • e.g., images, voices, DNA sequences, ... • |A| << 2^n • Goal: compress each element of A while keeping its "essence"
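The slide leaves the method open; one standard randomized choice is a Johnson-Lindenstrauss random projection, sketched below (the dimensions and data are illustrative):

```python
import math
import random

def random_projection(n, k, seed=0):
    """A k x n matrix of i.i.d. Gaussians scaled by 1/sqrt(k); mapping points
    through it preserves pairwise distances up to a (1 +/- eps) factor with
    high probability when k = O(log|A| / eps^2)."""
    rng = random.Random(seed)
    return [[rng.gauss(0, 1) / math.sqrt(k) for _ in range(n)] for _ in range(k)]

def project(point, matrix):
    return [sum(m * x for m, x in zip(row, point)) for row in matrix]

def dist(a, b):
    return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))

n, k = 10_000, 200
M = random_projection(n, k)
x = [random.random() for _ in range(n)]
y = [random.random() for _ in range(n)]
print(dist(x, y), dist(project(x, M), project(y, M)))   # roughly equal
```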
Classical Proof Systems • Prover: provides the proof. Hard. • Verifier: verifies the proof. Relatively easy! But still needs to read through the proof. • What if you, the reviewer, receive a 300-page paper to verify...
Probabilistically Checkable Proof (PCP) Verifier: • flips "some" random coins • reads only a "small" part of the proof • tolerates a "small" error
Proof? • A format of arguments agreed upon by Prover and Verifier • Soundness & completeness • Choosing a good proof format ⇒ fast & simple verification!
Probabilistically Checkable Proof (PCP) Prover: • transforms the proof by encoding it with some error-correcting (testable) code!
PCP for NP • NP = PCP(O(log n), 3) • NP contains SAT, TSP, ..., and MATH = {(S, 1^t) : ZFC ⊢ S in t steps}
Isomorphic? [figure: two graphs, G1 and G2]
Isomorphic! [figure: the same two graphs, revealed to be isomorphic]
Problem • Input: two graphs G1 and G2 • Output: yes iff G1 and G2 are not isomorphic • G1 isomorphic to G2 ⇒ a short proof exists (so GNISO ∈ co-NP) • G1 not isomorphic to G2 ⇒ a short proof? ???
Randomized Algorithm • Verifier: picks a random i ∈ {1,2}; sends G, a random permutation of Gi • Prover: sends j ∈ {1,2} • Verifier: outputs "non-isomorphic" iff i = j
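A toy simulation of this protocol on small graphs; the all-powerful prover is simulated by brute-force isomorphism testing, which is only feasible here because the graphs are tiny:

```python
import itertools
import random

def permute(graph, perm):
    """Relabel the vertices of graph = (n, edge set) by the map perm."""
    n, edges = graph
    return (n, {frozenset({perm[u], perm[v]}) for u, v in map(tuple, edges)})

def isomorphic(g1, g2):
    """Brute-force isomorphism test (standing in for the prover's unbounded power)."""
    if g1[0] != g2[0] or len(g1[1]) != len(g2[1]):
        return False
    return any(permute(g1, dict(enumerate(p)))[1] == g2[1]
               for p in itertools.permutations(range(g1[0])))

def verifier_round(g1, g2, prover):
    i = random.choice([1, 2])                      # verifier's secret coin
    gi = g1 if i == 1 else g2
    perm = list(range(gi[0]))
    random.shuffle(perm)
    h = permute(gi, dict(enumerate(perm)))         # random relabeling of G_i
    j = prover(g1, g2, h)                          # prover names the source
    return i == j                                  # accept "non-isomorphic" iff correct

def honest_prover(g1, g2, h):
    return 1 if isomorphic(g1, h) else 2

# G1: triangle plus an isolated vertex; G2: path on 4 vertices (not isomorphic)
G1 = (4, {frozenset(e) for e in [(0, 1), (1, 2), (2, 0)]})
G2 = (4, {frozenset(e) for e in [(0, 1), (1, 2), (2, 3)]})
# With non-isomorphic inputs the honest prover always identifies the source;
# if the graphs were isomorphic, any prover would be right only half the time.
print(all(verifier_round(G1, G2, honest_prover) for _ in range(20)))   # True
```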
New Features • Non-transferable proofs • Zero-knowledge proofs • IP = PSPACE: "a lot more can be proved efficiently"
Problem • Input: an undirected graph G and two nodes s, t • Output: yes iff s is connected to t in G • Complexity: poly(n) time! • Question: O(log n) space? (n = number of nodes)
Randomized Algorithm • Take a random walk of length poly(n) from s • Output yes iff t is visited during the walk • Complexity: randomized O(log n) space (only need to remember the current node)
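A minimal sketch of the walk; the poly(n) walk length below is a standard cover-time bound, chosen for illustration:

```python
import random

def random_walk_connected(adj, s, t, walk_length=None):
    """adj: adjacency lists {node: [neighbours]}. Walk for ~n^3 steps from s
    and report whether t was visited; the cover time of a connected undirected
    graph is O(n^3), so this succeeds with high probability.
    Only the current node is stored: O(log n) bits of memory."""
    n = len(adj)
    if walk_length is None:
        walk_length = 4 * n ** 3
    current = s
    for _ in range(walk_length):
        if current == t:
            return True
        if not adj[current]:               # nowhere to go
            return False
        current = random.choice(adj[current])
    return current == t

# A path 0-1-2-3 plus a separate component {4, 5}
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2], 4: [5], 5: [4]}
print(random_walk_connected(adj, 0, 3))    # True with high probability
print(random_walk_connected(adj, 0, 4))    # False
```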
Lesson • An interesting probabilistic phenomenon lies behind this: the mixing rate of Markov chains (related to the resistance of electrical networks)
Problem • Input: a number x • Output: yes iff x is a prime • Important in cryptography, ...
Randomized Algorithm • Generate a random r ∈ {1, ..., x} • Output yes iff GCD(x, r) = 1 and (r/x) ≡ r^((x-1)/2) (mod x), where (r/x) is the Jacobi symbol
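This is the Solovay-Strassen test; a minimal sketch, using the standard algorithm for the Jacobi symbol:

```python
import random

def jacobi(a, n):
    """Jacobi symbol (a/n) for odd n > 0; it is 0 exactly when gcd(a, n) > 1."""
    a %= n
    result = 1
    while a != 0:
        while a % 2 == 0:                  # pull out factors of 2
            a //= 2
            if n % 8 in (3, 5):
                result = -result
        a, n = n, a                        # quadratic reciprocity flip
        if a % 4 == 3 and n % 4 == 3:
            result = -result
        a %= n
    return result if n == 1 else 0

def is_probably_prime(x, rounds=20):
    """Accept x iff every sampled r satisfies gcd(x, r) = 1 and
    (r/x) = r^((x-1)/2) (mod x). Composites are rejected in each round with
    probability >= 1/2, so the error is at most 2^(-rounds)."""
    if x < 2 or (x > 2 and x % 2 == 0):
        return x == 2
    if x in (2, 3):
        return True
    for _ in range(rounds):
        r = random.randrange(1, x)
        j = jacobi(r, x) % x               # represent -1 as x - 1
        if j == 0 or pow(r, (x - 1) // 2, x) != j:
            return False                   # definitely composite
    return True                            # probably prime

print(is_probably_prime(2**61 - 1))        # True (a Mersenne prime)
print(is_probably_prime(2**61 + 1))        # almost surely False (divisible by 3)
```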
Issues • Randomized algorithm M for A: • M has access to perfectly random y • ∀x, Prob_y[M(x, y) ≠ A(x)] < 0.000000001 • Issues? • A small probability of error. • Need perfectly random y. How?
Solutions • Randomness extractors • Pseudo-random generators • Derandomization
Setting • [diagram: EXT takes a slightly random source plus a short random seed (a "catalyst") and outputs an almost random string] • Goal: short seed, long output
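The slides do not fix a construction; one classical seeded extractor uses universal (random linear) hashing, justified by the Leftover Hash Lemma. The toy below uses a long seed for simplicity, while real constructions get the seed down to O(log n) bits:

```python
import random

def hash_extract(source_bits, seed_bits, m):
    """Output bit i is the inner product (mod 2) of the source with the i-th
    row of a random m x n binary matrix encoded by the seed. If the source
    has enough min-entropy, the output is close to uniform."""
    n = len(source_bits)
    assert len(seed_bits) == m * n
    return [sum(r & s for r, s in zip(seed_bits[i*n:(i+1)*n], source_bits)) % 2
            for i in range(m)]

n, m = 64, 8
weak_source = [1 if random.random() < 0.75 else 0 for _ in range(n)]  # biased bits
seed = [random.randrange(2) for _ in range(m * n)]
print(hash_extract(weak_source, seed, m))   # 8 bits, close to uniform
```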
Applications • Complexity • Cryptography • Data structures • Distributed computing • Error-correcting codes • Combinatorics, graph theory • ...
Random? • Are coin tosses really random? • They “look random” to you, because you don’t have enough power (computation / measurement). • In many cases, “look random” is good enough!
PRG • [diagram: PRG stretches a short random seed into a long pseudo-random output] • Goal: short seed, long output
Definition • G: {0,1}^n → {0,1}^m, for n < m, is an ε-PRG against a complexity class C if for every predicate T ∈ C, |Prob_r[T(G(r)) = 1] - Prob_y[T(y) = 1]| < ε
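To make the definition concrete, the sketch below empirically estimates the advantage |Prob_r[T(G(r)) = 1] - Prob_y[T(y) = 1]| for a deliberately bad toy "generator" and a simple test T (both are illustrative stand-ins, not constructions from the talk):

```python
import random

def G(r_bits):
    """Toy stretcher from n to 2n bits: the seed followed by its running
    parities. Deliberately NOT a good PRG."""
    out = list(r_bits)
    parity = 0
    for b in r_bits:
        parity ^= b
        out.append(parity)
    return out

def T(y_bits):
    """Statistical test: does the last bit equal the parity of the first half?
    Always true for G's outputs, true half the time on random strings."""
    half = len(y_bits) // 2
    parity = 0
    for b in y_bits[:half]:
        parity ^= b
    return 1 if y_bits[-1] == parity else 0

def advantage(n, trials=10_000):
    pseudo = sum(T(G([random.randrange(2) for _ in range(n)])) for _ in range(trials))
    truly = sum(T([random.randrange(2) for _ in range(2 * n)]) for _ in range(trials))
    return abs(pseudo - truly) / trials

print(advantage(16))     # about 0.5, so T distinguishes G's output from random
```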
PRG exists? • From an "average-case hard" function f: {0,1}^n → {0,1}, define the PRG G: {0,1}^n → {0,1}^(n+1) as G(r) = r ∘ f(r) (r concatenated with f(r))
PRG exists? • From a "worst-case hard" function f: {0,1}^n → {0,1}, define the PRG G: {0,1}^n → {0,1}^(n+1) as G(r) = r ∘ f(r) • Or from a one-way function...
Pseudo-Randomness • Foundation of cryptography: public-key encryption, zero-knowledge proofs, secure function evaluation, ... (the secret is there, but it looks random) • More applications: learning theory, mathematics, physics, ...
Open Problems • Does randomness help poly-time / log-space / nondet. poly-time computation? • BPP = P? BPL = L? BPNP = NP?
Open Problems • Is there a PRG with seed length O(log n) that fools poly-time / log-space / nondet. poly-time computation?
Derandomization • Randomized algorithm M for language A: ∀x, Prob_y[M(x, y) = A(x)] > 0.99 • Construct a PRG G (fooling M) s.t. ∀x, Prob_r[M(x, G(r)) = A(x)] > 0.5 • To determine A(x), take the majority vote of M(x, G(r)) over all possible seeds r
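A minimal sketch of the seed-enumeration idea. The "generator" here is just Python's PRNG keyed by a short seed, a stand-in for a genuine complexity-theoretic PRG; the structure is the point: run M(x, G(r)) for every seed r and output the majority.

```python
import random

def M(x, rng, samples=25):
    """A toy randomized algorithm: decide whether the bit list x has a
    majority of ones by sampling a few positions."""
    hits = sum(x[rng.randrange(len(x))] for _ in range(samples))
    return hits * 2 > samples

def derandomized_M(x, seed_bits=8):
    """Enumerate all 2^seed_bits seeds, run M on each pseudo-random string,
    and take the majority vote; the result is a deterministic function of x."""
    yes = sum(M(x, random.Random(r)) for r in range(2 ** seed_bits))
    return yes * 2 > 2 ** seed_bits

print(derandomized_M([1] * 70 + [0] * 30))   # True
print(derandomized_M([1] * 30 + [0] * 70))   # False
```

With a PRG whose seed length is O(log n), this enumeration takes only polynomially many runs of M, which is exactly why such PRGs would imply BPP = P.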
Breakthroughs • Primality ∈ P: Agrawal-Kayal-Saxena 2002 • Undirected Reachability ∈ L: Reingold 2005
Still Open • Graph non-isomorphism in NP? (If two graphs are non-isomorphic, is there always a short proof for that?)