
The Power of Randomness in Computation



Presentation Transcript


  1. The Power of Randomness in Computation Chi-Jen Lu (呂及人), Institute of Information Science, Academia Sinica

  2. PART I:Randomization

  3. Random Sampling

  4. Polling Population: 20 million, voting yellow or red. Random sample: 3,000. With probability > 99%, the % in the population = the % in the sample ± 5%, independent of the population size.

  5. Lesson • A small set of random samples gives a good picture of the whole population. • Allows sub-linear time algorithms! • More applications: • Volume estimation • Clustering • Machine learning, ...
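The polling claim on the previous slide can be checked empirically. A minimal sketch, with a scaled-down population and an assumed 60/40 split (the function name `poll` and all parameters are illustrative, not from the slides):

```python
import random

def poll(population, sample_size=3000):
    """Estimate the fraction of 'yellow' voters from a small random sample."""
    sample = random.sample(population, sample_size)
    return sum(1 for v in sample if v == "yellow") / sample_size

# Illustrative population (scaled down from 20 million): 60% yellow.
population = ["yellow"] * 600_000 + ["red"] * 400_000
estimate = poll(population)
# A Chernoff bound gives |estimate - 0.6| <= 0.05 with probability > 99%,
# and the guarantee does not depend on the population size.
```

Note that the sample size 3,000 is fixed regardless of whether the population has one million or 20 million members; this is exactly the sub-linearity the lesson points to.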

  6. Fingerprints

  7. Problem Alice: x ∈ {0,1}^n; Bob: y ∈ {0,1}^n. Decide: x = y? Measure: communication complexity

  8. First Attempt Alice: x ∈ {0,1}^n; Bob: y ∈ {0,1}^n. Alice picks a random i ∈ {1..n} and sends (i, x_i); Bob checks x_i = y_i? Works only when the Hamming distance Δ(x, y) is large

  9. Solution C: an error-correcting code with relative distance δ ≈ 1. Alice: x ↦ C(x); Bob: y ↦ C(y). Alice picks a random i ∈ {1..m} and sends (i, C(x)_i); Bob checks C(x)_i = C(y)_i? x = y: Prob_i[C(x)_i ≠ C(y)_i] = 0. x ≠ y: Prob_i[C(x)_i = C(y)_i] ≈ 0. Can repeat several times to reduce error
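One standard instance of such a code is a Reed-Solomon-style polynomial fingerprint. The sketch below is a hedged illustration of the idea, not the slides' exact construction; the prime `P`, the trial count, and the function names are my choices:

```python
import random

P = (1 << 61) - 1  # a large prime modulus

def fingerprint(bits, i):
    """C(x)_i: evaluate the polynomial with coefficients `bits` at point i
    mod P (Horner's rule). Two distinct polynomials of degree < n agree on
    fewer than n of the P evaluation points, so random points expose them."""
    acc = 0
    for b in reversed(bits):
        acc = (acc * i + b) % P
    return acc

def equal_whp(x, y, trials=5):
    """Randomized equality test: compare fingerprints at shared random points.
    Never wrong when x == y; wrong w.p. < (n/P)^trials when x != y."""
    for _ in range(trials):
        i = random.randrange(P)  # Alice and Bob share the random index i
        if fingerprint(x, i) != fingerprint(y, i):
            return False
    return True
```

Alice sends only (i, C(x)_i), a few hundred bits, instead of all n bits of x.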

  10. Lesson • Transform the data, before random sampling!

  11. Dimensionality Reduction • Raw data A ⊆ {0,1}^n, for very large n. • e.g. images, voices, DNA sequences, ... • |A| << 2^n. • Goal: compressing each element of A, while keeping its “essence”
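One classical way to realize this goal is a random linear projection in the spirit of Johnson-Lindenstrauss. A sketch under the assumption that data points are treated as real vectors; the function name and parameter choices are mine:

```python
import math
import random

def random_projection(v, k, seed=0):
    """Project an n-dimensional vector down to k dimensions using a random
    +/-1 matrix scaled by 1/sqrt(k). For k = O(log|A| / eps^2), all pairwise
    distances within A are preserved up to a (1 +/- eps) factor w.h.p."""
    rng = random.Random(seed)  # a shared seed fixes one matrix for all points
    return [sum(rng.choice((-1, 1)) * x for x in v) / math.sqrt(k)
            for _ in range(k)]

v = [1.0] * 100                  # squared norm 100
w = random_projection(v, k=500)  # 500 numbers instead of 100... or 10^6
```

The same projection (same seed) is applied to every element of A, so distances between compressed points reflect distances between the originals.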

  12. Proof Systems

  13. Classical Proof Systems • Prover: provides the proof. • Hard. • Verifier: verifies the proof. • Relatively easy! • Still needs to read through the proof. • What if you, the reviewer, receive a paper of 300 pages to verify...

  14. Probabilistically Checkable Proof (PCP) Verifier: • flips “some” random coins • reads only a “small” part of the proof • tolerates a “small” error

  15. Proof? • A format of arguments agreed upon by Prover and Verifier • soundness & completeness. • Choosing a good proof format ⇒ fast & simple verification!

  16. Probabilistically Checkable Proof (PCP) Prover: • transforms the proof by encoding it with some error-correcting (testable) code!

  17. PCP for NP • NP = PCP(O(log n), 3). • NP contains SAT, TSP, ..., and MATH = {(S, 1^t) : ZFC ⊢ S in ≤ t steps}.

  18. Graph Non-Isomorphism

  19. Isomorphic? G1 G2

  20. Isomorphic! G1 G2

  21. Problem • Input: two graphs G1 and G2 • Output: yes iff G1 and G2 are not isomorphic. • G1 iso. G2 ⇒ ∃ short proof (GNISO ∈ co-NP) • G1 not iso. G2 ⇒ ∃ short proof ???

  22. Randomized Algorithm • Verifier: • Picks a random i ∈ {1,2} • Sends G, a random permutation of Gi • Prover: • Sends j ∈ {1,2} • Verifier: • Outputs “non-isomorphic” iff i = j.
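This protocol can be simulated directly. In the toy sketch below, brute-force search over relabelings stands in for the computationally unbounded prover; graph encodings and all names are my own choices:

```python
import random
from itertools import permutations

def permute(edges, n):
    """Relabel the nodes of an edge set by a uniformly random permutation."""
    p = list(range(n))
    random.shuffle(p)
    return frozenset(frozenset((p[u], p[v])) for u, v in edges)

def prover(H, G1, n):
    """Unbounded prover (brute force): answer 1 iff H is a relabeling of G1."""
    for p in permutations(range(n)):
        if frozenset(frozenset((p[u], p[v])) for u, v in G1) == H:
            return 1
    return 2

def verify_round(G1, G2, n):
    """Verifier picks a secret i, sends a random permutation of G_i, and
    outputs 'non-isomorphic' iff the prover recovers i. If G1 ~ G2, the
    permuted graph carries no trace of i, so the prover can only guess."""
    i = random.choice((1, 2))
    H = permute(G1 if i == 1 else G2, n)
    return prover(H, G1, n) == i

triangle = [(0, 1), (1, 2), (0, 2)]  # triangle plus isolated node 3
path = [(0, 1), (1, 2), (2, 3)]      # path on 4 nodes
```

When the graphs really are non-isomorphic, the honest prover convinces the verifier in every round; when they are isomorphic, each round catches a cheating prover with probability 1/2, so repetition drives the error down exponentially.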

  23. New Features • Non-transferable proofs • Zero-knowledge proofs • IP = PSPACE: “a lot more can be proved efficiently”

  24. Reachability

  25. Problem • Input: undirected graph G and two nodes s, t • Output: yes iff s is connected to t in G • Complexity: poly(n) time! • Question: O(log n) space? (n = number of nodes)

  26. Randomized Algorithm • Take a random walk of length poly(n) from s. • Output yes iff t is visited during the walk. • Complexity: randomized O(log n) space • only need to remember the current node
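A direct sketch of this walk; the step bound 8n³ is one common choice (the cover time of an undirected graph is O(n³)), and the adjacency-list encoding is mine:

```python
import random

def reachable(adj, s, t, steps=None):
    """Randomized s-t connectivity: walk at random from s for poly(n) steps;
    report 'connected' iff t is ever visited. Only the current node is
    stored, so the working space is O(log n) bits."""
    n = len(adj)
    if steps is None:
        steps = 8 * n ** 3  # cover time of an undirected graph is O(n^3)
    cur = s
    for _ in range(steps):
        if cur == t:
            return True
        if not adj[cur]:    # isolated node: the walk is stuck
            break
        cur = random.choice(adj[cur])
    return cur == t
```

A one-sided error results: a "yes" answer is always correct, while a "no" answer is wrong only with tiny probability, when the walk happens to miss t.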

  27. Lesson • An interesting probabilistic phenomenon lies behind this: • the mixing rate of Markov chains (related to the resistance of electrical networks)

  28. Primality Testing

  29. Problem • Input: a number x • Output: yes iff x is a prime • Important in cryptography, ...

  30. Randomized Algorithm • Generate a random r ∈ {1, ..., x} • Output yes iff • GCD(x, r) = 1 & • (r/x) ≡ r^((x−1)/2) (mod x), where (r/x) is the Jacobi symbol
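This is the Solovay-Strassen test. A sketch with a standard Jacobi-symbol routine; the trial count 20 is an arbitrary choice, and a composite fools each trial with probability at most 1/2:

```python
import random

def jacobi(a, n):
    """Jacobi symbol (a/n) for odd n > 0, via quadratic reciprocity."""
    a %= n
    result = 1
    while a:
        while a % 2 == 0:          # pull out factors of 2
            a //= 2
            if n % 8 in (3, 5):
                result = -result
        a, n = n, a                 # reciprocity flip
        if a % 4 == 3 and n % 4 == 3:
            result = -result
        a %= n
    return result if n == 1 else 0  # 0 signals gcd(a, n) > 1

def solovay_strassen(x, trials=20):
    """Declare x prime iff GCD(x, r) = 1 and (r/x) ≡ r^((x-1)/2) (mod x)
    for `trials` independent random witnesses r."""
    if x < 2 or x % 2 == 0:
        return x == 2
    for _ in range(trials):
        r = random.randrange(1, x)
        j = jacobi(r, x) % x        # map -1 to x-1 so we can compare mod x
        if j == 0 or pow(r, (x - 1) // 2, x) != j:
            return False            # r witnesses compositeness
    return True
```

Each trial costs one modular exponentiation, so the test runs in time polynomial in the bit length of x, which is what makes it practical for cryptographic key generation.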

  31. PART II:Derandomization

  32. Issues • Randomized algorithm M for A: • M has access to perfectly random y • ∀x, Prob_y[M(x, y) ≠ A(x)] < 0.000000001 • Issues? • Small probability of error. • Need perfectly random y. How?

  33. Solutions • Randomness extractors • Pseudo-random generators • Derandomization

  34. Randomness Extractors

  35. Setting EXT: a slightly random source, together with a short random seed (a catalyst), is converted into an almost random output. Goal: short seed, long output

  36. Applications • Complexity • Cryptography • Data structures • Distributed computing • Error-correcting codes • Combinatorics, graph theory • ...

  37. Pseudo-Random Generators

  38. Random? • Are coin tosses really random? • They “look random” to you, because you don’t have enough power (computation / measurement). • In many cases, “look random” is good enough!

  39. PRG A short random seed goes in; a long pseudo-random string comes out. Goal: short seed, long output

  40. Definition • G: {0,1}^n → {0,1}^m, for n < m, is an ε-PRG against a complexity class C if: ∀ predicate T ∈ C, | Prob_r[T(G(r)) = 1] − Prob_y[T(y) = 1] | < ε.

  41. PRG exists? • From an “average-case hard” function f: {0,1}^n → {0,1}, define the PRG G: {0,1}^n → {0,1}^(n+1) as G(r) = r ∘ f(r)

  42. PRG exists? • From a “worst-case hard” function f: {0,1}^n → {0,1}, define the PRG G: {0,1}^n → {0,1}^(n+1) as G(r) = r ∘ f(r) • From a one-way function...

  43. Pseudo-Randomness • Foundation of cryptography • Public-key encryption • Zero-knowledge proofs • Secure function evaluation, ... • The secret is there, but it looks random • More applications: learning theory, mathematics, physics, ...

  44. Derandomization

  45. Open Problems • Does randomness help poly-time / log-space / nondet. poly-time computation? BPP = P? BPL = L? BPNP = NP?

  46. Open Problems • Is there a PRG with seed length O(logn) that fools poly-time / log-space / nondet. poly-time computation?

  47. Derandomization • Randomized algorithm M for language A: ∀x, Prob_y[M(x, y) = A(x)] > 0.99 • Construct a PRG G (fooling M) s.t. ∀x, Prob_r[M(x, G(r)) = A(x)] > 0.5 • To determine A(x), take the majority vote of M(x, G(r)) over all possible seeds r.
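The enumeration step can be sketched as follows, with a toy algorithm `M` and an identity "PRG" as stand-ins (all names are illustrative; a real G would have cryptographic or complexity-theoretic hardness behind it):

```python
def derandomize(M, x, G, seed_len):
    """Run M(x, G(r)) for every seed r in {0,1}^seed_len and output the
    majority answer. The total cost is 2^seed_len deterministic runs of M,
    which is polynomial whenever seed_len = O(log n)."""
    total = 2 ** seed_len
    votes = sum(M(x, G(r)) for r in range(total))
    return int(2 * votes > total)

# Toy stand-ins: the target is A(x) = x mod 2, and M errs only when its
# random string y happens to be 0 (so M is correct on 7 of the 8 seeds).
M = lambda x, y: (x % 2) if y != 0 else 1 - (x % 2)
G = lambda r: r  # identity 'PRG' over 3-bit seeds
```

Because the PRG guarantee keeps M correct on more than half the seeds, the majority vote is always the right answer, and no randomness remains.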

  48. Breakthroughs • Primality ∈ P: Agrawal-Kayal-Saxena 2002 • Undirected Reachability ∈ L: Reingold 2005

  49. Still Open • Graph non-isomorphism in NP? (If two graphs are non-isomorphic, is there always a short proof for that?)

  50. Conclusion
