Happy 60th B’day Mike
Lower bounds, anyone?
Avi Wigderson, Institute for Advanced Study
Lower bounds & Randomness & Expanders
Big DATA: astronomical, LHC, weather, internet, stock market, visual, neuro, genomic, seismic, language
What is going on? Regularities, irregularities, correlations, clustering, essential parameters, removing noise, translation, prediction
Methods: gradient descent, neural networks, genetic algorithms, dimension reduction, Occam's razor, generative grammars, annealing, low-dimensional surfaces, HMMs, Bayesian networks, statistical mechanics, boosting, SVD, sampling, decision trees
NP = coNP?
Mike's dictionary (Computational Complexity ~ Set Theory):
  Polynomial ~ Countable, Exponential ~ Uncountable
  NP ≠ coNP: Polysize Nondet DNF ≠ Polysize Nondet CNF
  Set-theoretic analogue: Countable Nondet DNF ≠ Countable Nondet CNF, i.e. Analytic ≠ coAnalytic
[Sipser] New, "more combinatorial" proof (topological approach)
P = NP? PH = PSPACE?
[BGS] ∃A: P^A ≠ NP^A (diagonalization is useless)
[Open] ∃A: PH^A ≠ PSPACE^A?
Mike's dictionary (Oracle machines ~ Circuit complexity ~ Set theory):
  PH^A ~ AC0 ~ finite Borel hierarchy
  PSPACE^A ~ NC1 ~ Borel sets
[Sipser] New, "more combinatorial" proof
[Furst-Saxe-Sipser, Ajtai] Parity ∉ AC0
[Yao, Håstad] ∃A: PH^A ≠ PSPACE^A
  Switching lemma, random restrictions
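To make the random-restriction idea behind the switching lemma concrete, here is a minimal Python sketch (illustrative only; the function names and the DNF encoding are ad hoc, not from the talk). It leaves each variable free with probability p, fixes the rest uniformly at random, and simplifies a small DNF accordingly.

```python
import random

def random_restriction(variables, p):
    """Leave each variable free with probability p; otherwise fix it
    uniformly to 0 or 1 (the kind of restriction used in switching-lemma
    arguments)."""
    rho = {}
    for v in variables:
        rho[v] = None if random.random() < p else (random.random() < 0.5)
    return rho

def restrict_dnf(terms, rho):
    """Apply rho to a DNF given as a list of terms, each term a dict
    {variable: required boolean value}.  Returns True if some term is fully
    satisfied, otherwise the list of surviving, shortened terms."""
    surviving = []
    for term in terms:
        reduced, killed = {}, False
        for v, val in term.items():
            fixed = rho.get(v)
            if fixed is None:        # variable still free
                reduced[v] = val
            elif fixed != val:       # a literal is falsified; the term drops out
                killed = True
                break
        if killed:
            continue
        if not reduced:              # every literal satisfied: DNF is constant 1
            return True
        surviving.append(reduced)
    return surviving                 # [] means the DNF became constant 0

# Tiny demo: a width-3 DNF over 9 variables hit by a restriction with p = 1/3.
variables = [f"x{i}" for i in range(9)]
dnf = [{"x0": True, "x1": False, "x2": True},
       {"x3": True, "x4": True,  "x5": False},
       {"x6": False, "x7": True, "x8": True}]
print(restrict_dnf(dnf, random_restriction(variables, 1/3)))
```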
NL = L?
Mike's dictionary (Complexity classes ~ Finite automata):
  NL ~ polysize 2NFA
  L ~ polysize 2DFA
[Sipser] For every n there is a language S_n such that:
  - S_n is accepted by an O(n)-state 2NFA
  - S_n requires a 2^n-state (sweeping) 2DFA (the NFA/DFA gap is illustrated below)
REGULAR = 2DFA = 2NFA = 2PFA*   (* = polynomial time)
[Open] 2AMFA* = REGULAR?
[CHPW] True if 2AMFA* = co-2AMFA*
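The exponential succinctness gap between nondeterministic and deterministic automata can already be seen in the classic one-way example L_n = { w : the n-th symbol from the end is 1 }; this is only a stand-in, since Sipser's S_n and the sweeping-2DFA lower bound are more delicate. The Python sketch below (ad-hoc names) builds the (n+1)-state NFA and counts the states produced by the subset construction, which grow as 2^n.

```python
def nfa_nth_from_end(n):
    """(n+1)-state NFA for {w in {0,1}* : the n-th symbol from the end is 1}.
    State 0 waits; moving to state 1 guesses 'this 1 is n-th from the end';
    states 1..n-1 count the remaining symbols; state n accepts."""
    def step(states, sym):
        nxt = set()
        for q in states:
            if q == 0:
                nxt.add(0)
                if sym == '1':
                    nxt.add(1)
            elif q < n:
                nxt.add(q + 1)       # branches that overshoot state n die off
        return frozenset(nxt)
    start = frozenset({0})
    accepting = lambda states: n in states
    return step, start, accepting

def dfa_size_via_subset_construction(n):
    """Number of reachable subsets, i.e. states of the equivalent DFA."""
    step, start, _ = nfa_nth_from_end(n)
    seen, frontier = {start}, [start]
    while frontier:
        s = frontier.pop()
        for sym in '01':
            t = step(s, sym)
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return len(seen)

for n in range(1, 8):
    print(n, dfa_size_via_subset_construction(n))   # 2, 4, 8, ... = 2**n
```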
Time vs. Space
  [HPV] Time(t) ⊆ Space(t / log t)
  [Open] Time(t) ⊆ Space(t^0.99)?
Randomness vs. Determinism
  [Open] BPP = P?
Hardness vs. Randomness
  [Sipser] Either Time(t) ⊆ Space(t^0.99) or BPP = P, if explicit extractors exist
Utilizing Expanders
[Sipser] Expanders ⇒ Time(t) ⊆ Space(t^0.99) or BPP = P
[Karp-Pippenger-Sipser] Deterministic amplification (sketched below)
[Sipser-Spielman] Expander codes (cf. [Gallager, Tanner])
[Spielman] Linear-time encoding and decoding of good codes
[Sipser?] Affine expanders? [Klawe] Impossibility!
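As one concrete flavor of expander-based amplification, here is a Python sketch of majority voting along a walk on a graph over the seed space. This is only a sketch: the original [Karp-Pippenger-Sipser] scheme queries all neighbors of a random vertex rather than walking, a real derandomization needs an explicit expander, and the toy circulant graph and all names below are placeholders.

```python
import random

def walk_amplify(algo, neighbors, start_seed, walk_length, x):
    """Run algo(x, seed) on the seeds visited by a walk on a d-regular graph
    over the seed space and return the majority answer.  Each step costs only
    log2(d) fresh random bits instead of a whole new seed."""
    votes, seed = [], start_seed
    for _ in range(walk_length):
        votes.append(algo(x, seed))
        seed = random.choice(neighbors(seed))
    return max(set(votes), key=votes.count)

# Toy demo (NOT a real expander): seeds are Z_101 with a circulant neighborhood.
N, shifts = 101, (1, -1, 34, -34)
def toy_neighbors(s):
    return [(s + d) % N for d in shifts]

def noisy_is_zero(x, seed):
    """A made-up randomized test that answers wrongly on about 1/4 of the seeds."""
    return (x == 0) if seed % 4 else (x != 0)

print(walk_amplify(noisy_is_zero, toy_neighbors,
                   start_seed=random.randrange(N), walk_length=25, x=0))
```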
Hashing in Computational Complexity
[Sipser] BPP ⊆ PH (see also [Gács, Lautemann])
[Goldwasser-Sipser] Public-coin IP = Private-coin IP
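The hashing behind both results is pairwise-independent hashing of the form h(x) = Ax + b over GF(2). The Python sketch below (illustrative; names and parameters are made up) uses it for the set-size test at the core of a [Goldwasser-Sipser]-style protocol: a set much larger than 2^k usually has an element hashing to 0^k, while a much smaller set usually does not.

```python
import random

def random_affine_hash(n, k):
    """Pairwise-independent hash h(x) = Ax + b over GF(2), from n bits to k bits."""
    A = [[random.getrandbits(1) for _ in range(n)] for _ in range(k)]
    b = [random.getrandbits(1) for _ in range(k)]
    def h(x):   # x is a tuple of n bits
        return tuple((sum(A[i][j] & x[j] for j in range(n)) + b[i]) % 2
                     for i in range(k))
    return h

def hit_rate(S, n, k, trials=50):
    """Fraction of random hashes under which some element of S lands on 0^k:
    high when |S| >> 2^k, low when |S| << 2^k."""
    zero = (0,) * k
    hits = 0
    for _ in range(trials):
        h = random_affine_hash(n, k)
        if any(h(x) == zero for x in S):
            hits += 1
    return hits / trials

n, k = 12, 7
big   = [tuple(random.getrandbits(1) for _ in range(n)) for _ in range(600)]
small = big[:8]
print(hit_rate(big, n, k), hit_rate(small, n, k))   # typically near 1.0 vs. well below 0.2
```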
Randomness & Lower bounds
  Probabilistic method (AC0)
  Natural proofs
NC1 vs. P
- Can sequential computation be parallelized?
- Are formulas weaker than circuits?
Composition: for g: {0,1}^m → {0,1} and f: {0,1}^n → {0,1}, the composition g∘f: {0,1}^{mn} → {0,1} applies f to each of m blocks of n bits and then g to the m results (sketched in Python below).
D(g∘f) ≤ D(g) + D(f),  L(g∘f) ≤ L(g)·L(f)   (D = formula depth, L = formula size)
[Karchmer-Raz-Wigderson Conjecture] This is tight!
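A minimal Python sketch of the block composition g∘f (all names here are ad-hoc illustrations, not from the slides). Plugging a depth-D(f) formula for f into each leaf of a depth-D(g) formula for g gives the upper bound D(g∘f) ≤ D(g) + D(f), and similarly L(g∘f) ≤ L(g)·L(f).

```python
def compose(g, f, m, n):
    """Return g∘f on m*n bits: apply f to each of the m length-n blocks,
    then apply g to the m resulting bits."""
    def gof(bits):
        assert len(bits) == m * n
        inner = [f(bits[i * n:(i + 1) * n]) for i in range(m)]
        return g(inner)
    return gof

# Example: g = parity of m bits, f = majority of n bits.
parity   = lambda bits: sum(bits) % 2
majority = lambda bits: int(sum(bits) > len(bits) // 2)

m, n = 3, 5
gof = compose(parity, majority, m, n)
print(gof([1, 1, 0, 0, 0,   1, 1, 1, 0, 0,   0, 0, 0, 1, 1]))  # blocks give 0,1,0; parity = 1
```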
The KRW conjecture
[KRW conjecture]: D(g∘f) ≈ D(g) + D(f)
[KRW]: The conjecture implies P ≠ NC1
[KRW]: The conjecture holds for monotone circuits; [Cor] mP ≠ mNC1
[Grigni-Sipser]: mL ≠ mNC1
The natural-proofs barrier does not seem to apply
KRW program
Universal relations (definitions recalled below): g ≤ U_m, f ≤ U_n, g∘f ≤ U_m∘U_n
[EIRS, HW]: D(U_m ∘ U_n) ≈ D(U_m) + D(U_n)
[GMWW'13]: ∀g, D(g ∘ U_n) ≈ D(g) + D(U_n)
[Open]: ∀g, f, D(g ∘ f) ≈ D(g) + D(f)
[Open]: ∀f, D(U_m ∘ f) ≈ D(U_m) + D(f)
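For readers not fluent in this notation, the LaTeX block below recalls the standard definitions (not stated on the slides) that give D(·) meaning for relations: the Karchmer-Wigderson game of a function and the universal relation, to which every KW game reduces.

```latex
% Karchmer-Wigderson game of f and the universal relation U_n
% (standard definitions, added only as a reminder).
\[
  KW_f = \{(x,y,i) : f(x)=1,\ f(y)=0,\ x_i \neq y_i\},
  \qquad
  D(f) = CC(KW_f) \quad \text{[Karchmer-Wigderson]},
\]
\[
  U_n = \{(x,y,i) : x,y \in \{0,1\}^n,\ x \neq y,\ x_i \neq y_i\}.
\]
% Every KW_f embeds into U_n (the ``f <= U_n'' on the slide), so proving
% D(U_m \circ U_n) \approx D(U_m) + D(U_n) is the first step of the program.
```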