A PTAS for Computing the Supremum of Gaussian Processes Raghu Meka (IAS/DIMACS)
Gaussian Processes (GPs) • Jointly Gaussian variables X_1, …, X_n • Any finite linear combination ∑_i a_i X_i is Gaussian
Supremum of Gaussian Processes (GPs) Given jointly Gaussian X_1, …, X_n, want to study E[sup_i X_i]
Why Gaussian Processes? • Stochastic processes • Functional analysis • Convex geometry • Machine learning • Many more!
Cover Times of Graphs Fundamental graph parameter (e.g., the cover time of the n-cycle is n(n−1)/2). Aldous-Fill 94: compute the cover time deterministically? • KKLV00: O((log log n)^2) approximation • Feige-Zeitouni’09: FPTAS for trees
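As a concrete illustration of the quantity being approximated, here is a minimal Monte Carlo sketch (not from the talk; the function name and parameters are illustrative) that estimates the cover time of the n-cycle by direct simulation and compares it to the exact value n(n−1)/2:

```python
import random

def cover_time_cycle(n, trials, seed=0):
    """Average number of steps for a simple random walk on the n-cycle
    to visit every vertex, averaged over the given number of trials."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        pos, visited, steps = 0, {0}, 0
        while len(visited) < n:
            pos = (pos + rng.choice((-1, 1))) % n  # step left or right
            visited.add(pos)
            steps += 1
        total += steps
    return total / trials

est = cover_time_cycle(8, 20000)  # exact expected cover time: 8*7/2 = 28
```

Simulation gives only a randomized estimate; the point of the Aldous-Fill question, and of the results in this talk, is to compute such quantities deterministically.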
Cover Times and GPs • Transfer the problem to GPs (via the Gaussian free field) • Reduces to computing the supremum of a GP Thm (Ding, Lee, Peres 10): O(1) deterministic poly. time approximation for the cover time. Thm (DLP10): resolves the Winkler-Zuckerman “blanket time” conjectures.
Computing the Supremum Question (Lee10, Ding11): PTAS for computing the supremum of GPs? Question (Lee10, Ding11): Given v_1, …, v_n ∈ R^d, compute a (1+ε)-factor approximation to E[sup_i ⟨v_i, g⟩], where g is a standard random Gaussian in R^d. • Equivalent to specifying the covariance matrix • The vector form is more intuitive
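The quantity in the question above can be sketched by plain Monte Carlo (a hypothetical illustration, not the talk's algorithm, which is deterministic; `mc_sup` is an invented name):

```python
import random

def mc_sup(vectors, samples, seed=0):
    """Monte Carlo estimate of E[max_i <v_i, g>] for g a standard Gaussian."""
    rng = random.Random(seed)
    d = len(vectors[0])
    total = 0.0
    for _ in range(samples):
        g = [rng.gauss(0.0, 1.0) for _ in range(d)]
        total += max(sum(vi * gi for vi, gi in zip(v, g)) for v in vectors)
    return total / samples

# Two orthonormal vectors: E[max(g1, g2)] = 1/sqrt(pi) ~ 0.5642.
est = mc_sup([[1.0, 0.0], [0.0, 1.0]], 200000)
```

Sampling achieves a (1+ε)-approximation only with high probability and with poly(1/ε) dependence; the open question was a deterministic PTAS.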
Computing the Supremum Question (Lee10, Ding11): Given v_1, …, v_n ∈ R^d, compute a (1+ε)-factor approximation to E[sup_i ⟨v_i, g⟩]. • DLP10: O(1)-factor approximation • Comparison-based methods can’t beat O(1): Talagrand’s majorizing measures theory characterizes the supremum only up to constant factors
Main Result Thm: A PTAS for computing the supremum of Gaussian processes. Thm: PTAS for computing the cover time of bounded-degree graphs. Thm: Given v_1, …, v_n ∈ R^d and ε > 0, a deterministic algorithm to compute a (1+ε)-factor approximation to E[sup_i ⟨v_i, g⟩], in polynomial time for fixed ε. Key tool: comparison inequalities from convex geometry.
Outline of Algorithm 1. Dimension reduction • Slepian’s Lemma, Johnson-Lindenstrauss 2. Optimal eps-nets in Gaussian space • Kanter’s lemma, univariate to multivariate
Dimension Reduction Idea: JL projection, then solve in the projected space. Project v_1, …, v_n ∈ R^d to w_1, …, w_n ∈ R^k with k = O(log n / ε²). Use deterministic JL – EIO02, S02.
Analysis: Slepian’s Lemma Problem: relate the supremum of the projected process to that of the original. Slepian’s lemma: if pairwise distances are preserved, suprema are comparable.
Analysis: Slepian’s Lemma • Enough to solve the problem for the projected vectors w_1, …, w_n • Enough for the remaining steps to run in time exponential in the (reduced) dimension k
Outline of Algorithm 1. Dimension reduction • Slepian’s Lemma, Johnson-Lindenstrauss 2. Optimal eps-nets in Gaussian space • Kanter’s lemma, univariate to multivariate
Nets in Gaussian Space • Goal: given the projected vectors in R^k, deterministically approximate E[sup_i ⟨w_i, g⟩] in time exponential in k alone • We solve the problem for all semi-norms F: approximate E[F(g)] for g a standard Gaussian in R^k
Nets in Gaussian Space • Discrete approximations of the Gaussian distribution Main thm: an explicit ε-net whose size depends only on the dimension k and on ε. • Naïve integer rounding works but needs very fine granularity, giving a larger net • Dadush-Vempala’12: matching lower bound, so the net size is optimal
Construction of the eps-Net • Simplest possible approach: build a univariate net, then lift it to the multivariate case by taking products • Two questions: what resolution is enough? How far out on the axes must the net go? Naïve bounds are too weak.
Construction of the eps-Net • Analyze a ‘step-wise’ approximator of the Gaussian: even out the mass within each grid interval, so the density becomes constant on each interval.
Construction of the eps-Net • Take the univariate net and lift it to the multivariate case Main Lemma: a resolution independent of the dimension suffices.
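A minimal numerical sketch of the step-wise approximator in one dimension (parameters delta and the truncation point T are illustrative choices, not the talk's): even out the Gaussian mass on each interval [kδ, (k+1)δ] and compare E|X| against the exact Gaussian value E|g| = sqrt(2/π):

```python
import math

def gaussian_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

delta, T = 0.05, 8.0            # grid resolution and truncation (illustrative)
K = int(T / delta)

est = 0.0
for k in range(-K, K):
    a, b = k * delta, (k + 1) * delta
    mass = gaussian_cdf(b) - gaussian_cdf(a)   # Gaussian mass of the interval
    est += mass * (abs(a) + abs(b)) / 2.0      # uniform ("evened out") within it

exact = math.sqrt(2.0 / math.pi)               # E|g| for g ~ N(0, 1)
```

The error here is far smaller than the resolution δ; the point of the Main Lemma is that the analogous guarantee survives lifting to high dimension with no dimension dependence.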
Dimension-Free Error Bounds Lem: for any norm F, the step-wise approximator estimates E[F(g)] to within a (1 ± ε) factor, with error independent of the dimension. • Proof by “sandwiching” • Exploits convexity critically
Analysis of Error Def: for symmetric random vectors X, Y, say X is less peaked than Y if for all symmetric convex sets K, Pr[X ∈ K] ≤ Pr[Y ∈ K]. • Why interesting? For any norm F, X less peaked than Y implies E[F(X)] ≥ E[F(Y)].
Sandwiching and Lifting Nets Fact: in one dimension, the evened-out (step-wise) distribution is less peaked than the Gaussian. Proof idea: evening out only spreads mass away from the origin!
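The one-dimensional "spreading" fact above can be checked numerically (a hedged sketch; grid parameters and test points are illustrative): the evened-out distribution D puts no more mass than the Gaussian on any symmetric interval [−t, t]:

```python
import math

def gaussian_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def stepwise_mass(t, delta=0.1, T=8.0):
    """Mass that the evened-out (step-wise) distribution puts on [-t, t]."""
    K = int(T / delta)
    total = 0.0
    for k in range(-K, K):
        a, b = k * delta, (k + 1) * delta
        mass = gaussian_cdf(b) - gaussian_cdf(a)
        overlap = max(0.0, min(b, t) - max(a, -t))  # overlap with [-t, t]
        total += mass * overlap / delta             # density uniform on [a, b]
    return total

def gaussian_mass(t):
    return gaussian_cdf(t) - gaussian_cdf(-t)

checks = [(stepwise_mass(t), gaussian_mass(t)) for t in (0.37, 1.23, 2.51)]
```

Symmetric intervals are exactly the symmetric convex sets of the one-dimensional peakedness definition, so this is the base case that Kanter's lemma then lifts to products.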
Sandwiching and Lifting Nets Fact: the one-dimensional comparison holds by definition. Kanter’s Lemma (77): peakedness is preserved under products: if each X_i is less peaked than Y_i and the distributions are unimodal, then the product of the X_i is less peaked than the product of the Y_i. Cor: by Kanter’s lemma, the lifted step-wise distribution is less peaked than the multivariate Gaussian. Cor: upper bound — the net’s estimate of E[F(g)] is at least E[F(g)].
Sandwiching and Lifting Nets • Def: a slightly scaled-down version of the Gaussian — push mass towards the origin. Fact: for a suitable scaling, the step-wise approximator of the scaled-down Gaussian is more peaked than the Gaussian itself. Proof idea: the inward push of the scaling compensates the earlier spreading.
Sandwiching and Lifting Nets Fact: again the one-dimensional comparison holds by definition. Kanter’s Lemma (77): peakedness is preserved under products of unimodal distributions. Cor: by Kanter’s lemma, the comparison lifts to the multivariate setting. Cor: lower bound — the scaled-down net’s estimate is at most E[F(g)].
Sandwiching and Lifting Nets Combining both bounds: the net estimates E[F(g)] to within a (1 ± O(ε)) factor.
Outline of Algorithm 1. Dimension reduction • Slepian’s Lemma 2. Optimal eps-nets for Gaussians • Kanter’s lemma PTAS for Supremum
Open Problems • FPTAS for computing the supremum? • Black-box algorithms? The JL step needs to look at the points v_i explicitly. • PTAS for the cover time of all graphs? Conjecture of Ding, Lee, Peres 10.