
The Poincaré Constant of a Random Walk in High-Dimensional Convex Bodies








  1. The Poincaré Constant of a Random Walk in High-Dimensional Convex Bodies Ivona Bezáková Thesis Advisor: Prof. Eric Vigoda

  2. Goal: Efficient algorithm for sampling points from high-dimensional convex bodies Approach: Random walks

  3. Overview of the talk • Motivation & History • Introduction to random walks, Markov chains • Definition and basic properties of ball walks • Overview of the algorithm for uniform sampling • Analysis of the Poincaré constant

  4. Motivation • Sampling points from convex bodies efficiently • Computation of volume • Sampling contingency tables [Morris] • Universal portfolios [Kalai & Vempala] • Convex programs [Bertsimas & Vempala] Why is computing volume difficult? • Straightforward approach: • find bounding box • sample sufficiently many points from the box • compute ratio: # points in the body vs. total # points

  5. This algorithm correctly approximates the volume. Where is the catch? The ratio of the volume of the body vs. the volume of the box decreases exponentially with the dimension! This results in exponential running time. Note: For small dimensions (2D, 3D) this algorithm works well. Goal: Find an algorithm running in time polynomial in the dimension.
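The exponential decay is easy to check numerically in the simplest case, a unit ball inside its bounding cube [-1,1]^n (a quick sketch using the closed-form ball volume; the function name is only illustrative):

```python
import math

def ball_to_cube_ratio(n: int) -> float:
    # vol(unit n-ball) = pi^(n/2) / Gamma(n/2 + 1); the bounding cube [-1,1]^n has volume 2^n
    return math.pi ** (n / 2) / (math.gamma(n / 2 + 1) * 2 ** n)

for n in (2, 3, 10, 50):
    # the hit rate of bounding-box sampling: ~0.79 in 2D, but astronomically small in 50D
    print(n, ball_to_cube_ratio(n))
```

With such a hit rate, bounding-box sampling needs exponentially many points before a single one lands in the body, which is the catch the slide describes.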

  6. How does efficient sampling help with volume? • intersect with balls doubling in volume • this defines a sequence of convex bodies • sample points from the i-th body, compute the ratio: i-th vs. (i-1)-st body • return the product of ratios Result: volume of the original body vs. volume of the smallest body (a ball) Why is this better than the bounding box? - The volume at most doubles from one body to the next
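The telescoping scheme above can be sketched as follows. To keep the demo self-contained, the body K is itself a ball, so every intermediate body can be sampled exactly; in the real algorithm the ball walk would supply these samples. All names here are illustrative, not from the slides:

```python
import math, random

def sample_ball(n, r):
    # uniform point in the n-dimensional ball of radius r:
    # gaussian direction, radius r * U^(1/n)
    g = [random.gauss(0, 1) for _ in range(n)]
    norm = math.sqrt(sum(c * c for c in g))
    rad = r * random.random() ** (1 / n)
    return [rad * c / norm for c in g]

def estimate_volume(n, R, samples=20000):
    # K is the ball of radius R, so K_i = K ∩ B(0, r_i) is itself a ball and can be
    # sampled exactly; for a general body the ball walk would supply these samples
    radii = [1.0]
    while radii[-1] < R:
        # radii grow by a factor 2^(1/n), so consecutive bodies at most double in volume
        radii.append(min(radii[-1] * 2 ** (1 / n), R))
    vol = math.pi ** (n / 2) / math.gamma(n / 2 + 1)  # exact volume of the smallest body B(0,1)
    for i in range(1, len(radii)):
        # fraction of samples from K_i landing in K_{i-1} estimates vol(K_{i-1}) / vol(K_i)
        hits = sum(1 for _ in range(samples)
                   if sum(c * c for c in sample_ball(n, radii[i])) <= radii[i - 1] ** 2)
        vol *= samples / hits
    return vol
```

Because each ratio is at least 1/2, every phase needs only a modest number of samples, unlike the bounding-box approach.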

  7. History • negative results for volume computation: • #P-complete • cannot be approximated by deterministic polytime algorithm [Elekes, Bárány & Füredi, Dyer & Frieze, Khachiyan, Lawrence, Lovász & Simonovits] • randomized approximation: • Dyer, Frieze, Kannan ’89 • improvements: combinations of [Applegate, Dyer, Frieze, Kannan, Lovász, Simonovits] • Kannan, Lovász, Simonovits ‘97

  8. Notation • convex body of diameter • (given by membership oracle) unit-ball • step-size • ball

  9. Ball Walks Speedy Walk – next point Problem: How to implement? Metropolis Walk – next point if then
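The two walks can be sketched as follows (a minimal illustration assuming a membership oracle `in_body` and step-size `delta`; these names, and the rejection-loop implementation of the speedy step, are assumptions, not from the slides):

```python
import math, random

def uniform_in_ball(x, delta):
    # uniform point in the ball B(x, delta): gaussian direction, radius delta * U^(1/n)
    n = len(x)
    g = [random.gauss(0, 1) for _ in range(n)]
    norm = math.sqrt(sum(c * c for c in g))
    r = delta * random.random() ** (1 / n)
    return [xi + r * gi / norm for xi, gi in zip(x, g)]

def metropolis_step(x, delta, in_body):
    # propose a uniform point in B(x, delta); move if it lies in the body, else stay put
    y = uniform_in_ball(x, delta)
    return y if in_body(y) else x

def speedy_step(x, delta, in_body):
    # next point is uniform in B(x, delta) ∩ K, realized here by naive rejection --
    # this loop is exactly the implementation problem the slide raises: it is slow
    # wherever only a small fraction of B(x, delta) lies inside the body
    while True:
        y = uniform_in_ball(x, delta)
        if in_body(y):
            return y
```

With `in_body` checking membership in, say, the unit cube, repeated `metropolis_step` calls perform the Metropolis ball walk.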

  10. Markov Chains State space Transition distribution (likelihood of going from to ) For speedy walk we have state space = convex body if otherwise

  11. Markov Chains 2 Stationary distribution = limiting distribution (fixed point) i.e. Mixing time For a given accuracy, the mixing time is the expected number of steps needed to get close to the stationary distribution. Want: rapid mixing, i.e. time polynomial in and

  12. Comparison KLS vs. this work Kannan, Lovász, Simonovits study the so-called conductance for bounding the mixing time. spectral gap ? poly-logarithmic in We bound the so-called Poincaré constant (a generalization of conductance) and get mixing time cubic in

  13. New ideas in KLS • separate analysis of the speedy walk (fast mixing in principle) and the Metropolis walk (efficient implementation) • for volume computation: introduced the isotropic position to reduce the diameter of the body Our focus (survey): • Why the Poincaré constant? • the generalization might lead to better analysis through other quantities (log-Sobolev, [Frieze & Kannan, Jerrum & Son]) • the same difficulty

  14. Well-studied Quantities Poincaré constant where measures the decay of variance and Dirichlet form (local variance)
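The slide's own formulas did not survive the transcript; in the standard notation for a Markov chain with transition kernel P and stationary distribution π, these quantities are usually written as:

```latex
\operatorname{Var}_\pi(f) = \int f^2 \, d\pi - \Big( \int f \, d\pi \Big)^2
\qquad
\mathcal{E}(f,f) = \frac{1}{2} \iint \big( f(x) - f(y) \big)^2 \, P(x, dy) \, d\pi(x)
\qquad
\lambda = \inf_{f} \frac{\mathcal{E}(f,f)}{\operatorname{Var}_\pi(f)}
```

Here λ is the Poincaré constant: the worst-case ratio of local variance (the Dirichlet form) to global variance.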

  15. Well-studied Quantities 2 (Properties of Poincaré) For Markov chains defined on finite state spaces the Poincaré constant equals the spectral gap. Lazy: with probability ½ stay at the same state; reversible: corresponds to symmetric chains Thm: For a (lazy reversible) Markov chain where is the distribution after steps and is the stationary distribution

  16. Thoughts about Poincaré constant If then Thus, in this case and the chain mixes (very) rapidly. Intuitively, this corresponds to a complete graph, where we can get from any point to any other point.

  17. Well-studied Quantities 3 Conductance equals the Poincaré constant over indicator functions trivially Cheeger-type inequality by Jerrum and Sinclair, ‘89
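In the usual notation, with Φ the conductance and λ the Poincaré constant (spectral gap), the Cheeger-type inequality reads:

```latex
\frac{\Phi^2}{2} \;\le\; \lambda \;\le\; 2\,\Phi
```

The upper bound is the trivial direction (restrict the infimum defining λ to indicator functions); the lower bound is the substantive Cheeger-type estimate.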

  18. Properties of Ball Walks Local conductance • Ball walks: • stationary distribution where • reversible
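The local conductance ℓ(x) is the fraction of the step-ball B(x, δ) lying inside the body; it can be estimated by Monte Carlo (a sketch; the oracle name `in_body` and sample count are assumptions):

```python
import math, random

def local_conductance(x, delta, in_body, samples=5000):
    # l(x) = vol(K ∩ B(x, delta)) / vol(B(x, delta)), estimated by the fraction
    # of uniform proposals from B(x, delta) that land inside the body K
    n = len(x)
    hits = 0
    for _ in range(samples):
        g = [random.gauss(0, 1) for _ in range(n)]
        norm = math.sqrt(sum(c * c for c in g))
        r = delta * random.random() ** (1 / n)
        y = [xi + r * gi / norm for xi, gi in zip(x, g)]
        if in_body(y):
            hits += 1
    return hits / samples
```

Deep inside the body ℓ(x) is 1; near a corner of a square it drops to about 1/4, which is where the Metropolis walk stalls and the speedy walk is expensive to implement.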

  19. From Speedy Walk to Uniform Sampling (Overview) • bound Poincaré constant for speedy walk • mixing time for speedy walk • running time of Metropolis walk (assuming good starting distribution) • obtain a good starting distribution • from a sample point from the speedy distribution obtain a sample point close to the uniform distribution

  20. From Speedy Walk to Uniform Sampling Poincaré inequality (for speedy walk): If then for some dimension-independent constant Mixing time of speedy walk: Thm: For a (lazy reversible) Markov chain, the distribution after steps is within of the speedy distribution (assuming a reasonable starting distribution ), where is the distribution after steps and is the stationary distribution

  21. From Speedy Walk to Uniform Sampling (Overview) • bound Poincaré constant for speedy walk • mixing time for speedy walk • running time of Metropolis walk (assuming good starting distribution) • obtain a good starting distribution • from a sample point from the speedy distribution obtain a sample point close to the uniform distribution

  22. From Speedy Walk to Uniform Sampling 2 From the speedy walk to the Metropolis walk Run the Metropolis walk until speedy steps Mixing time of Metropolis walk: If then we expect the total number of steps (speedy + Metropolis) to be at most (with exception ) where is the average local conductance:

  23. From Speedy Walk to Uniform Sampling (Overview) • bound Poincaré constant for speedy walk • mixing time for speedy walk • running time of Metropolis walk (assuming good starting distribution) • obtain a good starting distribution • from a sample point from the speedy distribution obtain a sample point close to the uniform distribution

  24. From Speedy Walk to Uniform Sampling 3 Obtaining a good starting distribution Let and for where Algo: • Sample from according to • For obtain : • Run Metropolis in starting at

  25. From Speedy Walk to Uniform Sampling 4 Good starting distribution for Metropolis walk Thm: For sufficiently small and the distribution of is within of . Expected total number of oracle calls (with exception ) is less than

  26. From Speedy Walk to Uniform Sampling (Overview) • bound Poincaré constant for speedy walk • mixing time for speedy walk • running time of Metropolis walk (assuming good starting distribution) • obtain a good starting distribution • from a sample point from the speedy distribution obtain a sample point close to the uniform distribution

  27. From Speedy Walk to Uniform Sampling 5 From speedy distribution to the uniform distribution Algo: • Shrink • Sample from until • Return Thm: If and sufficiently small then the distribution of is away from the uniform distribution. Expected number of samples needed .
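The final conversion step can be illustrated by generic rejection sampling: the speedy stationary density is proportional to the local conductance ℓ(x), so accepting x with probability ℓ_min/ℓ(x) flattens it to uniform. This is only a sketch of the principle under the assumption that ℓ is bounded below by a known ℓ_min (the shrinking of the body in the actual algorithm serves to guarantee such a bound); all names are illustrative:

```python
import random

def speedy_to_uniform(sample_speedy, ell, ell_min):
    # sample_speedy() draws from the speedy distribution, whose density is
    # proportional to the local conductance ell(x); accepting with probability
    # ell_min / ell(x) cancels that factor, leaving the uniform distribution.
    # ell_min must be a lower bound on ell over the body (so the ratio is <= 1).
    while True:
        x = sample_speedy()
        if random.random() <= ell_min / ell(x):
            return x
```

The expected number of speedy samples per accepted point is at most 1/ℓ_min, which is why a lower bound on the local conductance matters.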

  28. From Speedy Walk to Uniform Sampling (Overview) • bound Poincaré constant for speedy walk • mixing time for speedy walk • running time of Metropolis walk (assuming good starting distribution) • obtain a good starting distribution • from a sample point from the speedy distribution obtain a sample point close to the uniform distribution

  29. Proof of the Poincaré Inequality Poincaré inequality (for speedy walk): If then for any function where for some dimension-independent constant Restricted variance, Dirichlet form, expected value

  30. Idea of the proof: • For a sufficiently small set such that does not vary much within • Assuming Poincaré does not hold, we find a set contradicting the above • wlog • Find a needle-like s.t. and • Chop to obtain the desired set

  31. Needle-like Body Eliminate dimensions one by one (inductively) • assume has fat dimensions and while • projection of onto two fat dimensions • there exists a point s.t. any line through cuts into approximately half • take a hyperplane s.t. • at least one of these must be true or

  32. Shrinking Last Dimension Goal: find s.t. last dim. of is and where is a constant (dependent on ) How? Chop into ? Ideally But

  33. Assumption Idea: relate to where We get where and Next goal: bound

  34. Chopping of What do we need? • does not vary much within , i.e. for any let • width of is at most We will show that this chopping allows us to bound appropriately

  35. Properties of local conductance From the Brunn-Minkowski Thm: • is concave over • is Lipschitz over : For any Implications for the width: • The width of first increases, then it is (full width), and then it decreases • For sufficiently small the width of any is at least

  36. Now we can split into several sums and estimate them separately From Dinghas’ Thm: For the middle section is convex. Thus where is the number of slabs in the middle section What to do outside the middle section?

  37. In the left section, the increase exponentially This allows us to bound We obtain similar bounds for the other parts of the sum; putting them together, we get We wanted Thus we have proved the Poincaré inequality.

  38. THANK YOU
