
Sampling and counting

Understand how sampling perfect matchings in a planar graph uniformly at random leads to an FPRAS for counting them, via randomized approximation schemes and Markov chains. Learn about total variation distance, fully polynomial approximate samplers, and stationary distributions of Markov chains.



  1. [Chapter 3] Sampling and counting How do we sample perfect matchings in a planar graph? [Sampling: uniformly at random, i.e., each perfect matching is equally likely.]

  2. [Section 3.1] Sampling and counting The process can also be run in reverse: use sampling to (approximately) count. Def: A randomized approximation scheme for a counting problem f is a randomized algorithm that, given an input x and an error tolerance ε > 0, outputs a value N s.t. Pr( (1−ε)f(x) ≤ N ≤ (1+ε)f(x) ) ≥ 3/4. A fully polynomial randomized approximation scheme (FPRAS) is a randomized approximation scheme that runs in time polynomial in |x| and 1/ε. Note: 3/4 can be replaced by any constant in (1/2, 1); see the amplification sketch below.
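The note above is the usual "median trick": run the estimator several times and take the median. A minimal Python sketch, assuming a hypothetical base estimator `estimate(x, eps)` that succeeds with probability ≥ 3/4 (the function name and interface are illustrative, not from the slides):

```python
import statistics

def amplify(estimate, x, eps, k):
    """Median of k independent runs of a base randomized approximation
    scheme. If each run lands in [(1-eps)f(x), (1+eps)f(x)] with
    probability >= 3/4, a Chernoff bound shows that the median lands
    there with probability >= 1 - exp(-c*k) for some constant c > 0.
    """
    return statistics.median(estimate(x, eps) for _ in range(k))
```

This is why the constant 3/4 in the definition is not essential: any success probability strictly above 1/2 can be boosted arbitrarily close to 1 at the cost of a factor k in the running time.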

  3. [Section 3.2] Sampling and counting (Modified) Proposition 3.4: Let G be a graph with n vertices and m ≥ 1 edges. If there is an algorithm with running time T(n,m) that produces a uniformly random perfect matching of G, then there is an FPRAS for counting the perfect matchings of G that runs in time c·m²·ε⁻²·T(n,m) for some constant c > 0.


  6. [Section 3.2] Sampling and counting Proof of Proposition 3.4: sketch (the standard self-reducibility argument). Process the edges one at a time. For the current edge e, draw samples and estimate the fraction p of perfect matchings that contain e. If p ≥ 1/2, fix e (delete its endpoints and record the factor 1/p); otherwise delete e (and record the factor 1/(1−p)). Every recorded fraction is at least 1/2, and after at most m steps the remaining count is trivially 1, so |PM(G)| is the product of the recorded factors. Estimating each fraction from O(m·ε⁻²) samples gives it relative variance O(ε²/m); the relative variances of the ≤ m independent factors roughly add, so the product estimate has relative variance O(ε²), and Chebyshev's inequality yields the (1 ± ε)-approximation with probability ≥ 3/4. Total work: O(m²·ε⁻²) samples, each costing T(n,m). A code sketch of this reduction follows.
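A minimal Python sketch of this reduction, under stated assumptions: `sample_pm(g)` is a hypothetical black box returning a uniformly random perfect matching of g, every intermediate graph still has a perfect matching, and the sample-size constant is illustrative rather than tuned.

```python
def count_pm(graph, eps, sample_pm):
    """Estimate the number of perfect matchings of `graph`.

    graph     : set of edges, each edge a frozenset({u, v})
    eps       : relative error tolerance
    sample_pm : hypothetical exact sampler; sample_pm(g) returns a
                uniformly random perfect matching of g as a set of edges
    """
    g = set(graph)
    s = int(8 * len(g) / eps ** 2) + 1        # O(m * eps^-2) samples per ratio
    estimate = 1.0
    while g:
        e = next(iter(g))
        # fraction of perfect matchings of g that contain edge e
        p = sum(e in sample_pm(g) for _ in range(s)) / s
        if p >= 0.5:
            # fix e: |PM(g)| = |PM(g with e's endpoints removed)| / p
            estimate /= p
            u, v = tuple(e)
            g = {f for f in g if u not in f and v not in f}
        else:
            # delete e: |PM(g)| = |PM(g - e)| / (1 - p)
            estimate /= 1.0 - p
            g.discard(e)
    return estimate                            # |PM(empty graph)| = 1
```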

  11. [Section 3.1] Sampling and counting Original Prop. 3.4: about approximate (not exact) samplers. Def: Total variation distance of distributions π and π′ on the same countable set Ω: ‖π − π′‖_TV := (1/2) Σ_{ω∈Ω} |π(ω) − π′(ω)| = max_{A⊆Ω} |π(A) − π′(A)|. Def: Fully polynomial approximate sampler (FPAUS): given an input x and a sampling tolerance δ > 0, the sampler runs in time polynomial in |x| and log(1/δ) and produces an element from a distribution within total variation distance δ of the target distribution.
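A small Python sketch of this definition, with distributions given as dictionaries; it also checks numerically that the two formulas agree (the maximizing event A is the set of outcomes where π puts more mass than π′):

```python
def tv_distance(pi, pi_prime):
    """Total variation distance of two distributions over the same
    countable set, given as dicts mapping outcomes to probabilities."""
    support = set(pi) | set(pi_prime)
    # (1/2) * sum of |pi(w) - pi_prime(w)| over all outcomes w
    half_l1 = 0.5 * sum(abs(pi.get(w, 0.0) - pi_prime.get(w, 0.0))
                        for w in support)
    # equivalently: pi(A) - pi_prime(A) for A = {w : pi(w) > pi_prime(w)}
    max_event = sum(pi.get(w, 0.0) - pi_prime.get(w, 0.0)
                    for w in support
                    if pi.get(w, 0.0) > pi_prime.get(w, 0.0))
    assert abs(half_l1 - max_event) < 1e-12
    return half_l1
```

For example, tv_distance({'a': 0.5, 'b': 0.5}, {'a': 0.75, 'b': 0.25}) returns 0.25.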

  12. [Section 3.3] Markov chains • Def: A Markov chain is a sequence (X_t)_{t=0,1,2,…} of random variables with state space Ω, where X_{t+1} depends only on X_t. • Example: MC for matchings (implemented in the sketch below): • Ω = ? [the set of all matchings of G] • X_0 = ∅ • for t ≥ 0: choose a uniformly random edge e • if e ∈ X_t, remove it, i.e. X_{t+1} = X_t − {e} • if e ∉ X_t and the endpoints of e are unmatched in X_t, then add it, i.e. X_{t+1} = X_t ∪ {e} • otherwise, let X_{t+1} = X_t
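A runnable sketch of one step of this chain, representing the graph as a list of edges and a matching as a set of frozensets (names and representation are my choices, not the slides'):

```python
import random

def mc_step(matching, edges):
    """One step of the matchings Markov chain described above.

    matching : current matching, a set of frozenset({u, v}) edges
    edges    : list of all edges of G, each a frozenset({u, v})
    """
    e = random.choice(edges)                     # uniformly random edge
    matched = {v for f in matching for v in f}   # currently matched vertices
    if e in matching:
        return matching - {e}                    # e present: remove it
    if not (e & matched):
        return matching | {e}                    # endpoints free: add it
    return matching                              # otherwise stay put

# run the chain on a 4-cycle from the empty matching
edges = [frozenset(p) for p in [(0, 1), (1, 2), (2, 3), (3, 0)]]
X = set()
for _ in range(1000):
    X = mc_step(X, edges)
```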

  13. [Section 3.3] Markov chains Note: we consider only discrete-time MCs on a finite state space Ω. Def: The transition matrix of a MC is the |Ω| × |Ω| matrix P where P(x,y) = Pr(X_{t+1} = y | X_t = x). Example: MC for matchings: P(X,Y) = 1/m whenever matchings X ≠ Y differ in exactly one edge (and the move is allowed), and P(X,X) carries the remaining probability mass.
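The matrix can be written out explicitly for a small graph by enumerating all matchings and replaying the chain's rule for each edge. A brute-force sketch (illustrative only, since |Ω| is exponential in general), reusing the edge representation from the previous sketch:

```python
from itertools import combinations

def all_matchings(edges):
    """Enumerate every matching of the graph (brute force)."""
    out = []
    for k in range(len(edges) + 1):
        for sub in combinations(edges, k):
            verts = [v for e in sub for v in e]
            if len(verts) == len(set(verts)):    # no shared endpoints
                out.append(frozenset(sub))
    return out

def transition_matrix(edges):
    """Build the |Omega| x |Omega| transition matrix of the chain."""
    states = all_matchings(edges)
    index = {s: i for i, s in enumerate(states)}
    m = len(edges)
    P = [[0.0] * len(states) for _ in states]
    for s in states:
        matched = {v for f in s for v in f}
        for e in edges:
            if e in s:
                t = s - {e}                      # remove e
            elif not (e & matched):
                t = s | {e}                      # add e
            else:
                t = s                            # blocked: stay
            P[index[s]][index[t]] += 1.0 / m     # each edge chosen w.p. 1/m
    return states, P
```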

  14. [Section 3.3] Markov chains • Def: A stationary distribution is a π: Ω → [0,1] satisfying • π(y) = Σ_{x∈Ω} π(x) P(x,y) for every y ∈ Ω • An MC is • irreducible if for every x, y ∈ Ω there is a t s.t. P^t(x,y) > 0 • aperiodic if gcd{t : P^t(x,x) > 0} = 1 for every x ∈ Ω • ergodic if both irreducible and aperiodic • Thm 3.6: An ergodic MC has a unique stationary distribution π; moreover, P^t(x,y) converges to π(y) as t → ∞.
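Theorem 3.6 can be illustrated numerically with the matrix built above: iterating μ ← μP from any starting distribution converges to the stationary distribution. A plain-Python sketch (the step count is an arbitrary illustrative choice):

```python
def power_iterate(P, steps=2000):
    """Iterate mu <- mu * P from a point mass on state 0; for an
    ergodic chain mu converges to the stationary distribution (Thm 3.6)."""
    n = len(P)
    mu = [1.0] + [0.0] * (n - 1)
    for _ in range(steps):
        mu = [sum(mu[x] * P[x][y] for x in range(n)) for y in range(n)]
    return mu

# the matchings chain on the 4-cycle from the earlier sketch
states, P = transition_matrix([frozenset(p) for p in [(0, 1), (1, 2), (2, 3), (3, 0)]])
mu = power_iterate(P)   # approximately uniform (see Lemma 3.7 on the next slide)
```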

  15. [Section 3.3] Markov chains Lemma 3.7: If π′ satisfies π′(x)P(x,y) = π′(y)P(y,x) for every x, y ∈ Ω, and Σ_{x∈Ω} π′(x) = 1, then π′ is a stationary distribution of the MC. Note: the condition in Lemma 3.7 is known as detailed balance (and an MC admitting such a π′ is called reversible).
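For the matchings chain, P is symmetric: P(X,Y) = P(Y,X) = 1/m for any two matchings differing in one edge. Hence the uniform distribution satisfies detailed balance and, by Lemma 3.7, is stationary, which is exactly what makes this chain a candidate for uniform sampling. A quick numeric check with the earlier sketches:

```python
def check_detailed_balance(P, pi):
    """Check pi(x) P(x,y) == pi(y) P(y,x) for all pairs of states."""
    n = len(P)
    return all(abs(pi[x] * P[x][y] - pi[y] * P[y][x]) < 1e-12
               for x in range(n) for y in range(n))

uniform = [1.0 / len(states)] * len(states)   # states, P from the sketch above
assert check_detailed_balance(P, uniform)     # uniform is stationary (Lemma 3.7)
```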
