Understand how sampling perfect matchings in planar graphs leads to randomized approximation schemes for counting (FPRAS), built on Markov chains. Learn about total variation distance, fully polynomial approximate samplers, and stationary distributions of Markov chains.
[Chapter 3] Sampling and counting How do we sample perfect matchings in a planar graph? [Sampling: uniformly at random, i.e. each matching is equally likely.]
[Section 3.1] Sampling and counting The process can also be run in reverse: using sampling to (approximately) count. Def: A randomized approximation scheme for a counting problem f is a randomized algorithm that, given an input x and an error tolerance ε > 0, outputs a value N s.t. Pr( (1−ε)f(x) ≤ N ≤ (1+ε)f(x) ) ≥ 3/4. A fully polynomial randomized approximation scheme (FPRAS) is a randomized approximation scheme that runs in time polynomial in |x| and 1/ε. Note: 3/4 can be replaced by any constant in (1/2, 1).
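The note that 3/4 can be replaced by any constant above 1/2 follows from a standard amplification argument: run the scheme several times independently and output the median. A minimal sketch, assuming some approximation scheme is given as a callable (the function and parameter names here are mine, not from the text):

```python
import statistics

def boost(appx_scheme, x, eps, trials=25):
    """Median trick: the median of `trials` independent runs lies outside
    [(1-eps)f(x), (1+eps)f(x)] only if at least half the runs fail, and
    since each run fails independently with probability <= 1/4, that
    event has probability exponentially small in `trials`."""
    return statistics.median(appx_scheme(x, eps) for _ in range(trials))
```

Since the number of trials needed for any fixed target constant is itself a constant, the boosted scheme is still an FPRAS.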
[Section 3.2] Sampling and counting (Modified) Proposition 3.4: Let G be a graph with n vertices and m ≥ 1 edges. If there is an algorithm with running time T(n,m) that produces a uniformly random perfect matching of G, then there is an FPRAS counting all perfect matchings of G in time c·m²·ε⁻²·T(n,m) for some constant c > 0.
[Section 3.2] Sampling and counting Proof of Proposition 3.4: sketch
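One way to read the proof idea: fix an edge e = (u,v); the perfect matchings of G that contain e correspond exactly to the perfect matchings of G with u and v deleted, so estimating by sampling the fraction p of matchings containing e and recursing on the smaller graph expresses the count as a product of estimated ratios. A toy sketch under that reading, with brute-force enumeration standing in for the assumed uniform sampler (all names are mine):

```python
import itertools
import random

def perfect_matchings(edges, vertices):
    """Brute force: all sets of |V|/2 edges covering every vertex once.
    Stands in for the target set of the assumed uniform sampler."""
    k = len(vertices) // 2
    return [m for m in itertools.combinations(edges, k)
            if len({v for e in m for v in e}) == len(vertices)]

def uniform_sample(edges, vertices):
    return random.choice(perfect_matchings(edges, vertices))

def estimate_count(edges, vertices, samples=2000):
    """Self-reducibility estimator. Assumes the chosen edge lies in at
    least one perfect matching (otherwise p = 0 and the ratio blows up);
    handling that case is part of the real proof, not this sketch."""
    if not vertices:
        return 1.0
    e = edges[0]
    hits = sum(e in uniform_sample(edges, vertices) for _ in range(samples))
    p = hits / samples  # estimated fraction of matchings containing e
    u, v = e
    sub_vertices = [w for w in vertices if w not in (u, v)]
    sub_edges = [f for f in edges if u not in f and v not in f]
    return estimate_count(sub_edges, sub_vertices, samples) / p
```

On the 4-cycle (two perfect matchings) the estimate comes out close to 2; the ε⁻² factor in Proposition 3.4 reflects how many samples each ratio needs for relative error ε.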
[Section 3.1] Sampling and counting Original Prop. 3.4: about approximate (not exact) samplers. Def: Total variation distance of distributions π and π' on the same countable set Ω: ||π − π'||_TV := (1/2) Σ_{ω∈Ω} |π(ω) − π'(ω)| = max_{A⊆Ω} |π(A) − π'(A)|. Def: Fully polynomial almost uniform sampler (FPAUS): given an input x and a sampling tolerance δ > 0, the sampler produces an element from a distribution within total variation distance δ of the target distribution, in time polynomial in |x| and log(1/δ).
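As a concrete check of the definition: over a countable set, total variation distance is just half the L1 distance between the probability vectors. A small sketch (representing distributions as dicts is my choice, not the text's):

```python
def tv_distance(p, q):
    """Total variation distance of two distributions given as dicts
    mapping outcomes to probabilities: half the L1 distance, taken
    over the union of the two supports (missing keys count as 0)."""
    support = set(p) | set(q)
    return 0.5 * sum(abs(p.get(w, 0.0) - q.get(w, 0.0)) for w in support)
```

Disjoint supports give distance 1, identical distributions give 0, matching the max_{A⊆Ω} form of the definition.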
[Section 3.3] Markov chains
• Def: A Markov chain is a sequence (X_t)_{t=0,…,∞} of random variables with state space Ω, where X_{t+1} depends only on X_t.
• Example: MC for matchings:
• Ω = ?
• X_0 = ∅
• for t ≥ 0: choose a random edge e
• if e ∈ X_t, remove it, i.e. X_{t+1} = X_t − {e}
• if e ∉ X_t and the endpoints of e are not matched in X_t, then add it, i.e. X_{t+1} = X_t ∪ {e}
• otherwise, let X_{t+1} = X_t
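The transition rule above can be written directly. A sketch of one step, representing a matching as a set of edges (vertex pairs); the representation is my choice:

```python
import random

def step(matching, edges):
    """One transition of the matchings chain: pick a uniformly random
    edge and toggle it when the move keeps the state a matching."""
    e = random.choice(edges)
    u, v = e
    if e in matching:                     # e in X_t: remove it
        return matching - {e}
    matched = {w for f in matching for w in f}
    if u not in matched and v not in matched:
        return matching | {e}             # both endpoints free: add e
    return matching                       # otherwise stay put
```

Starting from X_0 = ∅ and iterating `step` keeps the state a matching at every time t, since each move either deletes an edge or adds one between two unmatched vertices.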
[Section 3.3] Markov chains Note: we consider only discrete-time MCs on a finite state space. Def: The transition matrix of a MC is the |Ω| × |Ω| matrix P with P(x,y) = Pr(X_{t+1} = y | X_t = x). Example: MC for matchings:
[Section 3.3] Markov chains
• Def: A stationary distribution is a π: Ω → [0,1] satisfying π(y) = Σ_{x∈Ω} π(x)P(x,y) for every y ∈ Ω.
• An MC is
• irreducible if for every x, y ∈ Ω there is a t s.t. P^t(x,y) > 0
• aperiodic if gcd{t: P^t(x,x) > 0} = 1 for every x ∈ Ω
• ergodic if both irreducible and aperiodic
• Thm 3.6: An ergodic MC has a unique stationary distribution π; moreover, P^t(x,y) converges to π(y) as t → ∞.
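Thm 3.6 can be observed numerically on a tiny chain: iterating the distribution row against P drives it toward the stationary distribution regardless of the start state. A sketch, with the transition matrix given as a list of rows (the 2-state example in the usage note is mine, not from the text):

```python
def distribution_after(P, t, start=0):
    """Row of P^t for the given start state, i.e. the distribution of
    X_t when X_0 = start; for an ergodic chain this converges to the
    stationary distribution as t grows (Thm 3.6, checked numerically)."""
    n = len(P)
    row = [0.0] * n
    row[start] = 1.0
    for _ in range(t):
        row = [sum(row[x] * P[x][y] for x in range(n)) for y in range(n)]
    return row
```

For P = [[0.5, 0.5], [0.25, 0.75]] the stationary distribution solves π(0) = 0.5·π(0) + 0.25·π(1), giving π = (1/3, 2/3), and `distribution_after` approaches it from either start state.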
[Section 3.3] Markov chains Lemma 3.7: If π' satisfies π'(x)P(x,y) = π'(y)P(y,x) for every x, y ∈ Ω, and Σ_{x∈Ω} π'(x) = 1, then π' is a stationary distribution of the MC. Note: the condition in Lemma 3.7 is known as detailed balance (and an MC admitting such a π' is called reversible).
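Detailed balance is easy to verify mechanically, and by Lemma 3.7 any normalized π' passing the check is stationary. A sketch (matrix-as-list-of-rows, as above; the examples in the test are mine):

```python
def satisfies_detailed_balance(P, pi, tol=1e-9):
    """Check the detailed-balance condition of Lemma 3.7:
    pi(x) P(x,y) == pi(y) P(y,x) for every pair of states."""
    n = len(P)
    return all(abs(pi[x] * P[x][y] - pi[y] * P[y][x]) <= tol
               for x in range(n) for y in range(n))
```

For the 2-state chain P = [[0.5, 0.5], [0.25, 0.75]] with π = (1/3, 2/3), both sides of the condition equal 1/6, while a deterministic 3-cycle with the uniform distribution fails it: probability flows one way around the cycle, so the chain is stationary but not reversible.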