
An FPTAS for #Knapsack and Related Counting Problems

This paper explores counting problems and their approximate solutions, focusing on the knapsack problem. It also discusses the necessity of randomness in counting, the dependence on epsilon, and deterministic approximation algorithms. The paper introduces a dynamic programming algorithm for counting knapsack solutions, as well as ways to deal with additional constraints and generalize to distributions given by small space sources. Other problems, such as contingency tables, are also discussed.


Presentation Transcript


  1. An FPTAS for #Knapsack and Related Counting Problems Parikshit Gopalan Adam Klivans Raghu Meka Daniel Štefankovič Santosh Vempala Eric Vigoda

  2. An FPTAS for #Knapsack and Related Counting Problems Parikshit Gopalan, Adam Klivans, Raghu Meka Daniel Štefankovič, Santosh Vempala, Eric Vigoda

  3. What can be counted (in polynomial time) exactly? Very little... the number of spanning trees (using the determinant), Kirchhoff'1847; perfect matchings in planar graphs (using Pfaffians), Kasteleyn'1960. (The rest is usually #P-hard.)

  4. What can be counted (in polynomial time) approximately? A little more... perfect matchings in bipartite graphs (permanent of non-negative matrices), Jerrum, Sinclair, Vigoda'2001; ferromagnetic Ising model, Jerrum, Sinclair'1989; independent sets (Δ ≤ 5), Weitz'2004; k-colorings (k > (11/6)Δ), Vigoda'1999; .... (approximate counting ⇔ random sampling, Jerrum, Valiant, Vazirani'1986)

  5. Approximate counting (in polynomial time). Let Q be the true count. Deterministic: on the INPUT, the algorithm outputs OUT with (1-ε)Q ≤ OUT ≤ (1+ε)Q. Randomized: the algorithm outputs OUT with P( (1-ε)Q ≤ OUT ≤ (1+ε)Q ) ≥ 1-δ.

  6. Deterministic: not too many examples: independent sets in degree-5 graphs (Weitz'2004), matchings in bounded-degree graphs (Bayati, Gamarnik, Katz, Nair, Tetali'2007), satisfying assignments of DNF formulas with terms of size ≤ C (Ajtai, Wigderson'1985). Randomized: more examples; Monte Carlo, usually using a Markov chain (dependence on ε is 1/ε²).

  7. 1) Is randomness necessary? Is P = BPP? Primes ∈ P (Agrawal, Kayal, Saxena 2002). 2) Dependence on ε? Monte Carlo needs 1/ε².

  8. Knapsack (optimization). INPUT: (w1,v1),...,(wn,vn), L (integer weights wi and values vi). OUTPUT: S ⊆ [n] maximizing Σ_{i∈S} vi subject to Σ_{i∈S} wi ≤ L.

  9. Dynamic program #1 (L is small). T[i,M] = optimal value achievable with items 1,...,i and weight limit M. Recursion: T[i,M] = max { T[i-1,M], T[i-1,M-wi] + vi }.
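
A minimal Python sketch of this recursion (illustrative function name and demo instance, not from the deck), assuming 0/1 items with non-negative integer weights:

```python
def knapsack_max_value(items, L):
    # Dynamic program #1: T[i][M] = best value achievable with items 1..i under
    # weight limit M.  Pseudo-polynomial: O(n*L) time, so it is efficient only
    # when the limit L is small.
    n = len(items)
    T = [[0] * (L + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        w, v = items[i - 1]
        for M in range(L + 1):
            T[i][M] = T[i - 1][M]                        # skip item i
            if w <= M:                                   # or take it, if it fits
                T[i][M] = max(T[i][M], T[i - 1][M - w] + v)
    return T[n][L]


# Items are (weight, value) pairs; the optimum for this instance is 7.
print(knapsack_max_value([(3, 4), (2, 3), (4, 4)], 6))
```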

  10. Dynamic program #2 (vi's are small). T[i,V] = smallest weight of a subset of 1,...,i with value ≥ V. Recursion: T[i,V] = min { T[i-1,V], T[i-1,V-vi] + wi }. Rounding the values down to a coarse grid turns this into the classical approximation algorithm (FPTAS) for the optimization problem.
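
A matching sketch of the value-indexed table (same illustrative instance); the last line shows how the table recovers the optimum, and running it on rounded-down values is what yields the approximation algorithm mentioned on the slide:

```python
def knapsack_min_weight(items, L):
    # Dynamic program #2: T[V] = smallest total weight of a subset of the items
    # processed so far with value >= V.  O(n * sum(v_i)) time, so it is
    # efficient when the values are small (or have been rounded to be small).
    INF = float('inf')
    Vmax = sum(v for _, v in items)
    T = [0] + [INF] * Vmax               # no items yet: only value 0 is reachable
    for w, v in items:
        # either skip the item or take it (value v, weight w)
        T = [min(T[V], T[max(0, V - v)] + w) for V in range(Vmax + 1)]
    # The optimum is the largest value whose cheapest subset still fits in L.
    return max(V for V in range(Vmax + 1) if T[V] <= L)


# Same instance as above: the best value within weight limit 6 is again 7.
print(knapsack_min_weight([(3, 4), (2, 3), (4, 4)], 6))
```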

  11. Counting knapsack. INPUT: w1,...,wn, L. OUTPUT: how many S ⊆ [n] with Σ_{i∈S} wi ≤ L are there? This counting problem is #P-hard.
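
For reference, a brute-force counter for this problem (exponential time, only for tiny instances); it is used below as a sanity check for the approximate counters. Names are illustrative:

```python
from itertools import combinations

def count_knapsack_bruteforce(w, L):
    # Counts the subsets S of {0,...,n-1} with sum_{i in S} w[i] <= L
    # by exhaustive enumeration over all 2^n subsets.
    n = len(w)
    return sum(1 for r in range(n + 1)
                 for S in combinations(range(n), r)
                 if sum(w[i] for i in S) <= L)


print(count_knapsack_bruteforce([3, 2, 4], 6))   # 6 of the 8 subsets fit
```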

  12. Counting knapsack. Dyer, Frieze, Kannan, Kapoor, Perkovic, Vazirani'1993: exp(O*(n^{1/2}))/ε² randomized approximation algorithm. Morris, Sinclair'1999: O(n^c/ε²) randomized approximation algorithm (MCMC, canonical paths). Dyer'2003: O(n^{2.5} + n²/ε²) randomized approximation algorithm (dynamic programming). OURS: O*(n³/ε), deterministic.

  13. Dyer'2003: T[i,M] = number of solutions with items 1,...,i and weight limit M. Recursion: T[i,M] = T[i-1,M] + T[i-1,M-wi]. Dynamic programming + rejection sampling ⇒ approximate counter.
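
A sketch of Dyer's counting table in Python (illustrative names); the recursion is exactly the one on the slide, and the rounding that makes it genuinely polynomial is sketched after the next slide:

```python
def count_knapsack_exact(w, L):
    # T[M] = number of subsets of the items processed so far with total
    # weight <= M, updated as T[i][M] = T[i-1][M] + T[i-1][M - w_i].
    # Pseudo-polynomial O(n*L) time.
    T = [1] * (L + 1)                        # no items: only the empty set
    for wi in w:
        newT = list(T)
        for M in range(wi, L + 1):
            newT[M] = T[M] + T[M - wi]       # subsets avoiding / containing the item
        T = newT
    return T[L]


print(count_knapsack_exact([3, 2, 4], 6))    # 6, matching the brute-force count
```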

  14. Rounding: wi' = n² wi / L, L' = n²; wi'' = ⌊wi'⌋. • We get more solutions: Ω'' ⊇ Ω'. • But not too many more: |Ω''| ≤ (n+1)|Ω'|. Proof: take S'' ∈ Ω'' - Ω', let X be the heaviest item of S''; then S'' - {X} ∈ Ω'. Dynamic programming on the rounded instance + rejection sampling ⇒ approximate counter.
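
A sketch of the rounding step, reusing count_knapsack_exact from the previous snippet; the helper name and the demo are illustrative:

```python
def dyer_rounding(w, L):
    # Rescale so the capacity becomes n^2 and round the weights down:
    # w_i'' = floor(n^2 * w_i / L), L' = n^2.  Every original solution survives,
    # and by the slide's argument (drop the heaviest item of a new solution)
    # the rounded instance has at most (n+1) times as many solutions, so exact
    # counting on it plus rejection sampling approximates the original count.
    n = len(w)
    return [(n * n * wi) // L for wi in w], n * n


w2, L2 = dyer_rounding([3, 2, 4], 6)
print(count_knapsack_exact(w2, L2))          # 6 here: at least the true count
```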

  15. Our dynamic program (deterministic approximation algorithm). τ(i,A) = smallest M such that the knapsack with weights w1,...,wi and capacity M has ≥ A solutions. Recursion: τ(i,A) = min over α∈[0,1] of max { τ(i-1, αA), τ(i-1, (1-α)A) + wi }.

  16. Discretization: Q = 1+ε/(n+1), s = ⌈n log_Q 2⌉, table T[0..n, 0..s]. Recursion: T(i,j) = min over α∈[0,1] of max { T(i-1, j+log_Q α), T(i-1, j+log_Q(1-α)) + wi } (indices rounded to integers). Lemma 1: τ(i, Q^{j-i}) ≤ T[i,j] ≤ τ(i, Q^j).

  17. T(i-1,j+lnQ) { T(i,j)=min max T(i-1,j+lnQ(1-))+wi [0,1] Lemma 2: can compute recursion efficiently only few values of  matter Q-j,....,Q0, 1-Q0, .... , 1-Qj can use binary search n3 TOTAL RUN TIME = O( log(n/)) 

  18. How to deal with more constraints Σ_{i∈S} w_{j,i} ≤ L_j (e.g., contingency tables, multi-dimensional knapsack, ...)? Multi-dimensional knapsack: how many S ⊆ [n] with Σ_{i∈S} w_{j,i} ≤ L_j for all j∈{1,...,k} are there? We obtain an O((n/ε)^{O(k²)} log W) algorithm.

  19. Read-once branching programs (ROBPs). • Layered directed graph, n layers. • S vertices per layer (width S). • Edges between consecutive layers. • Edges labeled 0/1. • Input: x ∈ {0,1}^n. • Output: label of the final vertex reached. Counting the number of accepting inputs (paths)? Dynamic programming, time = O(nS).
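
A small Python sketch of the O(nS) counting DP over an ROBP; the dictionary encoding of the layers and the two-bit OR example are illustrative, not from the deck:

```python
def count_accepting_inputs(layers, accepting, start=0):
    # layers[i][v] = (v0, v1): from vertex v of layer i, read the next input
    # bit and move to v0 on a 0 and to v1 on a 1.  Forward dynamic programming
    # propagates, for every vertex, the number of inputs reaching it.
    counts = {start: 1}
    for layer in layers:
        nxt = {}
        for v, c in counts.items():
            for target in layer[v]:              # the 0-edge and the 1-edge
                nxt[target] = nxt.get(target, 0) + c
        counts = nxt
    return sum(c for v, c in counts.items() if v in accepting)


# Width-2 ROBP computing x1 OR x2 (vertex 1 means "already saw a 1").
layers = [{0: (0, 1)}, {0: (0, 1), 1: (1, 1)}]
print(count_accepting_inputs(layers, accepting={1}))    # 3 of the 4 inputs accepted
```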

  20. ROBP for knapsack: n layers, one per item; a vertex of layer i records the partial weight of the items chosen so far. Problem: the width is too large (up to L). Solution: reduce the width by approximating.

  21. Monotone ROBPs. A(u) = set of accepting paths from u. Monotone: u ≼ v ⇒ A(u) ⊆ A(v). The ordering is given implicitly: • ordering: given u,v, decide whether u ≼ v; • midpoint: given u,v, find w such that |{x : u ≼ x ≼ w}| = |{x : w ≼ x ≼ v}| ± 1; • transitions: given u, get the out-neighbors of u.

  22. Group the vertices in each layer according to the rough number of accepting paths from them, processing the layers right to left, so that the layers to the right are already "shrunk".

  23. More constraints? The approach can be generalized to distributions given by small space sources. Small space sources = ROBP + probability distributions on the outgoing edges (a vertex takes its two outgoing edges with probabilities p and 1-p); n layers.
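
The same forward DP with probability mass in place of path counts gives a sketch of how a small space source is evaluated; the encoding (p[i][v] = probability of the 1-edge at vertex v of layer i) and the example are illustrative:

```python
def acceptance_probability(layers, p, accepting, start=0):
    # Small space source: an ROBP whose vertex v in layer i emits a 1 with
    # probability p[i][v] and a 0 otherwise.  Propagating probability mass
    # through the layers computes the acceptance probability in O(n*S) time.
    mass = {start: 1.0}
    for layer, pl in zip(layers, p):
        nxt = {}
        for v, m in mass.items():
            v0, v1 = layer[v]
            nxt[v0] = nxt.get(v0, 0.0) + m * (1.0 - pl[v])
            nxt[v1] = nxt.get(v1, 0.0) + m * pl[v]
        mass = nxt
    return sum(m for v, m in mass.items() if v in accepting)


# With unbiased bits (p = 1/2 everywhere), the OR example above accepts w.p. 3/4.
layers = [{0: (0, 1)}, {0: (0, 1), 1: (1, 1)}]
p = [{0: 0.5}, {0: 0.5, 1: 0.5}]
print(acceptance_probability(layers, p, accepting={1}))   # 0.75
```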

  24. More constraints? Two n-layer ROBPs of width S can be combined to get an (S², n)-ROBP for their intersection; additive approximation is preserved.

  25. 1) The uniform distribution on Ω'' can be given by a small space source. 2) Additive approximation ⇒ multiplicative approximation.

  26. Other problems: contingency tables with a constant number of rows. What other problems are solvable using this technique? Thanks!
