
Linear Time Approximation Schemes for the Gale-Berlekamp Game and Related Minimization Problems

Linear Time Approximation Schemes for the Gale-Berlekamp Game and Related Minimization Problems. Marek Karpinski (Bonn) and Warren Schudy (Brown), STOC 2009. Please see http://www.cs.brown.edu/~ws/papers/gb.pdf for the most current version of the paper.





Presentation Transcript


  1. Linear Time Approximation Schemes for the Gale-Berlekamp Game and Related Minimization Problems. Marek Karpinski (Bonn), Warren Schudy (Brown). STOC 2009. Please see http://www.cs.brown.edu/~ws/papers/gb.pdf for the most current version of the paper.

  2. Gale-Berlekamp Game (1960s) • Minimize the number of lit light bulbs by toggling row and column switches • NP-hard [Roth & Viswanathan ’08] • Previous PTAS: runtime n^O(1/ε²) [Bazgan, Fernandez de la Vega, & Karpinski ’03] • We give a PTAS with linear runtime O(n²) + 2^O(1/ε²)
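To make the game concrete: once the row switches are fixed, each column switch can be set greedily (flip the column iff that unlights more bulbs than it lights), so the exact optimum is a brute-force search over the 2^n row patterns. A minimal sketch, exponential in n and not the paper's algorithm; all names here are illustrative:

```python
from itertools import product

def lit_bulbs(board, row_flips):
    """Given a 0/1 board and a row-flip pattern, choose each column
    flip greedily and return the resulting number of lit bulbs."""
    n = len(board)
    total = 0
    for j in range(n):
        lit = sum(board[i][j] ^ row_flips[i] for i in range(n))
        total += min(lit, n - lit)  # flip column j iff it helps
    return total

def min_lit(board):
    """Exact minimum over all 2^n row-flip patterns (fine for tiny boards)."""
    n = len(board)
    return min(lit_bulbs(board, flips) for flips in product((0, 1), repeat=n))
```

For example, the diagonal board `[[1, 0], [0, 1]]` can be cleared completely by flipping one row and one column.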

  3. Dense MIN-UNCUT • “Approximate” 2-coloring: minimize the number of uncut (monochromatic) edges • General case: O(√log n) approximation is the best known; no PTAS unless P=NP • [Everywhere-]dense case, i.e. every vertex has degree Ω(n) • Previous best PTAS: runtime n^O(1/ε²) [Arora, Karger, & Karpinski ’95] • We give a PTAS with linear runtime O(n²) + 2^O(1/ε²) • With three colors there is no PTAS unless P=NP • Average degree Ω(n) is insufficient for a PTAS unless P=NP • [Figure: an uncut (monochromatic) edge; an added complete bipartite graph]

  4. Generalization: Fragile dense MIN-k-CSP • n variables taking values from a constant-sized domain (GB Game: switches; MIN-UNCUT: vertices) • Soft constraints, each depending on k variables (GB Game: light bulbs; MIN-UNCUT: edges) • The constraints are fragile, i.e. changing the value of a variable makes all satisfied constraints it participates in unsatisfied (for all assignments) • Dense, i.e. each variable appears in Ω(n^(k−1)) constraints • First conceptual contribution: unifying these PTASs (and others) in a new “fragile” framework • We give the first PTAS for all fragile dense MIN-k-CSPs; it has linear runtime O(n^k) + 2^O(1/ε²)

  5. Another fragile problem: Multiway Cut • Vertices are variables; edges are soft constraints • These constraints are fragile, i.e. changing the value of a variable makes all satisfied constraints it participates in unsatisfied • The general case has an O(1) approximation but no PTAS • Dense case: previous best PTAS runtime n^O(1/ε²) [Arora, Karger, & Karpinski ’95]; we give a PTAS with linear runtime O(n²) + 2^O(1/ε²)

  6. Summary of results • Runtimes for 1+ε approximation on [everywhere-]dense instances • [Table of runtimes not captured in transcript; our runtimes are marked “essentially optimal”] • Reference key: [AKK 95] = [Arora, Karger, & Karpinski ’95]; [BFK 03] = [Bazgan, Fernandez de la Vega, & Karpinski ’03]; [GG 06] = [Giotis & Guruswami ’06]

  7. Additive error algorithms • Whenever OPT ≥ f(ε)·n^k we have f(ε)·ε·n^k = O(ε·OPT), so existing algorithms achieving additive error f(ε)·ε·n^k suffice for a PTAS [Arora, Karger, & Karpinski ’95; Fernandez de la Vega ’96; Goldreich, Goldwasser, & Ron ’98; Frieze & Kannan ’99; Alon, Fernandez de la Vega, Kannan, & Karpinski ’02; Mathieu & Schudy ’08] • Typical runtime: O(n^k) + 2^O(1/ε²) • Rest of talk focuses on: OPT small, and MIN-UNCUT

  8. Previous algorithm (1/3) – analysis version • Assumes OPT ≤ ε κ₀ n², where κ₀ is a constant • Let S be a random sample of V of size O(1/ε²)·log n • For each coloring x0 of S: • partial coloring x2 ← if the margin of v w.r.t. x0 is large, then color v greedily w.r.t. x0; else label v “ambiguous” • Extend x2 to a complete coloring x3 greedily • Return the best coloring x3 found • For the analysis, let x0 = x* restricted to S • Runtime: 2^|S| = 2^(O(1/ε²)·log n) = n^O(1/ε²)

  9. Previous Algorithm (2/3) • Define the margin of vertex v w.r.t. coloring x to be |(number of green neighbors of v in x) − (number of red neighbors of v in x)| • Sample x0 of OPT; partial coloring x2 ← if the margin of v w.r.t. x0 is large, then color v greedily w.r.t. x0; else label v “ambiguous” • Key facts (recall the dense assumption): • Partial coloring x2 agrees with the optimal coloring x* • There are few ambiguous vertices • [Figure: example run on vertices A–F; vertices with small margins, e.g. “blue 2 to 1 – margin is too small”, are left ambiguous]
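The margin is just a neighbor count. A minimal sketch in Python; the function name and the adjacency-list representation are my own, not the paper's:

```python
def margin(v, coloring, adj):
    """Margin of vertex v w.r.t. a (possibly partial) red/green coloring:
    |#green neighbors - #red neighbors|. Uncolored neighbors are ignored."""
    g = sum(1 for u in adj[v] if coloring.get(u) == 'green')
    r = sum(1 for u in adj[v] if coloring.get(u) == 'red')
    return abs(g - r)
```

A vertex with two green and one red neighbor has margin 1; the algorithm only commits to a color when this value is large.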

  10. Previous algorithm (3/3) • [Figure: the partial coloring x2 on the left; on the right, x3 extends x2 greedily]

  11. Previous algorithm → Our algorithm (intermediate version) • Assume OPT ≤ κ₁ n² (previously ε κ₀ n²) • Let S be a random sample of V of constant size κ₂ (previously O(1/ε²)·log n) • For each coloring x0 of S: • x1 ← greedy w.r.t. x0 • partial coloring x2 ← if the margin of v w.r.t. x1 is large, then color v greedily w.r.t. x1; else label v “ambiguous” • Color the ambiguous vertices using an algorithm with additive error at most Err = κ₃ ε n · (# ambiguous) • Return the best coloring x3 found • Second conceptual contribution: two greedy phases before declaring ambiguity allow a constant sample size • Third conceptual contribution: use an additive error algorithm to color the ambiguous vertices • Runtime: O(n²) + 2^O(1/ε⁴), improved to O(n²) + 2^O(1/ε²) (previously n^O(1/ε²))
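To make the three-phase structure (sample → greedy x1 → margin-based x2 → finish the ambiguous vertices) concrete, here is a runnable toy version in Python. The constants (sample size, margin threshold) are illustrative, ties are broken arbitrarily, and the ambiguous vertices are finished by brute force rather than by an additive-error algorithm, so this sketches the structure only, not the paper's algorithm or its runtime:

```python
import random
from itertools import product

def uncut_cost(x, edges):
    """Number of monochromatic (uncut) edges under coloring x."""
    return sum(1 for u, v in edges if x[u] == x[v])

def greedy(v, ref, adj):
    """Give v the minority color among its ref-colored neighbors
    (this minimizes monochromatic edges incident to v)."""
    c0 = sum(1 for u in adj[v] if ref.get(u) == 0)
    c1 = sum(1 for u in adj[v] if ref.get(u) == 1)
    return 0 if c0 <= c1 else 1

def min_uncut_sketch(n, edges, sample_size=3, threshold=1):
    """Toy MIN-UNCUT pipeline: sample S, try every coloring x0 of S,
    build x1 greedily from x0, commit only large-margin vertices in x2,
    then brute-force the ambiguous ones (exponential in their number)."""
    adj = {v: [] for v in range(n)}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    S = random.sample(range(n), min(sample_size, n))
    best = None
    for bits in product((0, 1), repeat=len(S)):
        x0 = dict(zip(S, bits))
        x1 = {v: greedy(v, x0, adj) for v in range(n)}   # first greedy phase
        x2 = {}
        for v in range(n):                               # margin test
            c0 = sum(1 for u in adj[v] if x1[u] == 0)
            c1 = sum(1 for u in adj[v] if x1[u] == 1)
            x2[v] = greedy(v, x1, adj) if abs(c0 - c1) >= threshold else None
        amb = [v for v in range(n) if x2[v] is None]
        for fill in product((0, 1), repeat=len(amb)):    # finish ambiguous
            x3 = dict(x2)
            x3.update(zip(amb, fill))
            if best is None or uncut_cost(x3, edges) < uncut_cost(best, edges):
                best = x3
    return best
```

On a complete bipartite graph the sketch recovers the 2-coloring with zero uncut edges, since the correct coloring of the sample propagates through both greedy phases.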

  12. More Algorithm (1/2) • Sample x0 of OPT; x1 is greedy w.r.t. (with respect to) x0 • [Figure: each vertex picks its color greedily from the sampled colors, e.g. “C is blue, so I like being red”; “E is red, so I’ll go blue”]

  13. More Algorithm (2/2) • x2 is greedy w.r.t. x1 • [Figure: vertices with large margins (e.g. “blue 4 to 0”) are colored; vertices with small margins (e.g. “blue 2 to 1 – margin is too small”) are ambiguous, and the additive error algorithm colors them]

  14. Plan of analysis • Main Lemma (≈ Lemma 16): • Coloring x2 agrees with the optimal coloring x* • The additive error Err = κ₃ ε n · (# ambiguous) is at most ε OPT

  15. Proof (1/3): Bounding OPT • Assume all degrees are at least δn • Vertex v is balanced if its margin w.r.t. the optimum assignment x* is at most δn/3 • Lemma 12: #(balanced vertices) ≤ 6 OPT / (δn) • Proof: • If v is balanced then v is incident in x* to at least δn/3 uncut edges • OPT = ½ ∑_v #(uncut edges incident to v) ≥ ½ ∑_{v balanced} #(uncut edges incident to v) ≥ ½ #(balanced vertices) · (δn/3) • [Figure: a balanced vertex with roughly equal red and green neighbor counts (1 ≈ 3)]
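Written out, the inequality chain in the proof of Lemma 12 is:

```latex
\mathrm{OPT}
  \;=\; \tfrac{1}{2}\sum_{v} \#\{\text{uncut edges incident to } v\}
  \;\ge\; \tfrac{1}{2}\sum_{\text{balanced } v} \#\{\text{uncut edges incident to } v\}
  \;\ge\; \tfrac{1}{2}\,\#\{\text{balanced vertices}\}\cdot\frac{\delta n}{3},
```

and rearranging gives #(balanced vertices) ≤ 6·OPT/(δn), as stated.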

  16. Proof (2/3): Relating x1 to the OPT coloring • Lemma 14: with probability at least 90%, at most δn/24 vertices are colored differently in x1 and x* • Proof: • Case 1: balanced vertices. By Lemma 12, #(balanced) ≤ 6 OPT / (δn) ≤ 6 (κ₁ n²) / (δn) = δn/48 by the choice of κ₁ • Case 2: unbalanced vertices. Chernoff and Markov bounds imply that the number of miscolored unbalanced vertices is at most δn/48 • Corollary: with probability at least 90%, every vertex’s margin w.r.t. x* is within δn/12 of its margin w.r.t. x1

  17. Proof (3/3): Proof of the main lemma • Proof that x2 agrees with the optimal coloring x*: • Assume v is colored by x2 • Then v has a big margin w.r.t. x1 • Then, by the Corollary, v is colored by x* the same way as by x2 • Proof that the additive error Err = κ₃ ε n · (# ambiguous) is at most ε OPT: • Assume v is not colored by x2 (ambiguous) • Then v has a small margin w.r.t. x1 • Then, by the Corollary, v has a small margin w.r.t. x* (balanced) • So (# ambiguous) ≤ (# balanced) • Bounding (# ambiguous) by (# balanced) in Err and applying Lemma 12 gives Err ≤ ε OPT

  18. Correlation Clustering with ≤ d clusters • Previous best PTAS: runtime n^O(1/ε²) [Giotis & Guruswami ’06] • We give a PTAS with runtime n²·2^O(1/ε²) (linear time) • Correlation clustering constraints are not fragile for d > 2, but they satisfy a generalization we call rigidity

  19. Correlation Clustering and Rigidity • Definition of a rigid CSP: in any assignment, a vertex in a large cluster is either incident to many incorrect edges or would be incident to many if moved to any other cluster • Fragility implies rigidity • Key additional algorithmic technique (also used in [GG 06]): after identifying some clear-cut variables, fix them and recurse on the remaining variables

  20. Directions • More applications of the fragility and rigidity methods to other minimization problems; this might require generalizing the notion of rigidity to k-CSP problems • Improving runtimes for Correlation Clustering, replacing “·” with “+” in O(n²)·2^O(1/ε²) • Designing linear time (1+ε)-approximation algorithms for the k-Clustering (MIN-SUM) problem

  21. Bonus slides

  22. MIN-3-UNCUT • MIN-3-UNCUT constraints are not fragile • Dense MIN-3-UNCUT is at least as hard as general MIN-2-UNCUT, so it has no PTAS unless P=NP • [Figure: reduction from a general MIN-2-UNCUT instance on n vertices to a dense MIN-3-UNCUT instance by adding a complete tripartite graph on three groups of 10n² vertices]
