

  1. COMP 482: Design and Analysis of Algorithms Prof. Swarat Chaudhuri Spring 2013 Lecture 20

  2. Recap: Project Selection • Projects with prerequisites. • Set P of possible projects. Project v has associated revenue pv (can be positive or negative). • Some projects generate money: create interactive e-commerce interface, redesign web page. • Others cost money: upgrade computers, get site license. • Set of prerequisites E. If (v, w) ∈ E, can't do project v unless we also do project w. • A subset of projects A ⊆ P is feasible if the prerequisite of every project in A also belongs to A. • Project selection. Choose a feasible subset of projects to maximize revenue.
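
  A minimal sketch (not from the slides) of the definitions above: checking whether a candidate subset A is feasible and computing its revenue. The helper names and the example revenue values are illustrative assumptions.

    # Illustrative helpers for the recap above; names and example numbers are ours.
    def is_feasible(A, E):
        # A is feasible if, for every prerequisite edge (v, w) with v in A, w is also in A.
        A = set(A)
        return all(w in A for (v, w) in E if v in A)

    def revenue(A, p):
        # Total (possibly negative) revenue of the chosen subset A.
        return sum(p[v] for v in A)

    # Edges chosen so that {v, w, x} is feasible and {v, x} is not, as on the next slide;
    # the revenue values are made up.
    E = [('v', 'w'), ('x', 'w')]
    p = {'v': 3, 'w': -2, 'x': 1}
    print(is_feasible({'v', 'w', 'x'}, E))   # True  -> feasible
    print(is_feasible({'v', 'x'}, E))        # False -> infeasible (prerequisite w missing)
    print(revenue({'v', 'w', 'x'}, p))       # 2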

  3. Recap: Project Selection: Prerequisite Graph • Prerequisite graph. • Include an edge from v to w if we can't do v without also doing w. • {v, w, x} is a feasible subset of projects. • {v, x} is an infeasible subset of projects. [Figure: prerequisite graph on v, w, x shown twice, with the feasible subset {v, w, x} and the infeasible subset {v, x} highlighted.]

  4. 7.10 Image Segmentation

  5. Image Segmentation • Image segmentation. • Central problem in image processing. • Divide image into coherent regions. • Ex: Three people standing in front of complex background scene. Identify each person as a coherent object.

  6. Image Segmentation • Foreground / background segmentation. • Label each pixel in the picture as belonging to foreground or background. • V = set of pixels, E = pairs of neighboring pixels. • ai ≥ 0 is the likelihood that pixel i is in the foreground. • bi ≥ 0 is the likelihood that pixel i is in the background. • pij ≥ 0 is the separation penalty for labeling one of i and j as foreground and the other as background. • Goals. • Accuracy: if ai > bi in isolation, prefer to label i as foreground. • Smoothness: if many neighbors of i are labeled foreground, we should be inclined to label i as foreground. • Find the partition (A, B), with A = foreground and B = background, that maximizes q(A, B) = Σi∈A ai + Σj∈B bj − Σ(i,j)∈E, |A∩{i,j}|=1 pij.
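
  As a small illustration (not from the slides), the objective q(A, B) just defined can be written directly as code; the function name and data layout are assumptions.

    def segmentation_value(A, B, a, b, p, E):
        # q(A, B): reward a_i for each foreground pixel and b_j for each background pixel,
        # minus the penalty p_ij for every neighboring pair split between A and B.
        value = sum(a[i] for i in A) + sum(b[j] for j in B)
        for (i, j) in E:
            if (i in A) != (j in A):     # i and j on different sides of the partition
                value -= p[(i, j)]
        return value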

  7. Image Segmentation • Properties of the problem. • Maximization. • No source or sink. • Undirected graph. • Turn into a minimization problem. • Maximizing q(A, B) is equivalent to minimizing the constant (Σi∈V ai + Σj∈V bj) minus q(A, B), • or alternatively, minimizing q'(A, B) = Σi∈B ai + Σj∈A bj + Σ(i,j)∈E, |A∩{i,j}|=1 pij.

  8. Q1 • Can you solve this problem using max flow or min cut?

  9. Image Segmentation • Formulate as a min cut problem. • G' = (V', E'). • Add a source s to correspond to foreground; add a sink t to correspond to background. • For each pixel i, add edge (s, i) with capacity ai and edge (i, t) with capacity bi. • Use two anti-parallel edges, each with capacity pij, instead of the undirected edge {i, j}. [Figure: network G' with source s, sink t, and a neighboring pixel pair i, j.]
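
  A sketch of the construction on this slide, assuming the networkx library for the min-cut computation (any max-flow/min-cut routine would do); the function name and pixel identifiers are ours, and pixel ids are assumed to differ from 's' and 't'.

    import networkx as nx

    def segment(pixels, neighbors, a, b, p):
        # Build G' and read a foreground/background partition off a minimum s-t cut.
        G = nx.DiGraph()
        for i in pixels:
            G.add_edge('s', i, capacity=a[i])      # cutting (s, i) puts i in background, pays a_i
            G.add_edge(i, 't', capacity=b[i])      # cutting (i, t) puts i in foreground, pays b_i
        for (i, j) in neighbors:
            G.add_edge(i, j, capacity=p[(i, j)])   # two anti-parallel edges replace
            G.add_edge(j, i, capacity=p[(i, j)])   # the undirected edge {i, j}
        cut_value, (A, B) = nx.minimum_cut(G, 's', 't')
        return A - {'s'}, B - {'t'}                # cut_value equals q'(A, B)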

  10. Image Segmentation • Consider a min cut (A, B) in G'. • A = foreground. • cap(A, B) = Σj∈B aj + Σi∈A bi + Σ(i,j)∈E, i∈A, j∈B pij, which is precisely the quantity q'(A, B) we want to minimize. • If i and j are on different sides of the cut, pij is counted exactly once. [Figure: a cut (A, B) in G'.]

  11. 7.12 Baseball Elimination

  12. Baseball Elimination • Which teams have a chance of finishing the season with the most wins? • Montreal is eliminated since it can finish with at most 80 wins, but Atlanta already has 83. • wi + ri < wj ⇒ team i is eliminated. • This is the only reason sports writers appear to be aware of. • Sufficient, but not necessary!

  Team       Wins wi   Losses li   To play ri   Against rij: Atl  Phi  NY  Mon
  Atlanta      83         71           8                       -    1    6    1
  Philly       80         79           3                       1    -    0    2
  New York     78         78           6                       6    0    -    0
  Montreal     77         82           3                       1    2    0    -

  13. Baseball Elimination • Which teams have a chance of finishing the season with the most wins? • Philly can win 83, but is still eliminated . . . • If Atlanta loses a game, then some other team wins one. • Remark. The answer depends not just on how many games each team has already won and has left to play, but also on whom they're against. (Same standings table as on slide 12.)

  14. Baseball Elimination • Baseball elimination problem. • Set of teams S. • Distinguished team s ∈ S. • Team x has won wx games already. • Teams x and y play each other rxy additional times. • Is there any outcome of the remaining games in which team s finishes with the most (or tied for the most) wins?

  15. Baseball Elimination: Max Flow Formulation • Can team 3 finish with the most wins? • Assume team 3 wins all remaining games ⇒ w3 + r3 wins. • Divvy up the remaining games so that all teams have ≤ w3 + r3 wins. • Construction: source s, a game node for each remaining pair of other teams (1-2, 1-4, 1-5, 2-4, 2-5, 4-5), a team node for each team other than 3 (1, 2, 4, 5), and sink t. • Edge (s, x-y) with capacity rxy (games left, e.g. r24 = 7); edges from game node x-y to team nodes x and y with capacity ∞; edge (x, t) with capacity w3 + r3 − wx (team x can still win this many more games).

  16. Baseball Elimination: Max Flow Formulation • Theorem. Team 3 is not eliminated iff the max flow saturates all edges leaving the source. • Integrality theorem ⇒ each remaining game between x and y is added to the number of wins for team x or team y. • Capacity on the (x, t) edges ensures no team wins too many games. [Figure: same flow network as on slide 15, with game nodes and team nodes.]
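
  A sketch of the formulation on slides 15-16, again assuming networkx for the max-flow computation; the helper name is ours, and the standings are the ones from the table above, used to confirm that Philly is eliminated.

    import networkx as nx

    def eliminated(s, wins, remaining):
        # True iff team s cannot finish with (or tied for) the most wins.
        # wins[x]: games won so far; remaining[(x, y)]: games left between x and y.
        best = wins[s] + sum(r for (x, y), r in remaining.items() if s in (x, y))
        if any(wins[x] > best for x in wins):          # trivially eliminated
            return True
        G = nx.DiGraph()
        G.add_node('src')
        G.add_node('sink')
        total = 0
        for (x, y), r in remaining.items():
            if s in (x, y) or r == 0:
                continue
            G.add_edge('src', (x, y), capacity=r)      # r_xy games still to play
            G.add_edge((x, y), x)                      # no capacity attribute = unbounded
            G.add_edge((x, y), y)
            total += r
        for x in wins:
            if x != s:
                G.add_edge(x, 'sink', capacity=best - wins[x])  # x may win at most this many more
        flow, _ = nx.maximum_flow(G, 'src', 'sink')
        return flow < total                            # eliminated iff source edges are not saturated

    # Standings from the slides: Philly is eliminated even though 80 + 3 >= 83.
    wins = {'Atl': 83, 'Phi': 80, 'NY': 78, 'Mon': 77}
    remaining = {('Atl', 'Phi'): 1, ('Atl', 'NY'): 6, ('Atl', 'Mon'): 1,
                 ('Phi', 'NY'): 0, ('Phi', 'Mon'): 2, ('NY', 'Mon'): 0}
    print(eliminated('Phi', wins, remaining))          # True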

  17. NP-completeness and computational intractability

  18. Algorithm Design Patterns and Anti-Patterns • Algorithm design patterns. Ex. • Greed. O(n log n) interval scheduling. • Divide-and-conquer. O(n log n) FFT. • Dynamic programming. O(n^2) edit distance. • Duality. O(n^3) bipartite matching. • Reductions. • Local search. • Randomization. • Algorithm design anti-patterns. • NP-completeness. O(n^k) algorithm unlikely. • PSPACE-completeness. O(n^k) certification algorithm unlikely. • Undecidability. No algorithm possible.

  19. 8.1 Polynomial-Time Reductions

  20. Classify Problems According to Computational Requirements • Q. Which problems will we be able to solve in practice? • A working definition. [Cobham 1964, Edmonds 1965, Rabin 1966] Those with polynomial-time algorithms.

  Yes                       Probably no
  Shortest path             Longest path
  Matching                  3D-matching
  Min cut                   Max cut
  2-SAT                     3-SAT
  Planar 4-color            Planar 3-color
  Bipartite vertex cover    Vertex cover
  Primality testing         Factoring

  21. Classify Problems • Desiderata. Classify problems according to those that can be solved in polynomial time and those that cannot. • Some problems provably require exponential time: • Given a Turing machine, does it halt in at most k steps? • Given a board position in an n-by-n generalization of chess, can black guarantee a win? • Frustrating news. A huge number of fundamental problems have defied classification for decades. • This chapter. Show that these fundamental problems are "computationally equivalent" and appear to be different manifestations of one really hard problem.

  22. Polynomial-Time Reduction • Desiderata'. Suppose we could solve X in polynomial time. What else could we solve in polynomial time? • Reduction. Problem X polynomial reduces to problem Y if arbitrary instances of problem X can be solved using: • a polynomial number of standard computational steps, plus • a polynomial number of calls to an oracle that solves problem Y (a computational model supplemented by a special piece of hardware that solves instances of Y in a single step). • Notation. X ≤P Y (don't confuse "reduces to" with "reduces from"). • Remarks. • We pay for the time to write down instances sent to the black box ⇒ instances of Y must be of polynomial size. • Note: this is Cook reducibility, in contrast to Karp reductions.

  23. Polynomial-Time Reduction • Purpose. Classify problems according to relative difficulty. • Design algorithms. If X ≤P Y and Y can be solved in polynomial time, then X can also be solved in polynomial time (up to the cost of the reduction). • Establish intractability. If X ≤P Y and X cannot be solved in polynomial time, then Y cannot be solved in polynomial time. • Establish equivalence. If X ≤P Y and Y ≤P X, we use the notation X ≡P Y.

  24. Reduction By Simple Equivalence Basic reduction strategies. Reduction by simple equivalence. Reduction from special case to general case. Reduction by encoding with gadgets.

  25. Independent Set • INDEPENDENT SET: Given a graph G = (V, E) and an integer k, is there a subset of vertices S ⊆ V such that |S| ≥ k, and for each edge at most one of its endpoints is in S? • Ex. Is there an independent set of size ≥ 6? Yes. • Ex. Is there an independent set of size ≥ 7? No. [Figure: graph with an independent set of size 6 highlighted.]

  26. Vertex Cover • VERTEX COVER: Given a graph G = (V, E) and an integer k, is there a subset of vertices S ⊆ V such that |S| ≤ k, and for each edge at least one of its endpoints is in S? • Ex. Is there a vertex cover of size ≤ 4? Yes. • Ex. Is there a vertex cover of size ≤ 3? No. [Figure: graph with a vertex cover of size 4 highlighted.]

  27. Vertex Cover and Independent Set • Claim. VERTEX-COVER ≡P INDEPENDENT-SET. • Pf. We show S is an independent set iff V − S is a vertex cover. [Figure: graph with an independent set S and the complementary vertex cover V − S highlighted.]

  28. Vertex Cover and Independent Set • Claim. VERTEX-COVER ≡P INDEPENDENT-SET. • Pf. We show S is an independent set iff V − S is a vertex cover. • (⇒) • Let S be any independent set. • Consider an arbitrary edge (u, v). • S independent ⇒ u ∉ S or v ∉ S ⇒ u ∈ V − S or v ∈ V − S. • Thus, V − S covers (u, v). • (⇐) • Let V − S be any vertex cover. • Consider two nodes u ∈ S and v ∈ S. • Observe that (u, v) ∉ E since V − S is a vertex cover. • Thus, no two nodes in S are joined by an edge ⇒ S is an independent set. ▪
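
  A small sketch (not from the slides) that checks the claim exhaustively on one graph: S is an independent set exactly when V − S is a vertex cover. The graph itself is made up for illustration.

    from itertools import combinations

    def is_independent_set(S, E):
        # No edge has both endpoints inside S.
        return all(not (u in S and v in S) for (u, v) in E)

    def is_vertex_cover(C, E):
        # Every edge has at least one endpoint inside C.
        return all(u in C or v in C for (u, v) in E)

    V = {1, 2, 3, 4, 5}
    E = [(1, 2), (2, 3), (3, 4), (4, 5), (5, 1), (2, 5)]
    for r in range(len(V) + 1):
        for S in map(set, combinations(V, r)):
            assert is_independent_set(S, E) == is_vertex_cover(V - S, E)
    print("claim holds on this graph")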

  29. Reduction from Special Case to General Case Basic reduction strategies. Reduction by simple equivalence. Reduction from special case to general case. Reduction by encoding with gadgets.

  30. Set Cover • SET COVER: Given a set U of elements, a collection S1, S2, . . . , Sm of subsets of U, and an integer k, does there exist a collection of ≤ k of these sets whose union is equal to U? • Sample application. • m available pieces of software. • Set U of n capabilities that we would like our system to have. • The ith piece of software provides the set Si ⊆ U of capabilities. • Goal: achieve all n capabilities using the fewest pieces of software. • Ex: U = {1, 2, 3, 4, 5, 6, 7}, k = 2; S1 = {3, 7}, S2 = {3, 4, 5, 6}, S3 = {1}, S4 = {2, 4}, S5 = {5}, S6 = {1, 2, 6, 7}. (Yes: S2 ∪ S6 = U.)

  31. Vertex Cover Reduces to Set Cover • Claim. VERTEX-COVER ≤P SET-COVER. • Pf. Given a VERTEX-COVER instance G = (V, E), k, we construct a set cover instance whose size equals the size of the vertex cover instance. • Construction. • Create SET-COVER instance: • k = k, U = E, Sv = {e ∈ E : e incident to v}. • There is a set cover of size ≤ k iff there is a vertex cover of size ≤ k. ▪ • Ex: SET COVER instance U = {1, 2, 3, 4, 5, 6, 7}, k = 2, Sa = {3, 7}, Sb = {2, 4}, Sc = {3, 4, 5, 6}, Sd = {5}, Se = {1}, Sf = {1, 2, 6, 7}. [Figure: corresponding VERTEX COVER instance with vertices a–f, edges e1–e7, and k = 2.]
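
  A sketch (not from the slides) of the construction above: mapping a VERTEX-COVER instance (G, k) to a SET-COVER instance with U = E and Sv = the edges incident to v. The function name and the example graph are illustrative assumptions.

    def vertex_cover_to_set_cover(V, E, k):
        # U = E, and for each vertex v, S_v = set of edges incident to v; k is unchanged.
        U = set(E)
        sets = {v: {e for e in E if v in e} for v in V}
        return U, sets, k

    # A collection of <= k of these sets covers U exactly when the chosen vertices
    # touch every edge, i.e. form a vertex cover of size <= k in G.
    V = {'a', 'b', 'c', 'd'}
    E = [('a', 'b'), ('b', 'c'), ('c', 'd')]
    U, sets, k = vertex_cover_to_set_cover(V, E, k=2)
    print(sets['b'])   # the edges incident to b: ('a', 'b') and ('b', 'c')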

  32. Q2: Hitting Set • HITTING SET: Given a set U of elements, a collection S1, S2, . . . , Sm of subsets of U, and an integer k, does there exist a subset H ⊆ U of size ≤ k such that H overlaps with each of the sets S1, S2, . . . , Sm? • Show that SET COVER polynomial reduces to HITTING SET.
