
Dealing with NP-C problems






  1. Randomized Algorithm: Dealing with NP-C problems

  2. NP-Complete Problem • A problem for which, as far as we know, exhaustive search is required • Examples: • SAT • TSP • Vertex Cover • etc.

  3. Travelling Salesman Problem (TSP) • Find a sequence of cities to visit • Minimize the total cost of travel

  4. Example

  5. Example

  6. General Search Algorithm

     int min_cost = INT_MAX;
     while (has_unexplored_solution()) {
       Solution sol = gen_another_solution();
       int cost = evaluate(sol);
       if (cost < min_cost) {
         min_cost = cost;
         min_sol = sol;
       }
     }

  Time = (number of solutions) × (time to evaluate one solution)

  7. TSP Solution Space • N cities • (N − 1)! / 2 distinct tours • assuming a symmetric graph
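To make the cost of exhaustive search concrete, here is a minimal Python sketch of the general search loop applied to TSP. It fixes city 0 as the start and enumerates all (N − 1)! orderings, so each of the (N − 1)!/2 tours is seen twice, once per direction. The function name and the 4-city matrix are our illustration, not from the slides.

     # Brute-force TSP: fix city 0 as the start and enumerate the
     # remaining (N-1)! orderings of the other cities.
     from itertools import permutations

     def tsp_brute_force(dist):
         """dist: symmetric N x N matrix of travel costs."""
         n = len(dist)
         best_cost, best_tour = float("inf"), None
         for perm in permutations(range(1, n)):
             tour = (0,) + perm
             cost = sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))
             if cost < best_cost:
                 best_cost, best_tour = cost, tour
         return best_cost, best_tour

     # Example: 4 cities
     dist = [[0, 2, 9, 10],
             [2, 0, 6, 4],
             [9, 6, 0, 3],
             [10, 4, 3, 0]]
     print(tsp_brute_force(dist))  # (18, (0, 1, 3, 2))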

  8. What to do if N is too large?

  9. Relaxation • In practice, we don't really need the "best" solution • But we need an answer now • Or something close to now (1 minute, 1 hour, etc., but not 100 years) • Just something "near" the best solution

  10. Approximation Algorithm One who tries to optimize everything is bound to be unhappy

  11. Bounded solution • If solving for the "optimum" solution requires exponential time • what could polynomial time give us? • What can we say about the solution a polynomial-time algorithm returns?

  12. Approximation Ratio • For a problem instance I: • OPT(I) = the value of the "best" solution for I • A(I) = the value returned by our algorithm • The approximation ratio is a bound α such that A(I) ≤ α · OPT(I) on every instance, i.e., an upper bound on how sub-optimal our answer can be

  13. Approximation Algorithm • An algorithm that runs fast (polynomial time) and gives a good approximation ratio • A reasonable choice when dealing with NP-complete problems

  14. Clustering • Input: • A set of points X • An integer k • Output: • A partition of the points into k sets • such that the largest diameter among the sets is minimized

  15. Metric Property • A function d(x,y) such that • d(x,y) ≥ 0 • d(x,y) = 0 if and only if x = y • d(x,y) = d(y,x) • d(x,y) ≤ d(x,z) + d(z,y) (Triangle Inequality)
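These axioms can be checked exhaustively on a small finite point set. A minimal Python sketch; is_metric and the Manhattan-distance example are ours, for illustration:

     # Check the four metric axioms on a finite point set;
     # d is any pairwise-distance function to be tested.
     from itertools import product

     def is_metric(points, d):
         for x, y in product(points, repeat=2):
             if d(x, y) < 0:                 return False  # non-negativity
             if (d(x, y) == 0) != (x == y):  return False  # identity
             if d(x, y) != d(y, x):          return False  # symmetry
         for x, y, z in product(points, repeat=3):
             if d(x, y) > d(x, z) + d(z, y): return False  # triangle inequality
         return True

     manhattan = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1])
     print(is_metric([(0, 0), (1, 2), (3, 1)], manhattan))  # True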

  16. Example

  17. Approximated Version • Greedy farthest-first traversal: pick any point as the first center μ1 • then, k − 1 times, pick the point farthest from all centers chosen so far as the next center • Assign every point to its nearest center (see the sketch below)
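A minimal Python sketch of this greedy heuristic, under the assumption (supported by the analysis on the next slides) that it is the farthest-first traversal; all function names are ours:

     # Farthest-first traversal: pick an arbitrary first center, then
     # k-1 times pick the point farthest from all chosen centers;
     # finally assign each point to its nearest center.
     # points: a list of hashable points, e.g. (x, y) tuples.
     def farthest_first(points, k, d):
         centers = [points[0]]
         while len(centers) < k:
             # the point whose distance to its nearest center is largest
             p = max(points, key=lambda x: min(d(x, c) for c in centers))
             centers.append(p)
         clusters = {c: [] for c in centers}
         for x in points:
             nearest = min(centers, key=lambda c: d(c, x))
             clusters[nearest].append(x)
         return clusters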

  18. Guarantee • Ratio = 2 • i.e., the resulting diameter is no more than twice the optimal diameter

  19. Why? • Let p be the point in X that is farthest from the centers μ1, μ2, μ3, …, μk • Let r be the distance from p to its closest center • Then every point is within r of its center, so by the triangle inequality every cluster has diameter at most 2r

  20. So what? • How does r relate to the optimal solution? • There are k + 1 points • μ1, μ2, μ3, …, μk, p • such that each is at least r from all the others • So any partition into k sets must have some set that contains at least two of them • That set must have diameter at least r • Hence the optimal diameter is at least r, while ours is at most 2r: ratio 2

  21. Approximated Euclidean TSP • TSP in which the distance between two cities is a metric • What is closely related to TSP and can be computed easily? (The minimum spanning tree.)

  22. MST and TSP • Take any answer (tour) for TSP • It is a cycle • Remove one edge from it • The result is a path that is also a spanning tree (not necessarily minimal) • Let p be that path • So the weight of the MST is at most the cost of p, hence at most the optimal tour's cost

  23. From MST to TSP • Given an MST • do a DFS; the sequence of visits (walking each tree edge down and back) is a cycle that visits every vertex • but it visits some vertices multiple times

  24. From MST to TSP • The length of that walk is at most twice the best TSP tour (each MST edge is walked twice, and the MST weighs no more than the best tour) • Fix the walk into a proper tour: • simply skip any vertex that is about to be re-visited and move on to the next unvisited vertex in the list

  25. Fixing the Path • By the triangle inequality, the new (shortcut) path is no longer than the walk, so the tour costs at most twice the optimum
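Putting slides 22 to 25 together, a Python sketch of the MST-based 2-approximation: Prim's algorithm for the MST, then a DFS preorder as the shortcut tour. It assumes dist is a symmetric matrix satisfying the triangle inequality; all names are ours.

     # MST-based 2-approximation for metric TSP: build an MST (Prim),
     # walk it with DFS, and output the order of first visits --
     # skipping repeats is exactly the "fixing" step above.
     def tsp_2_approx(dist):
         n = len(dist)
         # Prim's algorithm: grow the MST from vertex 0
         in_tree, parent = {0}, {}
         best = {v: (dist[0][v], 0) for v in range(1, n)}
         while len(in_tree) < n:
             v = min(best, key=lambda u: best[u][0])
             parent[v] = best.pop(v)[1]
             in_tree.add(v)
             for u in best:
                 if dist[v][u] < best[u][0]:
                     best[u] = (dist[v][u], v)
         children = {v: [] for v in range(n)}
         for v, p in parent.items():
             children[p].append(v)
         # DFS preorder = order of first visits = the shortcut tour
         tour, stack = [], [0]
         while stack:
             v = stack.pop()
             tour.append(v)
             stack.extend(reversed(children[v]))
         cost = sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))
         return tour, cost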

  26. Approximated 01-Knapsack • Input • A number W, the capacity of the sack • n pairs of weights and prices ((w1,p1),(w2,p2),…,(wn,pn)) • wi = weight of the ith item • pi = price of the ith item • Output • A subset S of {1,2,3,…,n} such that • the total weight Σi∈S wi is at most W, and • the total price Σi∈S pi is maximum

  27. Guarantee • Pick any ε > 0 • The resulting value is at least (1 − ε) times the maximum value

  28. Approximated 01-Knapsack • Knapsack can be solved exactly using dynamic programming • in O(nW) time, where W is the capacity of the sack • We can derive a similar algorithm running in O(nV), where V = the sum of the values

  29. O(nV) knapsack • Let K(v) be the minimal total weight over selections whose total value is v • If the ith item is in the best selection for value v • then K(v) = K(v − pi) + wi • But we don't really know which item is in that selection • So we try every item: • K(v) = min over 1 ≤ i ≤ n of ( K(v − pi) + wi )
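A runnable Python sketch of this table. One detail the one-line recurrence leaves implicit: iterating values downward for each item keeps every item used at most once (the 0-1 constraint). The function name and sample data are ours.

     # O(nV) dynamic program for 0-1 knapsack: K[v] = minimal weight
     # reaching total value exactly v; answer = largest v with K[v] <= W.
     def knapsack_max_value(W, items):          # items: list of (weight, price)
         V = sum(p for _, p in items)
         INF = float("inf")
         K = [0] + [INF] * V
         for w, p in items:
             for v in range(V, p - 1, -1):      # descending: each item used once
                 K[v] = min(K[v], K[v - p] + w)
         return max(v for v in range(V + 1) if K[v] <= W)

     print(knapsack_max_value(10, [(6, 30), (3, 14), (4, 16), (2, 9)]))  # 46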

  30. Approximation • Since it is O(nV) • Can we reduce V? • To improve running time

  31. Value scaling • Scale each price down by a constant factor: the scaled price is ⌊pi · n / (ε · pmax)⌋ • The resulting prices are at most n/ε, so the scaled V is at most n²/ε • Thus the running time is O(n³/ε)

  32. The optimal solution • Let S be the subset selected by the optimal solution • Let K* be the maximum (optimal) total value • Now run the O(nV) algorithm on the rescaled input • S is still feasible there, so the rescaled optimum is at least the rescaled value of S

  33. Approximation Ratio • Let Ŝ be the set selected on the rescaled input • Scaling back, each item's price is underestimated by at most ε·pmax/n, so across at most n items the total loss is at most ε·pmax ≤ ε·K* • Rewriting in terms of K*: the value of Ŝ is at least K* − ε·K* = (1 − ε)K*
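Combining slides 29 to 33 into runnable form, a Python sketch of the full approximation scheme: scale the prices, solve exactly on the scaled instance, and report the true value of the chosen set. All names are ours, and the scaling constant n/(ε·pmax) is our reading of "resulting price is at most n/ε".

     # FPTAS for 0-1 knapsack: each scaled price is at most n/eps, and the
     # rounding loses at most eps*p_max/n per item, so at most eps*K* total.
     def knapsack_fptas(W, items, eps):         # items: list of (weight, price)
         n = len(items)
         p_max = max(p for _, p in items)
         scale = eps * p_max / n
         sp = [max(1, int(p / scale)) for _, p in items]   # scaled prices
         V = sum(sp)
         INF = float("inf")
         # K[i][v]: minimal weight using the first i items for scaled value v
         K = [[0] + [INF] * V]
         for i in range(n):
             w = items[i][0]
             prev = K[-1]
             K.append([min(prev[v], (prev[v - sp[i]] + w) if v >= sp[i] else INF)
                       for v in range(V + 1)])
         v = max(v for v in range(V + 1) if K[n][v] <= W)
         chosen = []                            # backtrack the chosen items
         for i in range(n, 0, -1):
             if K[i][v] != K[i - 1][v]:
                 chosen.append(i - 1)
                 v -= sp[i - 1]
         return sum(items[i][1] for i in chosen), chosen

     print(knapsack_fptas(10, [(6, 30), (3, 14), (4, 16), (2, 9)], 0.1)[0])  # 46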

  34. RANDOM SEARCH Search with bounded resources

  35. General Search Algorithm

     while (time_not_exceeded()) {
       Solution sol = random_a_solution();
       int cost = evaluate(sol);
       if (cost < min_cost) {
         min_cost = cost;
         min_sol = sol;
       }
     }

  Time is bounded, but the best solution is not guaranteed.

  36. Does it work? • If we have “enough” time… • Eventually, we will hit the “right” answer

  37. Hill Climbing It's OK to be greedy

  38. Can we improve Random Search? • Is there anything better than randomly generating a new answer each time?

  39. 0-1 Knapsack Problem • Pick a combination of items • Solution Space • 00000 (0) • 00001 (1) • 00010 (2) • 00011 (3) • 00100 (4) • . • . • 11111 (31) 32 solutions in total
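This encoding maps directly onto integers; a small Python sketch that reproduces the 32 strings above (the evaluate hook is hypothetical):

     # Enumerate the 2^n solution space of 0-1 knapsack as bit strings.
     n = 5
     for code in range(2 ** n):
         bits = format(code, f"0{n}b")       # e.g. 4 -> "00100"
         chosen = [i for i in range(n) if bits[i] == "1"]
         # evaluate(chosen) would go here
         print(bits, f"({code})")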

  40. Evaluation Function [figure: eval(sol) plotted over the solutions (1), (2), (3), … along the solution axis]

  41. Neighbor of Solution • In solution space • 00100 (4) is close to • 00011 (3) and • 00101 (5)

  42. Hill Climbing Search • Generate only the neighbor solutions • Move to the best neighbor solution

     Solution sol = random_a_solution();
     while (time_not_exceeded()) {
       Solution nb[] = gen_all_neighbors(sol);
       Solution nb_sol;
       int nb_min = INT_MAX;
       for (all x in nb) {
         int cost = evaluate(x);
         if (cost < nb_min) {
           nb_min = cost;
           nb_sol = x;
         }
       }
       if (nb_min < min_cost) {
         min_cost = nb_min;
         sol = nb_sol;
       }
     }

  If the cost does not improve, we might stop.
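The same loop in runnable Python, with the neighbor relation left pluggable since the next slide notes that several definitions are possible; all names are ours.

     # Hill climbing: move to the best neighbor until none improves
     # (a local minimum). neighbors() is pluggable -- slide 41's
     # adjacent-integer view and single-bit flips are both valid choices.
     import random

     def hill_climb(evaluate, neighbors, n_bits):
         sol = random.randrange(2 ** n_bits)
         cost = evaluate(sol)
         while True:
             nb = min(neighbors(sol, n_bits), key=evaluate)
             if evaluate(nb) >= cost:
                 return sol, cost        # no neighbor improves: local minimum
             sol, cost = nb, evaluate(nb)

     def bit_flip_neighbors(sol, n_bits):
         return [sol ^ (1 << i) for i in range(n_bits)]

     # e.g. minimize the number of set bits:
     print(hill_climb(lambda s: bin(s).count("1"), bit_flip_neighbors, 5))  # (0, 0)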

  43. Several definitions of "neighbor" are possible

  44. Best Problem for Hill Climbing • Unimodal

  45. Bad problem for hill climbing • Multimodal

  46. Local Minima • Hill climbing is a local search • (the solver can define its own notion of "local") • It can get stuck at a local minimum • and needs something to fix that • If stuck: • randomly generate another solution and restart from it

  47. Simulated Annealing O' mother nature, I worship thee

  48. Annealing • A material (metal) is heated and then slowly cooled down • letting its atoms rearrange into a regular, low-energy structure • Without the heating, atoms stay stuck in irregular positions • That's just like a "local minimum"

  49. Simulated Annealing • Just like hill climbing • but the solution is allowed to move to a worse position • with some probability • inversely proportional to the elapsed time • This helps escape local minima

  50. Simulated Annealing

     Solution sol = random_a_solution();
     int cur_cost = evaluate(sol);
     while (time_not_exceeded()) {
       Solution nb = gen_one_neighbor(sol);
       int cost = evaluate(nb);
       if (cost < cur_cost) {
         sol = nb; cur_cost = cost;    // downhill move: always accept
       } else if (random() < chance_at_time(t)) {
         sol = nb; cur_cost = cost;    // uphill move: accept with decaying probability
       }
     }
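A runnable Python sketch of the loop above. The slides only require that the acceptance probability shrink as time passes, so the exp(−Δ/T) acceptance rule and the 1/t cooling schedule here are one conventional choice, not something the slides prescribe.

     # Simulated annealing with the classic Metropolis acceptance rule:
     # always accept improvements; accept a worse neighbor with
     # probability exp(-delta/T), where T decays over time.
     import math, random

     def simulated_annealing(evaluate, random_neighbor, initial, steps, t0=1.0):
         sol, cost = initial, evaluate(initial)
         best_sol, best_cost = sol, cost
         for t in range(1, steps + 1):
             temp = t0 / t                       # cooling schedule
             nb = random_neighbor(sol)
             delta = evaluate(nb) - cost
             if delta < 0 or random.random() < math.exp(-delta / temp):
                 sol, cost = nb, evaluate(nb)    # sometimes accept a worse move
                 if cost < best_cost:
                     best_sol, best_cost = sol, cost
         return best_sol, best_cost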
