Optimization Problems • Optimization problem: a problem of finding the best solution from all feasible solutions. • Two common techniques: • Greedy Algorithms (local) • Dynamic Programming (global)
Greedy Algorithms Greedy algorithms typically consist of • A set of candidate solutions • Function that checks if the candidates are feasible • Selection function indicating at a given time which is the most promising candidate not yet used • Objective function giving the value of a solution; this is the function we are trying to optimize
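As a rough sketch of how these pieces fit together, here is a generic greedy loop in Python; candidates, select_best, and is_feasible are hypothetical placeholders for the problem-specific parts listed above, and the objective function appears only as a comment, since the greedy loop itself never looks ahead.

# Generic greedy skeleton; all arguments are problem-specific stand-ins.
def greedy(candidates, select_best, is_feasible):
    solution = []
    remaining = set(candidates)
    while remaining:
        c = select_best(remaining)         # selection function: most promising candidate
        remaining.remove(c)                # once excluded, never considered again
        if is_feasible(solution + [c]):    # feasibility check
            solution.append(c)             # once included, it stays for good
    return solution                        # its quality is judged by the objective function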
Examples of Greedy Algorithms • Graph Algorithms • Breadth-First Search (shortest paths in unweighted graphs) • Dijkstra’s (shortest path) Algorithm • Minimum Spanning Trees • Data compression • Huffman coding • Scheduling • Activity Selection • Minimizing time in system • Deadline scheduling • Other Heuristics • Coloring a graph • Traveling Salesman • Set-covering
Elements of Greedy Strategy • Greedy-choice property: a globally optimal solution can be arrived at by making locally optimal (greedy) choices • Optimal substructure: an optimal solution to the problem contains within it optimal solutions to sub-problems • Be able to demonstrate that if A is an optimal solution containing s1, then the set A’ = A - {s1} is an optimal solution to a smaller problem without s1.
Analysis • The selection function is usually based on the objective function; they may be identical. But, often there are several plausible ones. • At every step, the procedure chooses the best candidate, without worrying about the future. It never changes its mind: once a candidate is included in the solution, it is there for good; once a candidate is excluded, it’s never considered again. • Greedy algorithms do NOT always yield optimal solutions, but for many problems they do.
Breadth-First Traversal • A breadth-first traversal visits a vertex, then each of the vertex's neighbors, before advancing. Cost? O(|V| + |E|)
Breadth-First Traversal [figure: successive BFS wavefronts spreading out from the source S; vertices are marked Finished, Discovered, or Undiscovered and labeled with their level 1, 2, 3] Implementation?
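One possible implementation, sketched here in Python under the assumption that the graph is stored as an adjacency list (a dict mapping each vertex to a list of neighbors); the function and variable names are illustrative, not from the slides.

from collections import deque

# Sketch of breadth-first traversal from a source vertex s.
# graph is assumed to be a dict: vertex -> list of neighbors.
def bfs(graph, s):
    dist = {s: 0}                    # discovered vertices and their levels
    queue = deque([s])
    while queue:
        u = queue.popleft()          # u is now finished
        for v in graph[u]:
            if v not in dist:        # v goes from undiscovered to discovered
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist                      # in an unweighted graph: shortest path lengths from s

Every vertex is enqueued at most once and every edge is examined a constant number of times, which matches the O(|V| + |E|) bound above.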
Example (BFS): running BFS from source s on a graph with vertices r, s, t, u, v, w, x, y. Each snapshot shows the queue Q and the distance labels assigned so far:
 Q: s        (s = 0)
 Q: w r      (w = 1, r = 1)
 Q: r t x    (t = 2, x = 2)
 Q: t x v    (v = 2)
 Q: x v u    (u = 3)
 Q: v u y    (y = 3)
 Q: u y
 Q: y
 Q: (empty)
The discovery edges form the breadth-first tree, and each label is the vertex's distance from s: s = 0; r = w = 1; t = v = x = 2; u = y = 3.
BFS: Application [figure: the breadth-first tree from the example, each vertex labeled with its distance from s] • Solves the shortest path problem for unweighted graphs
Dijkstra’s algorithm (4.4) [figure: a weighted graph with source s and vertices u, v, x, y; edge weights 1, 2, 3, 4, 5, 6, 7, 9, 10] An adaptation of BFS
Example: running Dijkstra’s algorithm from source s on the weighted graph above. The shortest-path estimates start at d(s) = 0 and infinity elsewhere; at each step the unfinished vertex with the smallest estimate is finalized and its outgoing edges are relaxed. The estimates for u and v tighten step by step (u: 10, then 8; v: 14, then 13, then 9) until the final distances are reached: d(s) = 0, d(y) = 5, d(x) = 7, d(u) = 8, d(v) = 9.
Dijkstra’s Algorithm • Assumes no negative-weight edges. • Maintains a set S of vertices whose shortest path from s has been determined. • Repeatedly selects u in V – S with minimum shortest path estimate (greedy choice). • Store V – S in priority queue Q. Cost? O(V²)
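A sketch of the algorithm in Python using a binary heap for the priority queue Q; with a heap the cost is O((V + E) log V), while the O(V²) figure above corresponds to scanning a plain array for the minimum. The adjacency-list format and the names are assumptions, not from the slides.

import heapq

# Sketch of Dijkstra's algorithm. graph is assumed to map each vertex
# to a list of (neighbor, weight) pairs, with all weights non-negative.
def dijkstra(graph, s):
    dist = {s: 0}                      # shortest-path estimates
    done = set()                       # the set S of finished vertices
    pq = [(0, s)]                      # priority queue over V - S
    while pq:
        d, u = heapq.heappop(pq)       # greedy choice: smallest estimate
        if u in done:
            continue                   # stale entry left by an earlier relaxation
        done.add(u)
        for v, w in graph[u]:
            if v not in done and d + w < dist.get(v, float('inf')):
                dist[v] = d + w        # relax edge (u, v)
                heapq.heappush(pq, (dist[v], v))
    return dist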
Minimal Spanning Trees Minimal Spanning Tree (MST) Problem: Input: an undirected, connected, edge-weighted graph G. Output: the subgraph of G that • keeps the vertices connected; • has minimum total cost (the sum of the weights of the edges in the subgraph is as small as possible).
Greedy Algorithms • Kruskal's algorithm. Start with T = ∅. Consider edges in ascending order of cost. Insert edge e in T unless doing so would create a cycle. • Prim's algorithm. Start with some root node s and greedily grow a tree T from s outward. At each step, add the cheapest edge e to T that has exactly one endpoint in T. • Reverse-Delete algorithm. Start with T = E. Consider edges in descending order of cost. Delete edge e from T unless doing so would disconnect T. • All three algorithms produce an MST.
Kruskal’s MST Algorithm • choose each vertex to be in its own MST; • merge the two MSTs that have the shortest edge between them; • repeat step 2 until only one tree remains. • Implementation? • Union-Find
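A sketch of this merging loop in Python with a small union-find structure (path compression only, no union by rank); the edge-list input format and the names are assumptions.

# Sketch of Kruskal's algorithm. n is the number of vertices (0 .. n-1);
# edges is assumed to be a list of (cost, u, v) tuples.
def kruskal(n, edges):
    parent = list(range(n))                # each vertex starts as its own one-node MST

    def find(x):                           # root of x's tree, with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst = []
    for cost, u, v in sorted(edges):       # shortest edges first
        ru, rv = find(u), find(v)
        if ru != rv:                       # endpoints in different trees: merge them
            parent[ru] = rv
            mst.append((u, v, cost))
    return mst                             # n - 1 edges if the graph is connected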
Prim’s MST Algorithm A greedy algorithm. • choose any vertex to be the initial MST; • grow the tree by picking the least-cost edge that connects a vertex in the MST to a vertex outside it; • repeat step 2 until the MST includes all the vertices.
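A sketch of this growth step in Python, keeping the candidate edges in a heap and lazily discarding edges whose far endpoint is already in the tree; the adjacency-list format and the names are assumptions.

import heapq

# Sketch of Prim's algorithm from an arbitrary start vertex.
# graph is assumed to map each vertex to a list of (cost, neighbor) pairs (undirected).
def prim(graph, start):
    in_tree = {start}
    candidates = list(graph[start])          # edges leaving the current tree
    heapq.heapify(candidates)
    mst = []
    while candidates and len(in_tree) < len(graph):
        cost, v = heapq.heappop(candidates)  # least-cost edge touching the tree
        if v in in_tree:
            continue                         # both endpoints already inside the tree
        in_tree.add(v)
        mst.append((cost, v))
        for edge in graph[v]:
            if edge[1] not in in_tree:
                heapq.heappush(candidates, edge)
    return mst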
Prim’s MST demo http://www-b2.is.tokushimau.ac.jp/~ikeda/suuri/dijkstra/PrimApp.shtml?demo1
Huffman Coding Huffman codes: a very effective technique for compressing data, saving 20% – 90%.
Coding Problem: • Consider a data file of 100,000 characters • You can safely assume that there are many a, e, i, o, u, blanks, and newlines, and few q, x, z's • Want to store it compactly Solution: • Fixed-length code, e.g. ASCII, 8 bits per character • Variable-length code, Huffman code (can take advantage of the relative frequency of letters to save space)
Example • Fixed-length code: with 8 characters, we need 3 bits for each char.

 Char   Frequency   Code   Total Bits
 E        120       000       360
 L         42       001       126
 D         42       010       126
 U         37       011       111
 C         32       100        96
 M         24       101        72
 K          7       110        21
 Z          2       111         6
 Total                        918
Example (cont.) [figure: the fixed-length code drawn as a complete binary tree of depth 3; edges are labeled 0/1 and the leaves are E:120, L:42, D:42, U:37, C:32, M:24, K:7, Z:2]
Example (cont.) • Variable-length code (can take advantage of the relative frequency of letters to save space) - Huffman codes
Huffman Tree Construction (1) • Associate each char with a weight (= frequency) to form a one-node subtree (char, weight) • Group all subtrees to form a forest • Sort subtrees by ascending weight of subroots • Merge the first two subtrees (the ones with lowest weights) • Assign the new subroot a weight equal to the sum of the weights of its two children • Repeat steps 3-5 until only one tree remains in the forest
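A sketch of this construction in Python, with a min-heap standing in for the sorted forest and a tree walk assigning the 0/1 labels; the freq dict format is an assumption, and tie-breaking may produce code words that differ from the table on the later slides, though the total bit count is the same.

import heapq

# Sketch of Huffman tree construction. freq is assumed to map char -> frequency.
def huffman_codes(freq):
    # Heap entries are (weight, tie_breaker, tree); a tree is either a char or a (left, right) pair.
    forest = [(w, i, ch) for i, (ch, w) in enumerate(freq.items())]
    heapq.heapify(forest)
    next_id = len(forest)
    while len(forest) > 1:
        w1, _, t1 = heapq.heappop(forest)          # the two lowest-weight subtrees
        w2, _, t2 = heapq.heappop(forest)
        heapq.heappush(forest, (w1 + w2, next_id, (t1, t2)))  # merged subtree
        next_id += 1

    codes = {}
    def walk(tree, prefix):                        # left edge = '0', right edge = '1'
        if isinstance(tree, tuple):
            walk(tree[0], prefix + '0')
            walk(tree[1], prefix + '1')
        else:
            codes[tree] = prefix or '0'
    walk(forest[0][2], '')
    return codes

# e.g. huffman_codes({'E': 120, 'L': 42, 'D': 42, 'U': 37, 'C': 32, 'M': 24, 'K': 7, 'Z': 2})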
Huffman Tree Construction (3) [figure: the forest at an intermediate step of the construction, after several of the lowest-weight subtrees have been merged]
Assigning Codes

 Char   Freq   Code     Bits
 C        32   1110      128
 D        42   101       126
 E       120   0         120
 M        24   11111     120
 K         7   111101     42
 L        42   110       126
 U        37   100       111
 Z         2   111100     12
 Total                   785

Compare with 918 bits for the fixed-length code: ~15% less.
Coding and Decoding
With the fixed-length code:
 • DEED: 010000000010
 • MUCK: 101011100110
With the Huffman code:
 • DEED: 10100101
 • MUCK: 111111001110111101
Prefix codes A set of codes is said to meet the prefix property if no code in the set is the prefix of another. Such codes are called prefix codes. Huffman codes are prefix codes.
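Because no code word is a prefix of another, a bit string can be decoded greedily from left to right: as soon as the bits read so far match a code word, that character is emitted. A minimal sketch, assuming the code table is a dict mapping characters to bit strings.

# Sketch of prefix-code decoding. codes is assumed to map char -> bit string,
# e.g. {'E': '0', 'D': '101', ...} as in the table above.
def decode(bits, codes):
    by_code = {code: ch for ch, code in codes.items()}   # invert the table
    out, current = [], ''
    for b in bits:
        current += b
        if current in by_code:        # prefix property: the first match is the character
            out.append(by_code[current])
            current = ''
    return ''.join(out)

# With the codes of the earlier table (D = 101, E = 0, ...), decode('10100101', codes) gives 'DEED'.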
Coin Changing • Goal. Given currency denominations: 1, 5, 10, 25, 100, devise a method to pay amount to customer using fewest number of coins. • Ex: 34¢. • Cashier's algorithm. At each iteration, add coin of the largest value that does not take us past the amount to be paid. • Ex: $2.89.
Coin-Changing: Greedy Algorithm
• Cashier's algorithm. At each iteration, add coin of the largest value that does not take us past the amount to be paid.

 Sort coin denominations by value: c1 < c2 < … < cn.
 S ← ∅                                  (S = set of coins selected)
 while (x > 0) {
    let k be the largest integer such that ck ≤ x
    if (k = 0)
       return "no solution found"
    x ← x − ck
    S ← S ∪ {k}
 }
 return S

• Q. Is cashier's algorithm optimal?
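The same loop as runnable Python, a sketch rather than the slide's pseudocode verbatim; the function and argument names are made up here.

# Sketch of the cashier's algorithm. Amounts and denominations are assumed to be in cents.
def cashiers_algorithm(x, denominations=(1, 5, 10, 25, 100)):
    selected = []
    while x > 0:
        usable = [c for c in denominations if c <= x]   # coins that do not overshoot x
        if not usable:
            return None                                 # "no solution found"
        coin = max(usable)                              # greedy: largest such coin
        x -= coin
        selected.append(coin)
    return selected

# cashiers_algorithm(34)  -> [25, 5, 1, 1, 1, 1]
# cashiers_algorithm(289) -> [100, 100, 25, 25, 25, 10, 1, 1, 1, 1]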
Coin-Changing: Analysis of Greedy Algorithm • Observation. Greedy algorithm is sub-optimal for US postal denominations: 1, 10, 21, 34, 37, 44, 70, 100, 350, 1225, 1500. • Counterexample. 140¢. • Greedy: 100, 37, 1, 1, 1. • Optimal: 70, 70. Greedy algorithm failed!
Coin-Changing: Analysis of Greedy Algorithm
• Theorem. Greed is optimal for U.S. coinage: 1, 5, 10, 25, 100.
• Proof. (by induction on x)
 • Let ck be the kth smallest coin.
 • Consider the optimal way to change ck ≤ x < ck+1: greedy takes coin k.
 • We claim that any optimal solution must also take coin k.
  • if not, it needs enough coins of type c1, …, ck-1 to add up to x
  • the table below indicates that no optimal solution can do this
 • The problem reduces to coin-changing x − ck cents, which, by induction, is optimally solved by the greedy algorithm.

 k    ck    All optimal solutions must satisfy    Max value of coins 1, 2, …, k-1 in any OPT
 1      1   P ≤ 4                                 -
 2      5   N ≤ 1                                 4
 3     10   N + D ≤ 2                             4 + 5 = 9
 4     25   Q ≤ 3                                 20 + 4 = 24
 5    100   no limit                              75 + 24 = 99

 (P = pennies, N = nickels, D = dimes, Q = quarters)