Chapter 9: Greedy Technique


  1. Chapter 9 Greedy Technique

  2. Greedy Technique
Constructs a solution to an optimization problem piece by piece through a sequence of choices that are:
• feasible – each choice has to satisfy the problem’s constraints
• locally optimal – each choice has to be the best among all feasible choices available at that step
• irrevocable – once made, a choice cannot be changed on subsequent steps of the algorithm
For some problems, the greedy technique yields an optimal solution for every instance. For most, it does not, but it can be useful for fast approximations.

  3. Applications of the Greedy Strategy
Optimal solutions:
• change making for “normal” coin denominations
• minimum spanning tree (MST)
• single-source shortest paths
• simple scheduling problems
• Huffman codes
Approximations:
• traveling salesman problem (TSP)
• knapsack problem
• other combinatorial optimization problems

  4. Change-Making Problem
Given unlimited amounts of coins of denominations d1 > … > dm, give change for amount n with the least number of coins.
Example: d1 = 25c, d2 = 10c, d3 = 5c, d4 = 1c and n = 48c
Greedy solution: 25c + 10c + 10c + 1c + 1c + 1c (6 coins)
The greedy solution is
• optimal for any amount and “normal” set of denominations
• possibly not optimal for arbitrary coin denominations (e.g., with denominations 25c, 10c, 1c and n = 30c, greedy uses 25 + 1 + 1 + 1 + 1 + 1 = 6 coins, while 10 + 10 + 10 uses only 3)
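A minimal Python sketch of this greedy scheme (the function name and output format are illustrative, not from the slides):

```python
def greedy_change(n, denominations):
    """Greedy change-making: for each denomination, largest first,
    take as many coins as still fit into the remaining amount."""
    counts = []
    for d in denominations:    # assumes decreasing order d1 > ... > dm
        counts.append(n // d)  # coins of this denomination
        n %= d                 # amount still to be covered
    return counts

# The slide's example: n = 48c with denominations 25c, 10c, 5c, 1c
print(greedy_change(48, [25, 10, 5, 1]))  # [1, 2, 0, 3] -> 6 coins
```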

  5. Minimum Spanning Tree (MST)
• Spanning tree of a connected graph G: a connected acyclic subgraph of G that includes all of G’s vertices
• Minimum spanning tree of a weighted, connected graph G: a spanning tree of G of minimum total weight
Example: [figure: a weighted graph on vertices a, b, c, d with edge weights 1, 2, 3, 4, 6, and its MST]

  6. Prim’s MST algorithm
• Start with tree T1 consisting of one (any) vertex and “grow” the tree one vertex at a time to produce the MST through a series of expanding subtrees T1, T2, …, Tn
• On each iteration, construct Ti+1 from Ti by adding the vertex not in Ti that is closest to those already in Ti (this is the “greedy” step!)
• Stop when all vertices are included
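A sketch of Prim’s algorithm in Python, assuming the graph is an adjacency-list dict {vertex: [(weight, neighbor), ...]} and using a min-heap of fringe edges (a common implementation choice; the sample graph is hypothetical, since the slide’s figure did not survive extraction):

```python
import heapq

def prim_mst(graph, start):
    """Grow a tree one vertex at a time; on each step add the
    fringe vertex closest to the tree (the greedy step)."""
    in_tree = {start}
    mst = []                                       # MST edges (weight, u, v)
    heap = [(w, start, u) for w, u in graph[start]]
    heapq.heapify(heap)
    while heap and len(in_tree) < len(graph):
        w, u, v = heapq.heappop(heap)
        if v in in_tree:
            continue                               # stale entry: v already in tree
        in_tree.add(v)
        mst.append((w, u, v))
        for w2, x in graph[v]:
            if x not in in_tree:
                heapq.heappush(heap, (w2, v, x))   # new fringe candidates
    return mst

# hypothetical 4-vertex graph
g = {'a': [(3, 'b'), (1, 'c')], 'b': [(3, 'a'), (2, 'd')],
     'c': [(1, 'a'), (4, 'd')], 'd': [(2, 'b'), (4, 'c')]}
print(prim_mst(g, 'a'))  # [(1, 'a', 'c'), (3, 'a', 'b'), (2, 'b', 'd')]
```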

  7. Example
[figure: Prim’s algorithm traced on the weighted graph of slide 5]

  8. Notes about Prim’s algorithm
• Proof by induction shows that this construction actually yields an MST
• Needs a priority queue for locating the closest fringe vertex
• Efficiency:
  • O(n²) for the weight-matrix representation of the graph and an array implementation of the priority queue
  • O(m log n) for the adjacency-list representation of a graph with n vertices and m edges and a min-heap implementation of the priority queue

  9. Induction
Basis step: T0 consists of a single vertex and hence must be part of any minimum spanning tree.
Inductive step: Assume that Ti-1 is part of some minimum spanning tree T. We need to prove that Ti, generated from Ti-1 by Prim’s algorithm, is also a part of a minimum spanning tree.
Proof by contradiction: assume that no minimum spanning tree of the graph can contain Ti. Let ei = (v, u) be the minimum-weight edge from a vertex in Ti-1 to a vertex not in Ti-1, used by Prim’s algorithm to expand Ti-1 to Ti. By our assumption, ei cannot belong to the minimum spanning tree T. Therefore, if we add ei to T, a cycle must be formed. This cycle contains another edge e′ connecting a vertex in Ti-1 to a vertex outside Ti-1; by the greedy choice, w(ei) ≤ w(e′). Replacing e′ with ei in T produces a spanning tree of weight no greater than T’s that contains Ti, which contradicts the assumption.

  10. Another greedy algorithm for MST: Kruskal’s
• Sort the edges in nondecreasing order of lengths
• “Grow” the tree one edge at a time to produce the MST through a series of expanding forests F1, F2, …, Fn-1
• On each iteration, add the next edge on the sorted list unless this would create a cycle. (If it would, skip the edge.)
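A self-contained Python sketch of Kruskal’s algorithm; the cycle check is a bare-bones union-find (the ADT behind it is covered on the next slides), with path halving as one possible shortcut:

```python
def kruskal_mst(vertices, edges):
    """Scan edges in nondecreasing weight order, adding each edge
    unless it would connect two vertices already in the same tree."""
    parent = {v: v for v in vertices}

    def find(x):                           # root of x's current tree
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    mst = []
    for w, u, v in sorted(edges):          # nondecreasing order of weight
        ru, rv = find(u), find(v)
        if ru != rv:                       # different trees: no cycle created
            parent[ru] = rv                # unite the two trees
            mst.append((w, u, v))
    return mst

# hypothetical example, edges given as (weight, u, v)
edges = [(3, 'a', 'b'), (1, 'a', 'c'), (2, 'b', 'd'), (4, 'c', 'd')]
print(kruskal_mst('abcd', edges))  # [(1, 'a', 'c'), (2, 'b', 'd'), (3, 'a', 'b')]
```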

  11. Example
[figure: Kruskal’s algorithm traced on the weighted graph of slide 5]

  12. Notes about Kruskal’s algorithm
• The algorithm looks easier than Prim’s but is harder to implement (checking for cycles!)
• Cycle checking: a cycle is created iff the added edge connects vertices in the same connected component
• Union-find algorithms – see Section 9.2

  13. Kruskal’s Algorithm re-thought
• View the algorithm’s operations as a progression through a series of forests containing all of the vertices of the graph and some of its edges
• The initial forest consists of |V| trivial trees, each comprising a single vertex of the graph
• On each iteration the algorithm takes the next edge (u, v) from the sorted list of the graph’s edges, finds the trees containing the vertices u and v, and, if these trees are not the same, unites them in a larger tree by adding the edge (u, v)
• With an efficient sorting algorithm, Kruskal’s algorithm runs in O(|E| log |E|) time

  14. Kruskal and Union-Find
Union-find algorithms check whether two vertices belong to the same tree:
• Partition the n-element set S into a collection of n one-element subsets, each containing a different element of S
• The collection is subjected to a sequence of intermixed union and find operations
ADT:
• makeset(x) creates a one-element set {x}
• find(x) returns the subset containing x
• union(x, y) constructs the union of the disjoint subsets Sx and Sy containing x and y, and adds it to the collection to replace Sx and Sy, which are deleted from it
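One possible Python realization of this ADT; union by size with path compression is assumed here (the slides leave the representation open):

```python
class UnionFind:
    """Disjoint subsets under makeset / find / union."""

    def __init__(self):
        self.parent = {}
        self.size = {}

    def makeset(self, x):
        self.parent[x] = x                  # one-element set {x}
        self.size[x] = 1

    def find(self, x):
        root = x
        while self.parent[root] != root:    # walk up to the representative
            root = self.parent[root]
        while self.parent[x] != root:       # path compression on the way back
            self.parent[x], x = root, self.parent[x]
        return root

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return                          # already the same subset
        if self.size[rx] < self.size[ry]:   # attach smaller tree under larger
            rx, ry = ry, rx
        self.parent[ry] = rx
        self.size[rx] += self.size[ry]

uf = UnionFind()
for v in 'abcd':
    uf.makeset(v)
uf.union('a', 'b')
print(uf.find('a') == uf.find('b'))  # True: edge a-b would now close a cycle
print(uf.find('a') == uf.find('c'))  # False: different trees
```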

  15. Minimum spanning tree vs. Steiner tree
[figure: side-by-side comparison on four vertices a, b, c, d with unit edge weights: a minimum spanning tree vs. a Steiner tree, which may add extra points]
See problem 11 on page 323

  16. Shortest paths – Dijkstra’s algorithm
Single-Source Shortest Paths Problem: Given a weighted connected graph G, find the shortest paths from a source vertex s to each of the other vertices.
Dijkstra’s algorithm: similar to Prim’s MST algorithm, but with a different way of computing numerical labels. Among vertices not already in the tree, it finds the vertex u with the smallest sum dv + w(v, u), where
• v is a vertex for which the shortest path has already been found on preceding iterations (such vertices form a tree)
• dv is the length of the shortest path from the source to v
• w(v, u) is the length (weight) of the edge from v to u

  17. Dijkstra’s Algorithm (G, l)
G is the graph, s is the start node, le is the length of edge e
Let S be the set of explored nodes; for each u in S, we store the distance d(u)
Initially S = {s} and d(s) = 0
While S ≠ V
    Select a node v not in S with at least one edge from S for which
        d′(v) = min over edges e = (u, v) with u in S of d(u) + le
    is as small as possible
    Add v to S and define d(v) = d′(v)
EndWhile
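A compact Python version of this loop, using a min-heap to pick the node with the smallest d′(v); lazy deletion of stale heap entries is one common way to implement the selection:

```python
import heapq

def dijkstra(graph, s):
    """Single-source shortest paths for nonnegative edge lengths.
    graph: {node: [(length, neighbor), ...]}. Returns {node: d(node)}."""
    d = {}                           # explored nodes with final distances
    heap = [(0, s)]                  # candidate (d'(v), v) pairs
    while heap:
        dv, v = heapq.heappop(heap)
        if v in d:
            continue                 # stale entry: v was already explored
        d[v] = dv                    # greedy step: settle v
        for le, u in graph[v]:
            if u not in d:
                heapq.heappush(heap, (dv + le, u))
    return d
```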

  18. Example
[figure: the weighted graph being processed; the trace below implies edges a–b = 3, a–d = 7, b–c = 4, b–d = 2, and d–e = 4, among others]

Tree vertices    Remaining vertices
a(-, 0)          b(a, 3)     c(-, ∞)      d(a, 7)    e(-, ∞)
b(a, 3)          c(b, 3+4)   d(b, 3+2)    e(-, ∞)
d(b, 5)          c(b, 7)     e(d, 5+4)
c(b, 7)          e(d, 9)
e(d, 9)

  19. Notes on Dijkstra’s algorithm
• Doesn’t work for graphs with negative weights
• Applicable to both undirected and directed graphs
• Efficiency:
  • O(|V|²) for graphs represented by a weight matrix and an array implementation of the priority queue
  • O(|E| log |V|) for graphs represented by adjacency lists and a min-heap implementation of the priority queue
• Don’t mix up Dijkstra’s algorithm with Prim’s algorithm!

  20. Coding Problem
Coding: assignment of bit strings to alphabet characters
Codewords: bit strings assigned to characters of the alphabet
Two types of codes:
• fixed-length encoding (e.g., ASCII)
• variable-length encoding (e.g., Morse code)
Prefix-free codes: no codeword is a prefix of another codeword
Problem: If the frequencies of character occurrences are known, what is the best binary prefix-free code?

  21. Huffman Codes
Labeling the tree T*:
• Take all the leaves of depth 1 (if there are any) and label them with the highest-frequency letters in any order.
• Then take all leaves of depth 2 (if there are any) and label them with the next-highest-frequency letters in any order.
• Continue through the leaves in order of increasing depth.
There is an optimal prefix code, with corresponding tree T*, in which the two lowest-frequency letters are assigned to leaves that are siblings in T*.
• Delete them and label their parent with a new letter having the combined frequency
• This yields an instance with a smaller alphabet

  22. Huffman’s Algorithm
To construct a prefix code for an alphabet S with given frequencies:
If S has two letters then
    Encode one letter using 0 and the other letter using 1
Else
    Let y* and z* be the two lowest-frequency letters
    Form a new alphabet S′ by deleting y* and z* and replacing them with a new letter w of frequency f(y*) + f(z*)
    Recursively construct a prefix code g′ for S′, with tree T′
    Define a prefix code for S as follows: start with T′, take the leaf labeled w, and add two children below it labeled y* and z*
Endif
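A direct Python transcription of this recursive scheme (representing the merged letter w by the pair of letters it replaces is my own convenience choice, not from the slide):

```python
def huffman(freq):
    """freq: {letter: frequency}. Returns {letter: codeword}."""
    if len(freq) == 2:
        a, b = freq                            # two letters: codes 0 and 1
        return {a: '0', b: '1'}
    y, z = sorted(freq, key=freq.get)[:2]      # y*, z*: lowest frequencies
    reduced = {c: f for c, f in freq.items() if c not in (y, z)}
    w = (y, z)                                 # the new letter w
    reduced[w] = freq[y] + freq[z]             # frequency f(y*) + f(z*)
    code = huffman(reduced)                    # prefix code for S'
    wc = code.pop(w)                           # replace leaf w by children y*, z*
    code[y], code[z] = wc + '0', wc + '1'
    return code
```

On the frequencies of slide 24 this returns A: 11, B: 100, C: 00, D: 01, _: 101, matching the table there; ties may legitimately produce a different but equally optimal code.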

  23. Huffman codes
• Any binary tree with edges labeled with 0’s and 1’s yields a prefix-free code for the characters assigned to its leaves
• An optimal binary tree minimizing the expected (weighted average) length of a codeword can be constructed as follows:
Huffman’s algorithm
• Initialize n one-node trees with the alphabet characters and set the tree weights to their frequencies.
• Repeat the following step n − 1 times: join the two binary trees with the smallest weights into one (as left and right subtrees) and make its weight equal to the sum of the weights of the two trees.
• Mark edges leading to left and right subtrees with 0’s and 1’s, respectively.
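The same algorithm in this bottom-up, tree-merging form, sketched in Python: each tree is represented by its {character: codeword-so-far} map, and a counter breaks weight ties so the heap never compares dicts (an implementation detail, not part of the algorithm):

```python
import heapq
from itertools import count

def huffman_tree_merge(frequencies):
    """Join the two smallest-weight trees n - 1 times."""
    tiebreak = count()
    heap = [(f, next(tiebreak), {c: ''}) for c, f in frequencies.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)      # two smallest-weight trees
        f2, _, right = heapq.heappop(heap)
        merged = {c: '0' + w for c, w in left.items()}         # left edge: 0
        merged.update({c: '1' + w for c, w in right.items()})  # right edge: 1
        heapq.heappush(heap, (f1 + f2, next(tiebreak), merged))
    return heap[0][2]                          # codeword map of the final tree
```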

  24. Example

character    A      B      C      D      _
frequency    0.35   0.10   0.20   0.20   0.15
codeword     11     100    00     01     101

average bits per character: 0.35·2 + 0.1·3 + 0.2·2 + 0.2·2 + 0.15·3 = 2.25
for fixed-length encoding: 3
compression ratio: (3 − 2.25)/3 × 100% = 25%
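A quick check of the slide’s arithmetic (the dictionaries mirror the table above):

```python
freq = {'A': 0.35, 'B': 0.1, 'C': 0.2, 'D': 0.2, '_': 0.15}
code = {'A': '11', 'B': '100', 'C': '00', 'D': '01', '_': '101'}

avg = sum(freq[c] * len(code[c]) for c in freq)
print(round(avg, 2))                  # 2.25 bits per character
print(round((3 - avg) / 3 * 100))     # 25 (% saved vs. 3-bit fixed-length code)
```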
