
CS200: Algorithm Analysis

Dive deep into Prim's and Kruskal's algorithms for constructing Minimum Spanning Trees efficiently. Follow real-world examples and learn the analysis behind these greedy algorithms step by step.


Presentation Transcript


  1. CS200: Algorithm Analysis

  2. PRIM’S ALGORITHM. A greedy algorithm. A is the edge set of the MST being constructed, so A always forms a tree. Start from an arbitrary root. At each step, find a light edge crossing the cut (Va, V - Va), where Va is the set of vertices that A is incident on, and add this edge to A (see image, next slide). How do we find the light edge? Idea: keep the vertices of V - Va in a priority queue, keyed by the weight of the minimum-weight edge connecting each of them to A.
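A minimal sketch of this idea in Python, assuming an adjacency-list representation (a dict mapping each vertex to a list of (weight, neighbor) pairs). The standard-library heapq module has no DecreaseKey, so this is the common "lazy" variant that pushes candidate edges and skips stale entries; it is an illustration, not the slide set's own implementation.

```python
import heapq

def prim_mst(graph, root):
    """Lazy Prim's: 'graph' maps each vertex to a list of (weight, neighbor) pairs."""
    in_tree = {root}                      # vertices already spanned by A
    mst_edges = []                        # the edge set A being built
    # Priority queue of candidate edges crossing the cut, keyed by weight.
    frontier = [(w, root, v) for w, v in graph[root]]
    heapq.heapify(frontier)
    while frontier and len(in_tree) < len(graph):
        w, u, v = heapq.heappop(frontier)
        if v in in_tree:                  # stale entry: v was added via a lighter edge
            continue
        in_tree.add(v)                    # (u, v) is a light edge crossing the cut
        mst_edges.append((u, v, w))
        for w2, x in graph[v]:
            if x not in in_tree:
                heapq.heappush(frontier, (w2, v, x))
    return mst_edges
```

Each edge enters the queue at most twice, so this lazy version runs in O(E lg E) time; the DecreaseKey-based versions analyzed on the following slides are the ones the slides cost out.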

  3. Prim’s Algorithm

  4.–16. Trace algorithm for example graph (a sequence of image-only slides stepping through Prim’s on the example graph).

  17.–27. Analysis of Prim’s (image-only slides working through the analysis).

  28. Priority-queue implementations for Prim’s: 1. Unsorted array: ExtractMin is a linear scan to find the minimum, while DecreaseKey is just an index-and-update; this gives O(V²) overall, which is best for dense graphs. 2. Binary heap: ExtractMin and DecreaseKey each require re-heapifying, at O(lg V) per operation, giving O(E lg V) overall.
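For contrast with the heap-based sketch earlier, here is a hedged sketch of option 1: the keys live in plain arrays indexed by vertex number, so DecreaseKey is a direct index-and-update and ExtractMin is a linear scan. It assumes vertices numbered 0..n-1 and an adjacency matrix with math.inf for missing edges, and it exhibits the O(V²) behavior that suits dense graphs.

```python
import math

def prim_dense(n, weight, root=0):
    """Array-based Prim's. 'weight[u][v]' is the edge weight, or math.inf if absent."""
    key = [math.inf] * n        # key[v] = min weight of an edge from v into the tree A
    parent = [None] * n
    in_tree = [False] * n
    key[root] = 0
    for _ in range(n):
        # ExtractMin: linear scan over the unsorted array of keys.
        u = min((v for v in range(n) if not in_tree[v]), key=lambda v: key[v])
        in_tree[u] = True
        for v in range(n):
            # DecreaseKey: just index and update.
            if not in_tree[v] and weight[u][v] < key[v]:
                key[v] = weight[u][v]
                parent[v] = u
    return [(parent[v], v, key[v]) for v in range(n) if v != root]
```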

  29. KRUSKAL’S ALGORITHM. Also a greedy algorithm. Idea: uses the notion of disjoint-set union (used for Lab 4). Disjoint set: S = {S1, S2, ..., Sn} such that Si ∩ Sj = ∅ for i ≠ j. S is a set of sets, and its member sets are pairwise disjoint; for example, {{1, 2}, {3}, {4, 5}}. Each set can be viewed as an equivalence class, where all members are equivalent and are represented by one “unique” member.

  30. Operations on a disjoint set: • MakeSet(x): S ← S ∪ {{x}}, where x is not in any set Si of S; x becomes the representative of the new singleton set. • Union(Si, Sj): S ← S − {Si, Sj} ∪ {Si ∪ Sj}; a representative for the merged set is chosen from Si ∪ Sj. • FindSet(x): returns the Si in S such that x ∈ Si. These operations are used in Kruskal’s algorithm.
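A minimal Python sketch of these three operations, using a plain dictionary of parent pointers (deliberately unoptimized; the implementations on the later slides improve on it). The names and the dictionary representation are assumptions for illustration.

```python
parent = {}                 # parent[x] == x  <=>  x is its set's representative

def make_set(x):
    parent[x] = x           # {x} becomes a new singleton set; x is its representative

def find_set(x):
    while parent[x] != x:   # walk up the parent pointers to the representative
        x = parent[x]
    return x

def union(x, y):
    rx, ry = find_set(x), find_set(y)
    if rx != ry:
        parent[ry] = rx     # merge: rx now represents the combined set
```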

  31. MSTKruskal(G, W)                    // uses the Union-Find algorithm
        T = empty set                     // T will be the MST when done
        for each v in V do
            MakeSet(v)                    // put each vertex in its own equivalence class
                                          // and make v the class representative
        sort edges in W by increasing edge weight w
        for each edge (u,v) in W do       // in sorted order
            if FindSet(u) != FindSet(v) then
                T = T union {(u,v)}
                Union(FindSet(u), FindSet(v))
        return T
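A hedged, runnable Python rendering of the pseudocode above, with a deliberately simple inline disjoint set (bare parent pointers; the heuristics from the later slides would replace it in practice). The edge format, a list of (w, u, v) triples, is an assumption for illustration.

```python
def mst_kruskal(vertices, edges):
    """edges: list of (w, u, v) triples; returns the MST edge set T."""
    parent = {v: v for v in vertices}        # MakeSet(v) for every vertex

    def find_set(x):                         # walk up to the set representative
        while parent[x] != x:
            x = parent[x]
        return x

    T = []
    for w, u, v in sorted(edges):            # edges in increasing weight order
        ru, rv = find_set(u), find_set(v)
        if ru != rv:                         # u and v lie in different trees
            T.append((u, v, w))
            parent[rv] = ru                  # Union(FindSet(u), FindSet(v))
    return T
```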

  32. Trace the algorithm on the example in the HTML notes. • The disjoint-set operations above take O(E · α(V, E)) in total, where α is a very slowly growing function (using the best known algorithm for disjoint-set union; see §21.3). • The overall runtime is O(E lg E), dominated by sorting the edges; if the edges are already sorted, the runtime is almost linear, O(E · α(V, E)).

  33. Implementations of a Disjoint Set: Linked List => lots of traversing

  34. Implementations of a Disjoint Set: Augmented linked list: each list stores its weight (size) so that Union can splice the smaller list into the larger one, and each node points back to the head so that Find-Set can locate the set representative quickly.
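A sketch of what such an augmented linked list might look like in Python, under the assumptions stated on the slide: each set records its weight (size) for the Union splice, and every node keeps a back pointer so Find-Set is constant time. The class and field names are invented for illustration.

```python
class LLSet:
    """One set: a singly linked list with head/tail pointers and a size ('weight')."""
    def __init__(self, node):
        self.head = self.tail = node
        self.size = 1

class Node:
    def __init__(self, value):
        self.value = value
        self.next = None
        self.set = None              # back pointer to the owning LLSet

def make_set(value):
    node = Node(value)
    node.set = LLSet(node)
    return node

def find_set(node):
    return node.set.head             # O(1): follow the back pointer to the representative

def union(a, b):
    sa, sb = a.set, b.set
    if sa is sb:
        return sa
    if sa.size < sb.size:            # weighted union: splice the smaller list into the larger
        sa, sb = sb, sa
    sa.tail.next = sb.head           # append sb's list after sa's tail
    sa.tail = sb.tail
    sa.size += sb.size
    node = sb.head                   # every moved node must point to its new set
    while node is not None:
        node.set = sa
        node = node.next
    return sa
```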

  35. Implementations of a Disjoint Set: Forest of Trees:

  36. Implementations of a Disjoint Set: Forest of Trees: Union by rank: the smaller tree is joined into the larger one; even so, the trees need not stay flat, which motivates the balancing ideas on the next slides.

  37. Implementations of a Disjoint Set: Forest of Trees: Balancing the trees: in a flat tree, Find-Set only needs to traverse one edge to reach the representative.

  38. Implementations of a Disjoint Set: Forest of Trees: Balancing the Trees using Path Compression
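A compact sketch of the forest representation with both heuristics from slides 36 to 38, union by rank and path compression (written recursively here for clarity); it is an illustration under those assumptions, not the course's reference implementation.

```python
class DisjointSetForest:
    """Forest of trees with union by rank and path compression."""
    def __init__(self):
        self.parent = {}
        self.rank = {}

    def make_set(self, x):
        self.parent[x] = x
        self.rank[x] = 0

    def find_set(self, x):
        if self.parent[x] != x:
            # Path compression: hang x (and its ancestors) directly off the root.
            self.parent[x] = self.find_set(self.parent[x])
        return self.parent[x]

    def union(self, x, y):
        rx, ry = self.find_set(x), self.find_set(y)
        if rx == ry:
            return
        # Union by rank: root the shorter tree under the taller one.
        if self.rank[rx] < self.rank[ry]:
            rx, ry = ry, rx
        self.parent[ry] = rx
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1
```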

  39. Implementations of a Disjoint Set: The punch line of this discussion is that, taken together, union by rank and path compression produce a spectacularly efficient implementation of the disjoint-set data structure. Theorem 16.2. On a disjoint-set forest with union by rank and path compression, any sequence of m operations, n of which are MAKE-SET operations, has worst-case running time Θ(m α(n)), where α is the inverse Ackermann function. Thus, the amortized worst-case running time (see next slide) of each operation is Θ(α(n)). If one makes the approximation α(n) = O(1), which is valid for literally all conceivable purposes, then the operations on a disjoint-set forest have O(1) amortized running time. Because the Ackermann function grows extremely rapidly, the inverse Ackermann function α grows extremely slowly (though it is true that α(n) → ∞ as n → ∞).

  40. Amortized Runtime Example

  41. Ackermann’s Function
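Slide 41's function in its standard two-argument (Ackermann-Péter) form, sketched recursively; its explosive growth (even A(4, 2) is astronomically large) is why the inverse function α of Theorem 16.2 grows so slowly, so that for all practical input sizes α(n) ≤ 4.

```python
def ackermann(m, n):
    """Ackermann-Péter function: grows faster than any primitive recursive function."""
    if m == 0:
        return n + 1
    if n == 0:
        return ackermann(m - 1, 1)
    return ackermann(m - 1, ackermann(m, n - 1))
```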

  42. Inverse Ackermann’s Function
