
Data Structure & Algorithm



Presentation Transcript


  1. Data Structure & Algorithm 11 – Minimal Spanning Tree JJCAO. Some slides borrowed from Prof. Yoram Moses & Princeton COS 226

  2. Weighted Graphs G = (V, E), with a weight function wt: E → R. The total weight of the graph is wt(G) = Σe∈E wt(e)
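
As a concrete illustration (not part of the original slides), a weighted graph can be stored simply as a list of (u, v, weight) edges, and wt(G) is then the sum of the edge weights. The small graph below is made up.

    # A small, hypothetical weighted graph G = (V, E) stored as an edge list.
    edges = [("a", "b", 4.0), ("b", "c", 2.0), ("a", "c", 7.5), ("c", "d", 1.0)]

    def wt(edges):
        """Total weight of the graph: wt(G) = sum of wt(e) over all edges e."""
        return sum(w for _, _, w in edges)

    print(wt(edges))  # 14.5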

  3. Sub-Graphs Note: G' is not a spanning sub-graph of G

  4. Minimum Spanning Tree • A Subgraph • A tree • Spans G • Of minimal weight

  5. MST Origin Otakar Boruvka (1926). • Electrical Power Company of Western Moravia in Brno. • Most economical construction of electrical power network. • Concrete engineering problem is now a cornerstone problem in combinatorial optimization.

  6. MST describes arrangement of nuclei in the epithelium for cancer research http://www.bccrc.ca/ci/ta01_archlevel.html

  7. Normal Consistency • [Hoppe et al. 1992] • Based on angles between unsigned normals • May produce errors on close-by surface sheets

  8. MST is a fundamental problem with diverse applications • Network design. • telephone, electrical, hydraulic, TV cable, computer, road • Approximation algorithms for NP-hard problems. • traveling salesperson problem, Steiner tree • Indirect applications. • max bottleneck paths • LDPC codes for error correction • image registration with Renyi entropy • learning salient features for real-time face verification • reducing data storage in sequencing amino acids in a protein • model locality of particle interactions in turbulent fluid flows • autoconfig protocol for Ethernet bridging to avoid cycles in a network • Cluster analysis.

  9. Minimum Spanning Tree on the Surface of a Sphere (5000 vertices)

  10. Minimum Spanning Tree Input: a connected, undirected graph G with a weight function wt on the edges. Goal: find a minimum-weight spanning tree of G. Fact: if all edge weights are distinct, the MST is unique. Brute force: try all possible spanning trees • problem 1: not so easy to implement • problem 2: far too many of them. Ex [Cayley, 1889]: there are V^(V-2) spanning trees on the complete graph on V vertices.

  11. Main algorithms of MST • Kruskal's algorithm • Prim's algorithm Both run in O(E lg V) using ordinary binary heaps. Both are greedy algorithms ⇒ globally optimal solution • …

  12. Two Greedy Algorithms • Kruskal's algorithm. Consider edges in ascending order of cost. Add the next edge to T unless doing so would create a cycle. • Prim's algorithm. Start with any vertex s and greedily grow a tree T from s. At each step, add the cheapest edge to T that has exactly one endpoint in T. "Greed is good. Greed is right. Greed works. Greed clarifies, cuts through, and captures the essence of the evolutionary spirit." - Gordon Gekko

  13. Cycle Property • Let T be a minimum spanning tree of a weighted graph G • Let e be an edge of G that is not in T and C be the cycle formed by adding e to T • For every edge f of C, weight(f) ≤ weight(e) Proof: • By contradiction • If weight(f) > weight(e), we can get a spanning tree of smaller weight by replacing f with e

  14. Edges cross the cut

  15. Cut (/Partition) Property Lemma: Let G = (V, E) and X ⊂ V. If e is a lightest edge connecting X and V-X, then e appears in some MST of G. Proof: • Let T be an MST of G • If T does not contain e, consider the cycle C formed by adding e to T and let f be an edge of C across the partition • By the cycle property, weight(f) ≤ weight(e) • Thus, weight(f) = weight(e) • We obtain another MST by replacing f with e. A locally optimal choice (of lightest edges) yields a globally optimal solution (the MST)

  16. Disjoint Set ADT

  17. An application of disjoint-set data structures

  18. Linked List Implementation

  19. Union in Linked List Implementation

  20. Worst-Case Example • n: the number of MAKE-SET operations, • m: the total number of MAKE-SET, UNION, and FIND-SET operations • we can easily construct a sequence of m operations on n objects that requires Θ(n^2) time

  21. Weighted Union Heuristic • Each set id includes the length of the list • In Union, append the shorter list at the end of the longer one. Theorem: Performing m > n operations takes O(m + n lg n) time
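
A minimal Python sketch of the linked-list representation with the weighted-union heuristic (class and method names are my own, not from the slides): each set is kept as a Python list, a dictionary maps every element to the list that contains it, and Union appends the shorter list onto the longer one.

    # Linked-list-style disjoint sets with the weighted-union heuristic (sketch).
    class LinkedListDSU:
        def __init__(self):
            self.set_of = {}              # element -> the list (set) containing it

        def make_set(self, x):
            self.set_of[x] = [x]

        def find_set(self, x):
            # The representative is the first element of x's list: O(1).
            return self.set_of[x][0]

        def union(self, x, y):
            a, b = self.set_of[x], self.set_of[y]
            if a is b:
                return
            if len(a) < len(b):           # len() plays the role of the stored length
                a, b = b, a
            a.extend(b)                   # append the shorter list to the longer one
            for z in b:                   # re-point the moved elements
                self.set_of[z] = a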

  22. Simple Forest Implementation Find-Set(x) - follow pointers from x up to the root. Union(c,f) - make c a child of f and return f
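
A sketch of the simple forest implementation (my own naming; Union is written on the roots of the two trees): each node stores a parent pointer, Find-Set follows parents up to the root, and Union hangs one root below the other.

    # Simple forest implementation of disjoint sets (no heuristics yet).
    parent = {}

    def make_set(x):
        parent[x] = x                     # each element starts as its own root

    def find_set(x):
        while parent[x] != x:             # follow parent pointers up to the root
            x = parent[x]
        return x

    def union(c, f):
        # Make the root of c's tree a child of the root of f's tree and return it.
        root_f = find_set(f)
        parent[find_set(c)] = root_f
        return root_f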

  23. Worst-Case Example: a degenerate tree that forms a chain of n nodes (n, …, 3, 2, 1)

  24. Weighted Union Heuristic • Each node includes a weight field: weight = # of elements in the sub-tree rooted at the node • Find-Set(x) - as before, O(depth(x)) • Union(x,y) - always attach the smaller tree below the root of the larger tree, O(1)
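
Extending the forest sketch with the weighted-union heuristic (again a sketch with my own names): each root carries the size of its subtree, and Union attaches the smaller tree below the root of the larger one.

    # Forest implementation with union by size (the weighted-union heuristic).
    parent, size = {}, {}

    def make_set(x):
        parent[x], size[x] = x, 1

    def find_set(x):
        while parent[x] != x:
            x = parent[x]
        return x

    def union(x, y):
        rx, ry = find_set(x), find_set(y)
        if rx == ry:
            return rx
        if size[rx] < size[ry]:           # always attach the smaller tree ...
            rx, ry = ry, rx
        parent[ry] = rx                   # ... below the root of the larger tree
        size[rx] += size[ry]
        return rx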

  25. Weighted Union Theorem: Any k-node tree created using the weighted-union heuristic has height ≤ lg(k). Proof: By induction on k; a node's depth increases only when its tree is attached below a tree at least as large, so the size of its tree at least doubles each time its depth grows by one. Find-Set Running Time: O(lg n)

  26. 2nd heuristic: Path Compression
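
Path compression only changes Find-Set: after locating the root, every node visited on the way is re-pointed directly at the root, so later finds on the same path become nearly constant time. A sketch (it assumes the same parent dictionary as above, repeated here so the snippet stands alone, and can be combined freely with union by size):

    parent = {}

    def make_set(x):
        parent[x] = x

    def find_set(x):
        # Two-pass path compression: find the root, then flatten the path to it.
        root = x
        while parent[root] != root:
            root = parent[root]
        while parent[x] != root:
            parent[x], x = root, parent[x]
        return root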

  27. The function lg* n lg* n = the number of times we have to apply log2 repeatedly until the result is at most 1. lg* 2 = 1, lg* 2^2 = 2, lg* 2^16 = lg* 65536 = 4, lg* 2^65536 = 5 ⇒ lg* n ≤ 5 for all practical values of n
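
A tiny helper that computes lg* n directly from this definition (an illustration, not from the slides):

    import math

    def lg_star(n):
        """Number of times log2 must be applied to n until the result is <= 1."""
        count = 0
        while n > 1:
            n = math.log2(n)
            count += 1
        return count

    print(lg_star(2), lg_star(4), lg_star(65536))  # 1 2 4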

  28. Theorem (Tarjan): If S is a sequence of O(n) Unions and Find-Sets, the worst-case time for S with – Weighted Unions, and – Path Compression is O(n lg n). The average time is O(lg n) per operation, as in the linked-list implementation

  29. Theorem (Tarjan): Let S be a sequence of O(n) Unions and Find-Sets. The worst-case time for S with – Weighted Unions, and – Path Compression is O(n α(n)). The average time is O(α(n)) per operation, where α(n) < 5 in practice

  30. Connected Components using Union-Find Reminder: • every node v is connected to itself • if u and v are in the same connected component, then v is connected to u and u is connected to v • connected components form a partition of the nodes and so are disjoint
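
A sketch of this standard use of union-find (the graph and helper names are hypothetical): make a set per vertex, union the endpoints of every edge, and two vertices then lie in the same component exactly when their roots match.

    # Connected components via union-find (sketch; uses union by size).
    def connected_components(vertices, edges):
        parent = {v: v for v in vertices}
        size = {v: 1 for v in vertices}

        def find(x):
            while parent[x] != x:
                x = parent[x]
            return x

        for u, v in edges:
            ru, rv = find(u), find(v)
            if ru != rv:
                if size[ru] < size[rv]:
                    ru, rv = rv, ru
                parent[rv] = ru
                size[ru] += size[rv]

        comps = {}                         # group vertices by their root
        for v in vertices:
            comps.setdefault(find(v), []).append(v)
        return list(comps.values())

    print(connected_components("abcde", [("a", "b"), ("b", "c"), ("d", "e")]))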

  31. MST-Kruskal Kruskal's algorithm for minimum spanning tree works by inserting edges in order of increasing cost, adding to the tree those edges which connect two previously disjoint components. The minimum spanning tree describes the cheapest network connecting all of a given set of vertices. (Figure: Kruskal's algorithm on a graph of distances between 128 North American cities.)
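
A compact Python sketch along these lines (edge format and names are my own): sort the edges by weight and add an edge whenever its endpoints still lie in different union-find components.

    # MST-Kruskal sketch: edges are (weight, u, v) tuples.
    def mst_kruskal(vertices, edges):
        parent = {v: v for v in vertices}

        def find(x):                       # find with path halving
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x

        tree = []
        for w, u, v in sorted(edges):      # edges in order of increasing cost
            ru, rv = find(u), find(v)
            if ru != rv:                   # endpoints in disjoint components
                parent[ru] = rv
                tree.append((u, v, w))
        return tree

    edges = [(4, "a", "b"), (2, "b", "c"), (7, "a", "c"), (1, "c", "d")]
    print(mst_kruskal("abcd", edges))      # [('c','d',1), ('b','c',2), ('a','b',4)]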

  32. Example

  33. MST-Kruskal

  34. MST-Kruskal

  35. MST-Kruskal Running Time: O(E lg E) = O(E lg V), dominated by sorting the edges

  36. MST-Prim-Jarnik

  37. Example

  38. MST-Prim-Jarnik

  39. MST-Prim

  40. MST-Prim

  41. MST-Prim

  42. Decrease_key(v,x) We use a min-heap to hold the edges in G-T. How can we implement Decrease_key(v,x)? Simple solution: • Change the key value for v • Follow the strategy of Heap_insert, sifting up from v • Cost: O(lg V)
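
A sketch of Decrease_key on an array-based binary min-heap, following the sift-up idea described above (the layout and names are illustrative; a real implementation for Prim would also track each vertex's position in the heap):

    # Binary min-heap stored in a list; heap[i]'s parent is heap[(i - 1) // 2].
    def decrease_key(heap, i, new_key):
        """Lower the key at index i to new_key and sift it up: O(lg V)."""
        assert new_key <= heap[i], "new key must not be larger than the old one"
        heap[i] = new_key
        while i > 0 and heap[(i - 1) // 2] > heap[i]:
            heap[i], heap[(i - 1) // 2] = heap[(i - 1) // 2], heap[i]  # swap with parent
            i = (i - 1) // 2

    h = [1, 5, 3, 9, 7]       # a valid min-heap
    decrease_key(h, 3, 0)     # index 3 holds 9; lower it to 0
    print(h)                  # [0, 1, 3, 5, 7]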

  43. MST-Prim Running Time: O(E lg V) using a binary min-heap
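
A sketch of MST-Prim that matches this bound using Python's heapq (graph format and names are my own). As a design choice, instead of decrease_key it lazily pushes duplicate entries and skips vertices already in the tree, which keeps the same O(E lg V) asymptotics.

    import heapq

    # adj: vertex -> list of (weight, neighbour) pairs of an undirected graph.
    def mst_prim(adj, s):
        in_tree, tree = set(), []
        pq = [(0, s, s)]                          # (weight, vertex, tree endpoint offering it)
        while pq and len(in_tree) < len(adj):
            w, v, u = heapq.heappop(pq)           # cheapest candidate edge into the tree
            if v in in_tree:
                continue                          # stale entry: v joined the tree earlier
            in_tree.add(v)
            if v != s:
                tree.append((u, v, w))
            for wx, x in adj[v]:
                if x not in in_tree:
                    heapq.heappush(pq, (wx, x, v))
        return tree

    adj = {"a": [(4, "b"), (7, "c")],
           "b": [(4, "a"), (2, "c")],
           "c": [(7, "a"), (2, "b"), (1, "d")],
           "d": [(1, "c")]}
    print(mst_prim(adj, "a"))                     # [('a','b',4), ('b','c',2), ('c','d',1)]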

  44. Does a linear-time MST algorithm exist?

  45. Euclidean MST Given N points in the plane, find the MST connecting them, where the distances between point pairs are their Euclidean distances. Brute force. Compute ~ N^2/2 distances and run Prim's algorithm. Ingenuity. Exploit geometry and do it in ~ c N lg N.
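
The brute-force approach can be sketched directly: treat the points as a complete graph and run a dense O(N^2) Prim, which examines each of the ~ N^2/2 pairwise distances (the point set below is made up).

    import math

    def euclidean_mst_brute_force(points):
        """Dense Prim on the complete graph of pairwise Euclidean distances."""
        n = len(points)
        dist_to = [math.inf] * n              # cheapest distance to the growing tree
        edge_to = [None] * n
        in_tree = [False] * n
        dist_to[0] = 0.0
        mst = []
        for _ in range(n):
            v = min((i for i in range(n) if not in_tree[i]), key=lambda i: dist_to[i])
            in_tree[v] = True
            if edge_to[v] is not None:
                mst.append((edge_to[v], v, dist_to[v]))
            for u in range(n):                # relax the distances leaving v
                if not in_tree[u]:
                    d = math.dist(points[v], points[u])
                    if d < dist_to[u]:
                        dist_to[u], edge_to[u] = d, v
        return mst

    print(euclidean_mst_brute_force([(0, 0), (1, 0), (0, 1), (5, 5)]))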

  46. Scientific application: clustering k-clustering. Classify a set of objects into k coherent groups. Distance function. Numeric value specifying "closeness" of two objects. Goal. Divide into clusters so that objects in different clusters are far apart. Applications. • Routing in mobile ad hoc networks. • Document categorization for web search. • Similarity searching in medical image databases. • Skycat: cluster 10^9 sky objects into stars, quasars, galaxies. (Figure: outbreak of cholera deaths in London in the 1850s; Nina Mishra)

  47. Single-link clustering k-clustering. Classify a set of objects into k coherent groups. Distance function. Numeric value specifying "closeness" of two objects. Goal. Divide into clusters so that objects in different clusters are far apart. Single link. Distance between two clusters equals the distance between the two closest objects (one in each cluster). Single-link clustering. Given an integer k, find a k-clustering that maximizes the distance between the two closest clusters.

  48. Single-link clustering algorithm "Well-known" algorithm for single-link clustering: • Form V clusters of one object each. • Find the closest pair of objects such that each object is in a different cluster, and merge the two clusters. • Repeat until there are exactly k clusters. Observation. This is Kruskal's algorithm (stop when there are k connected components). Alternate solution. Run Prim's algorithm and delete the k-1 max-weight edges.
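
Following that observation, a sketch of single-link clustering as Kruskal's loop stopped at k components (data layout and names are illustrative):

    # Single-link k-clustering: run Kruskal's loop until only k components remain.
    def single_link_clusters(vertices, edges, k):
        parent = {v: v for v in vertices}

        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x

        components = len(vertices)
        for w, u, v in sorted(edges):          # closest pairs of objects first
            if components == k:
                break
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv                # merge the two closest clusters
                components -= 1

        clusters = {}
        for v in vertices:
            clusters.setdefault(find(v), []).append(v)
        return list(clusters.values())

    edges = [(1, "a", "b"), (2, "c", "d"), (9, "b", "c"), (10, "a", "d")]
    print(single_link_clusters("abcd", edges, 2))   # [['a', 'b'], ['c', 'd']]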
