Data Structure & Algorithm 11 – Minimal Spanning Tree JJCAO Steal some from Prof. Yoram Moses & Princeton COS 226
Weighted Graphs G = (V, E) with a weight function wt: E → R; wt(G) = Σ_{e∈E} wt(e), the sum of all edge weights
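A minimal sketch (not from the slides) of how such a weighted graph might be stored as an adjacency list in Python; the example edges and the names add_edge / total_weight are illustrative assumptions.

```python
from collections import defaultdict

def add_edge(adj, u, v, w):
    """Store one undirected weighted edge (u, v) with weight w."""
    adj[u].append((v, w))
    adj[v].append((u, w))

def total_weight(adj):
    """wt(G) = sum of wt(e) over all edges; each edge is stored twice, hence the /2."""
    return sum(w for u in adj for _, w in adj[u]) / 2

adj = defaultdict(list)
for u, v, w in [("a", "b", 4), ("b", "c", 2), ("a", "c", 7)]:
    add_edge(adj, u, v, w)
print(total_weight(adj))  # 13.0
```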
Sub-Graphs Note: G' is not a spanning sub-graph of G (a spanning sub-graph must contain every vertex of G)
Minimum Spanning Tree • A Subgraph • A tree • Spans G • Of minimal weight
MST Origin Otakar Borůvka (1926). • Electrical Power Company of Western Moravia in Brno. • Most economical construction of electrical power network. • This concrete engineering problem is now a cornerstone problem in combinatorial optimization.
MST describes arrangement of nuclei in the epithelium for cancer research http://www.bccrc.ca/ci/ta01_archlevel.html
Normal Consistency • [Hoppe et al. 1992] • Orientation propagated along an MST, with edge weights based on angles between unsigned normals • May produce errors on close-by surface sheets
MST is a fundamental problem with diverse applications • Network design. • telephone, electrical, hydraulic, TV cable, computer, road • Approximation algorithms for NP-hard problems. • traveling salesperson problem, Steiner tree • Indirect applications. • max bottleneck paths • LDPC codes for error correction • image registration with Renyi entropy • learning salient features for real-time face verification • reducing data storage in sequencing amino acids in a protein • model locality of particle interactions in turbulent fluid flows • auto-configuration protocol for Ethernet bridging to avoid cycles in a network • Cluster analysis.
Minimum Spanning Tree Input: a connected, undirected graph - G, with a weight function on the edges – wt Goal: find a Minimum-weight Spanning Tree for G Fact: If all edge weights are distinct, the MST is unique Brute force: Try all possible spanning trees • problem 1: not so easy to implement • problem 2: far too many of them Ex: [Cayley, 1889]: V^{V-2} spanning trees on the complete graph on V vertices.
Main algorithms of MST • Kruskal's algorithm • Prim's algorithm Both run in O(E lg V) using ordinary binary heaps Both are greedy algorithms, yet they produce the globally optimal solution • …
Two Greedy Algorithms • Kruskal's algorithm. Consider edges in ascending order of cost. Add the next edge to T unless doing so would create a cycle. • Prim's algorithm. Start with any vertex s and greedily grow a tree T from s. At each step, add the cheapest edge to T that has exactly one endpoint in T. "Greed is good. Greed is right. Greed works. Greed clarifies, cuts through, and captures the essence of the evolutionary spirit." - Gordon Gekko
Cycle Property • Let T be a minimum spanning tree of a weighted graph G • Let e be an edge of G that is not in T and C be the cycle formed by adding e to T • For every edge f of C, weight(f) ≤ weight(e) Proof: • By contradiction • If weight(f) > weight(e), we can get a spanning tree of smaller weight by replacing f with e
Cut (/Partition) Property Lemma: Let G = (V,E) and X ⊂ V. If e is a lightest edge connecting X and V-X, then e appears in some MST of G. Proof: • Let T be an MST of G • If T does not contain e, consider the cycle C formed by e with T and let f be an edge of C across the partition • By the cycle property, weight(f) ≤ weight(e); since e is a lightest edge across the partition, weight(e) ≤ weight(f) • Thus, weight(f) = weight(e) • We obtain another MST by replacing f with e The locally optimal choice (of lightest edges) therefore yields the globally optimal solution (an MST)
Worst-Case Example • n: the number of MAKE-SET operations • m: the total number of MAKE-SET, UNION, and FIND-SET operations • we can easily construct a sequence of m operations on n objects that requires Θ(n²) time
Weighted Union Heuristic • Each set id includes the length of the list • In Union - append the shorter list to the end of the longer one Theorem: Performing m > n operations takes O(m + n lg n) time
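A rough sketch of the linked-list representation with this heuristic, under assumed names (member_of, list_make_set, list_find_set, list_union): each element records which list currently holds it, and union appends the shorter list to the longer one.

```python
member_of = {}                    # element -> the Python list that currently holds it

def list_make_set(x):
    member_of[x] = [x]

def list_find_set(x):
    return member_of[x][0]        # use the first element as the set's representative

def list_union(x, y):
    a, b = member_of[x], member_of[y]
    if a is b:
        return                    # already the same set
    if len(a) < len(b):           # weighted union: append the shorter list to the longer
        a, b = b, a
    a.extend(b)
    for z in b:                   # re-point moved elements; this is the cost the O(m + n lg n) bound charges
        member_of[z] = a
```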
Simple Forest Implementation Find-Set(x) - follow pointers from x up to the root Union(c,f) - make c a child of f and return f
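A bare-bones sketch of this forest representation (no balancing heuristic yet); the module-level parent dict is an assumed simplification of the pointer structure on the slide.

```python
parent = {}

def make_set(x):
    parent[x] = x              # a fresh element is the root of its own one-node tree

def find_set(x):
    while parent[x] != x:      # follow parent pointers up to the root
        x = parent[x]
    return x

def union(c, f):
    """Make the root of c's tree a child of the root of f's tree; return that root."""
    root_c, root_f = find_set(c), find_set(f)
    if root_c != root_f:
        parent[root_c] = root_f
    return root_f
```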
Worst-Case Example: without a balancing heuristic, unions can produce a single chain n, …, 3, 2, 1 of depth n-1, so one Find-Set costs Θ(n)
Weighted Union Heuristic • Each node includes a weight field: weight = # elements in the sub-tree rooted at the node • Find-Set(x) - as before, O(depth(x)) • Union(x,y) - always attach the smaller tree below the root of the larger tree, O(1)
Weighted Union Theorem: Any k-node tree created using the weighted-union heuristic has height ≤ lg k Proof: By induction on k - a node's depth grows by 1 only when its tree is the smaller one in a union, and then the size of its tree at least doubles Find-Set Running Time: O(lg n)
The function lg n lg n = the number of times we have to halve n to reach 1 lg 2 = 1 lg 2^2 = 2 lg 2^16 = lg 65536 = 16 ⇒ lg n < 64 for any n that fits in a 64-bit word, so lg n is tiny for all practical values of n
Theorem (Tarjan): Let S be a sequence of O(n) Unions and Find-Sets. The worst-case time for S with – Weighted Unions, and – Path Compression is O(n lg n) The average time is O(lg n) per operation in the Linked-List Implementation
Theorem (Tarjan): Let S be a sequence of O(n) Unions and Find-Sets. In the forest implementation, the worst-case time for S with – Weighted Unions, and – Path Compression is O(n α(n)) The average time is O(α(n)) per operation, and α(n) < 5 for all practical values of n
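Putting the two heuristics together, a sketch of a complete union-find structure (the class name DisjointSets is an assumption, not from the slides); the MST sketches further below reuse it.

```python
class DisjointSets:
    """Union-Find forest with weighted union (union by size) and path compression."""

    def __init__(self, elements):
        self.parent = {x: x for x in elements}
        self.size = {x: 1 for x in elements}   # weight field: # elements in the sub-tree

    def find_set(self, x):
        root = x
        while self.parent[root] != root:       # walk up to the root
            root = self.parent[root]
        while self.parent[x] != root:          # path compression: point the whole path at the root
            self.parent[x], x = root, self.parent[x]
        return root

    def union(self, x, y):
        rx, ry = self.find_set(x), self.find_set(y)
        if rx == ry:
            return False                       # already in the same set
        if self.size[rx] < self.size[ry]:      # weighted union: smaller tree below the larger
            rx, ry = ry, rx
        self.parent[ry] = rx
        self.size[rx] += self.size[ry]
        return True
```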
Connected Components using Union-Find Reminder: • Every node v is connected to itself • If u and v are in the same connected component, then u is connected to v and v is connected to u • Connected components form a partition of the nodes, and so are disjoint
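A usage sketch: connected components fall out of union-find by unioning the endpoints of every edge and then grouping vertices by their root. It reuses the hypothetical DisjointSets class sketched above.

```python
def connected_components(vertices, edges):
    ds = DisjointSets(vertices)
    for u, v in edges:
        ds.union(u, v)                        # u and v are connected, so merge their sets
    comps = {}
    for v in vertices:
        comps.setdefault(ds.find_set(v), []).append(v)
    return list(comps.values())

print(connected_components("abcde", [("a", "b"), ("c", "d")]))
# e.g. [['a', 'b'], ['c', 'd'], ['e']]
```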
MST-Kruskal Kruskal's algorithm for minimum spanning tree works by inserting edges in order of increasing cost, adding to the tree those edges that connect two previously disjoint components. The minimum spanning tree describes the cheapest network connecting all of a given set of vertices. Kruskal's algorithm on a graph of distances between 128 North American cities
MST-Kruskal Running Time: O(E lg E) = O(E lg V), dominated by sorting the edges; the union-find operations are nearly constant amortized
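A sketch of MST-Kruskal in this spirit, assuming the DisjointSets class from the union-find sketch above; the edge format (wt, u, v) and the name kruskal_mst are illustrative.

```python
def kruskal_mst(vertices, weighted_edges):
    """weighted_edges: iterable of (wt, u, v) for an undirected graph. Returns the MST edges."""
    vertices = list(vertices)
    ds = DisjointSets(vertices)                # hypothetical class sketched earlier
    mst = []
    for wt, u, v in sorted(weighted_edges):    # edges in increasing order of cost
        if ds.union(u, v):                     # accepted only if it joins two components (no cycle)
            mst.append((wt, u, v))
            if len(mst) == len(vertices) - 1:  # a spanning tree has V - 1 edges
                break
    return mst
```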
Decrease_key(v,x) We use a min-heap to hold the edges in G-T How can we implement Decrease_key(v,x)? Simple solution: • Change the value for v • Follow the strategy of Heap_insert from v upwards • Cost: O(lg V)
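A sketch of that simple solution on an array-based binary min-heap: overwrite the key, then sift the entry up as in Heap_insert. The structures heap (a list of [key, vertex] pairs) and pos (vertex → index in heap) are assumed bookkeeping, not from the slides.

```python
def decrease_key(heap, pos, v, new_key):
    """Lower v's key and sift it up, as in Heap_insert. Cost: O(lg V)."""
    i = pos[v]
    assert new_key <= heap[i][0], "key is only allowed to decrease"
    heap[i][0] = new_key
    while i > 0 and heap[(i - 1) // 2][0] > heap[i][0]:   # parent has a larger key: swap upwards
        p = (i - 1) // 2
        heap[i], heap[p] = heap[p], heap[i]
        pos[heap[i][1]], pos[heap[p][1]] = i, p           # keep the vertex -> index map in sync
        i = p
```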
MST-Prim Running Time: O(E lg V) with an ordinary binary min-heap (each edge can trigger one Decrease_key of cost O(lg V))
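A sketch of MST-Prim; instead of an indexed heap with Decrease_key, this version uses Python's heapq with lazy deletion (stale entries are simply skipped), which gives the same O(E lg V) bound. The adjacency-list format matches the weighted-graph sketch near the start; prim_mst is an assumed name.

```python
import heapq

def prim_mst(adj, s):
    """adj[u] = list of (v, wt). Greedily grow a tree from s by the cheapest crossing edge."""
    in_tree, mst = {s}, []
    heap = [(wt, s, v) for v, wt in adj[s]]
    heapq.heapify(heap)
    while heap and len(in_tree) < len(adj):
        wt, u, v = heapq.heappop(heap)
        if v in in_tree:                      # stale entry: v was already reached by a cheaper edge
            continue
        in_tree.add(v)
        mst.append((wt, u, v))
        for w, wt2 in adj[v]:
            if w not in in_tree:
                heapq.heappush(heap, (wt2, v, w))
    return mst
```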
Euclidean MST Given N points in the plane, find the MST connecting them, where the distances between point pairs are their Euclidean distances. Brute force. Compute ~ N²/2 distances and run Prim's algorithm. Ingenuity. Exploit geometry and do it in ~ c N lg N.
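A brute-force sketch in the sense above: build all ~ N²/2 pairwise distances as a complete graph and hand them to one of the MST sketches (Kruskal here; Prim would do equally well). math.dist needs Python 3.8+, and euclidean_mst is an assumed name.

```python
from itertools import combinations
from math import dist

def euclidean_mst(points):
    """points: list of (x, y) pairs. O(N^2) edges, fed to the kruskal_mst sketch above."""
    n = len(points)
    edges = [(dist(points[i], points[j]), i, j) for i, j in combinations(range(n), 2)]
    return kruskal_mst(range(n), edges)
```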
Scientific application: clustering k-clustering. Divide a set of objects into k coherent groups. Distance function. Numeric value specifying "closeness" of two objects. Goal. Divide into clusters so that objects in different clusters are far apart. Applications. • Routing in mobile ad hoc networks. • Document categorization for web search. • Similarity searching in medical image databases. • Skycat: cluster 10^9 sky objects into stars, quasars, galaxies. (Figure: outbreak of cholera deaths in London in the 1850s - Nina Mishra)
Single-link clustering k-clustering. Divide a set of objects into k coherent groups. Distance function. Numeric value specifying "closeness" of two objects. Goal. Divide into clusters so that objects in different clusters are far apart. Single link. Distance between two clusters equals the distance between the two closest objects (one in each cluster). Single-link clustering. Given an integer k, find a k-clustering that maximizes the distance between the two closest clusters.
Single-link clustering algorithm "Well-known" algorithm for single-link clustering: • Form V clusters of one object each. • Find the closest pair of objects such that each object is in a different cluster, and merge the two clusters. • Repeat until there are exactly k clusters. Observation. This is Kruskal's algorithm (stop when there are k connected components). Alternate solution. Run Prim's algorithm and delete the k-1 maximum-weight edges.
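A sketch of that well-known procedure as Kruskal's algorithm stopped at k components, again reusing the hypothetical DisjointSets class and (wt, u, v) edge triples from the earlier sketches.

```python
def single_link_clusters(vertices, weighted_edges, k):
    """Merge the closest clusters (cheapest edges first) until exactly k clusters remain."""
    vertices = list(vertices)
    ds = DisjointSets(vertices)
    clusters = len(vertices)
    for wt, u, v in sorted(weighted_edges):
        if clusters == k:
            break
        if ds.union(u, v):            # merging two different clusters reduces the count by one
            clusters -= 1
    groups = {}
    for v in vertices:
        groups.setdefault(ds.find_set(v), []).append(v)
    return list(groups.values())
```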