
CS 3343: Analysis of Algorithms

This lecture introduces graphs: graph variations and properties, representing graphs with adjacency matrices and adjacency lists, and the minimum spanning tree problem with Prim’s and Kruskal’s algorithms.

Presentation Transcript


  1. CS 3343: Analysis of Algorithms Introduction to Graphs

  2. Uniform-profit restaurant location problem • Goal: maximize the number of restaurants opened • Subject to: distance constraint (min-separation >= 10) • [Figure: candidate locations along a road with the distances between neighboring sites.]

  3. Events scheduling problem • Goal: maximize the number of non-conflicting events • [Figure: events e1–e9 laid out along a time axis.]

  4. Fractional knapsack problem • Goal: maximize value without exceeding bag capacity • Weight limit: 10 LB

     Item   Weight (LB)   Value ($)   $ / LB
     1      2             2           1
     2      4             3           0.75
     3      3             3           1
     4      5             6           1.2
     5      2             4           2
     6      6             9           1.5

  5. Example • Goal: maximize value without exceeding bag capacity • Weight limit: 10 LB • Greedy by $ / LB: take item 5 (2 LB), item 6 (6 LB), and 2 LB of item 4 • Weight: 2 + 6 + 2 = 10 LB • Value: 4 + 9 + 1.2*2 = 15.4
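
To make the greedy calculation above concrete, here is a minimal sketch in Python (not from the slides; the function name and item encoding are illustrative), assuming items are given as (weight, value) pairs and that fractions of an item may be taken:

```python
def fractional_knapsack(items, capacity):
    """items: list of (weight, value) pairs; returns the maximum total value."""
    total = 0.0
    # Greedy: consider items in decreasing order of value density ($ / LB).
    for weight, value in sorted(items, key=lambda wv: wv[1] / wv[0], reverse=True):
        if capacity <= 0:
            break
        take = min(weight, capacity)       # whole item, or the fraction that still fits
        total += value * (take / weight)
        capacity -= take
    return total

# The table from slide 4: (weight in LB, value in $).
items = [(2, 2), (4, 3), (3, 3), (5, 6), (2, 4), (6, 9)]
print(fractional_knapsack(items, 10))      # ~15.4, matching slide 5
```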

  6. The remaining lectures • Graph algorithms • Very important in practice • Tons of computational problems can be defined in terms of graphs • We’ll study a few interesting ones • Minimum spanning tree • Shortest path • Graph search • Topological sort, connected components

  7. Graphs • A graph G = (V, E) • V = set of vertices • E = set of edges = subset of V × V • Thus |E| = O(|V|²) • Example: Vertices: {1, 2, 3, 4}, Edges: {(1, 2), (2, 3), (1, 3), (4, 3)} [Figure: this four-vertex graph drawn out.]
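
To make the notation concrete, here is the slide’s example graph written out as plain Python sets (an illustrative encoding, not part of the course material):

```python
# The example graph: vertices {1, 2, 3, 4}, edges {(1,2), (2,3), (1,3), (4,3)}.
V = {1, 2, 3, 4}
E = {(1, 2), (2, 3), (1, 3), (4, 3)}

# E is a subset of V x V, so |E| is at most |V|^2.
assert all(u in V and v in V for (u, v) in E)
assert len(E) <= len(V) ** 2
```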

  8. Graph Variations (1) • Directed / undirected: • In an undirected graph: • Edge (u,v) ∈ E implies edge (v,u) ∈ E • Road networks between cities • In a directed graph: • Edge (u,v): u→v does not imply v→u • Street networks in downtown • Degree of vertex v: • The number of edges adjacent to v • For a directed graph, there are an in-degree and an out-degree

  9. [Figure: the example graph drawn directed and undirected. In the directed version, vertex 3 has in-degree = 3 and out-degree = 0; in the undirected version, it has degree = 3.]

  10. Graph Variations (2) • Weighted / unweighted: • In a weighted graph, each edge or vertex has an associated weight (numerical value) • E.g., a road map: edges might be weighted with distance [Figure: the example graph drawn unweighted and weighted, with edge weights 0.3, 1.2, 0.4, 1.9.]

  11. Graph Variations (3) • Connected / disconnected: • A connected graph has a path from every vertex to every other • A directed graph is strongly connected if there is a directed path between any two vertices [Figure: the directed example graph, which is connected but not strongly connected.]

  12. Graph Variations (4) • Dense / sparse: • Graphs are sparse when the number of edges is linear in the number of vertices • |E| ≈ O(|V|) • Graphs are dense when the number of edges is quadratic in the number of vertices • |E| ≈ O(|V|²) • Most graphs of interest are sparse • If you know you are dealing with dense or sparse graphs, different data structures may make sense

  13. Representing Graphs • Assume V = {1, 2, …, n} • An adjacency matrix represents the graph as an n × n matrix A: • A[i, j] = 1 if edge (i, j) ∈ E, = 0 if edge (i, j) ∉ E • For a weighted graph: • A[i, j] = w(i, j) if edge (i, j) ∈ E, = 0 if edge (i, j) ∉ E • For an undirected graph: • Matrix is symmetric: A[i, j] = A[j, i]

  14. Graphs: Adjacency Matrix • Example: [Figure: the four-vertex example graph.]

  15. Graphs: Adjacency Matrix • Example: [Figure: the same graph.] • How much storage does the adjacency matrix require? A: O(V²)

  16. Graphs: Adjacency Matrix • Example (undirected graph):

     A   1  2  3  4
     1   0  1  1  0
     2   1  0  1  0
     3   1  1  0  1
     4   0  0  1  0

  17. Graphs: Adjacency Matrix • Example (weighted graph):

     A   1  2  3  4
     1   0  5  6  0
     2   5  0  9  0
     3   6  9  0  4
     4   0  0  4  0
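
As a concrete companion to slides 13–17, here is one way to build these adjacency matrices in Python (a sketch; function and variable names are mine, not the course’s):

```python
def adjacency_matrix(n, edges, weighted=False):
    """edges: (u, v) pairs, or (u, v, w) triples when weighted=True; vertices are 1..n."""
    A = [[0] * (n + 1) for _ in range(n + 1)]   # row/column 0 unused
    for e in edges:
        u, v = e[0], e[1]
        w = e[2] if weighted else 1
        A[u][v] = w
        A[v][u] = w                             # undirected: keep A symmetric
    return A

# Undirected graph of slide 16.
A = adjacency_matrix(4, [(1, 2), (1, 3), (2, 3), (3, 4)])
# Weighted graph of slide 17.
W = adjacency_matrix(4, [(1, 2, 5), (1, 3, 6), (2, 3, 9), (3, 4, 4)], weighted=True)
print(A[1][2], W[2][3])                         # 1 9
```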

  18. Graphs: Adjacency Matrix • Time to answer whether there is an edge between vertices u and v: Θ(1) • Memory required: Θ(n²) regardless of |E| • Usually too much storage for large graphs • But can be very efficient for small graphs • Most large interesting graphs are sparse • E.g., road networks (only a few roads meet at any junction) • For this reason the adjacency list is often a more appropriate representation

  19. Graphs: Adjacency List • Adjacency list: for each vertex v ∈ V, store a list of vertices adjacent to v • Example (directed four-vertex graph): • Adj[1] = {2, 3} • Adj[2] = {3} • Adj[3] = {} • Adj[4] = {3} • Variation: can also keep a list of the edges coming into each vertex

  20. Graph representations • Adjacency list (directed example): Adj[1] = 2 → 3, Adj[2] = 3, Adj[3] = (empty), Adj[4] = 3 • How much storage does the adjacency list require? A: O(V+E)

  21. Graph representations • Undirected graph:

     Adjacency matrix:
     A   1  2  3  4
     1   0  1  1  0
     2   1  0  1  0
     3   1  1  0  1
     4   0  0  1  0

     Adjacency lists:
     Adj[1] = 2 → 3
     Adj[2] = 1 → 3
     Adj[3] = 1 → 2 → 4
     Adj[4] = 3

  22. Graph representations • Weighted graph:

     Adjacency matrix:
     A   1  2  3  4
     1   0  5  6  0
     2   5  0  9  0
     3   6  9  0  4
     4   0  0  4  0

     Adjacency lists (neighbor, weight):
     Adj[1] = (2,5) → (3,6)
     Adj[2] = (1,5) → (3,9)
     Adj[3] = (1,6) → (2,9) → (4,4)
     Adj[4] = (3,4)

  23. Graphs: Adjacency List • How much storage is required? • For directed graphs: • |Adj[v]| = out-degree(v) • Total # of items in adjacency lists is Σ out-degree(v) = |E| • For undirected graphs: • |Adj[v]| = degree(v) • Total # of items in adjacency lists is Σ degree(v) = 2|E| • So: adjacency lists take Θ(V+E) storage • Time needed to test whether edge (u, v) ∈ E is O(n)
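
A minimal adjacency-list sketch in Python for the directed example of slide 19 (a dict of lists is one common encoding; names are illustrative):

```python
from collections import defaultdict

def adjacency_list(edges, directed=True):
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        if not directed:
            adj[v].append(u)
    return adj

# Directed example: Adj[1] = [2, 3], Adj[2] = [3], Adj[3] = [], Adj[4] = [3].
adj = adjacency_list([(1, 2), (1, 3), (2, 3), (4, 3)], directed=True)

# Testing whether (u, v) is an edge scans Adj[u]: O(out-degree(u)), O(n) in the worst case.
print(3 in adj[1], 1 in adj[3])   # True False
```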

  24. Tradeoffs between the two representations (|V| = n, |E| = m): • Storage: adjacency matrix Θ(n²), adjacency list Θ(n + m) • Test whether (u, v) is an edge: matrix Θ(1), list O(n) • Both representations are very useful and have different properties.

  25. Minimum Spanning Tree • Problem: given a connected, undirected, weighted graph: [Figure: an example graph with edge weights 6, 4, 5, 9, 14, 2, 10, 15, 3, 8.]

  26. Minimum Spanning Tree • Problem: given a connected, undirected, weighted graph, find a spanning tree using edges that minimize the total weight • A spanning tree is a tree that connects all vertices • Number of edges = ? (|V| − 1) • A spanning tree has no designated root. [Figure: the same weighted graph.]

  27. How to find MST? • Connect every node to the closest node? • Does not guarantee a spanning tree

  28. Minimum Spanning Tree • MSTs satisfy the optimal substructure property: an optimal tree is composed of optimal subtrees • Let T be an MST of G with an edge (u,v) in the middle • Removing (u,v) partitions T into two trees T1 and T2 • w(T) = w(u,v) + w(T1) + w(T2) • Claim 1: T1 is an MST of G1 = (V1, E1), and T2 is an MST of G2 = (V2, E2) • Proof by contradiction: • If T1 is not optimal, we can replace T1 with a better spanning tree, T1’ • T1’, T2 and (u, v) form a new spanning tree T’ • w(T’) < w(T). Contradiction. [Figure: T split by edge (u,v) into T1 and T2, with T1 replaced by T1’.]

  29. Minimum Spanning Tree • MSTs satisfy the optimal substructure property: an optimal tree is composed of optimal subtrees • Let T be an MST of G with an edge (u,v) in the middle • Removing (u,v) partitions T into two trees T1 and T2 • w(T) = w(u,v) + w(T1) + w(T2) • Claim 2: (u, v) is the lightest edge connecting G1 = (V1, E1) and G2 = (V2, E2) • Proof by contradiction: • If (u, v) is not the lightest edge, we can remove it and reconnect T1 and T2 with a lighter edge (x, y) • T1, T2 and (x, y) form a new spanning tree T’ • w(T’) < w(T). Contradiction. [Figure: T1 and T2 reconnected by the lighter edge (x, y) instead of (u, v).]

  30. Algorithms • Generic idea: • Compute MSTs for sub-graphs • Connect two MSTs for sub-graphs with the lightest edge • Two of the most well-known algorithms • Prim’s algorithm • Kruskal’s algorithm • Let’s first talk about the ideas behind the algorithms without worrying about the implementation and analysis

  31.–33. Prim’s algorithm • Basic idea: • Start from an arbitrary single node • An MST for a single node has no edges • Gradually build up a single larger and larger MST [Figures: the tree grows one vertex at a time; the legend distinguishes fully explored nodes, discovered but not fully explored nodes, and not yet discovered nodes.]

  34. Prim’s algorithm in words • Randomly pick a vertex as the initial tree T • Gradually expand T into an MST: • For each vertex that is not in T but directly connected to some node in T • Compute its minimum distance to any vertex in T • Select the vertex that is closest to T • Add it to T • (A code sketch follows below.)
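
A sketch of Prim’s algorithm in Python (a lazy, heap-based version; this is one common way to implement the idea, not necessarily the implementation analyzed later in the course). It is run on the example graph of the next slides, whose edge list is spelled out on slide 46:

```python
import heapq

def prim_total_weight(adj, start):
    """adj: {vertex: [(weight, neighbor), ...]} for a connected undirected graph."""
    in_tree = {start}
    heap = list(adj[start])              # candidate edges leaving the tree
    heapq.heapify(heap)
    total = 0
    while heap and len(in_tree) < len(adj):
        w, v = heapq.heappop(heap)       # lightest edge out of the tree seen so far
        if v in in_tree:
            continue                     # both endpoints already in the tree: skip
        in_tree.add(v)                   # add the closest outside vertex
        total += w
        for edge in adj[v]:
            heapq.heappush(heap, edge)
    return total

# Example graph of slides 35-42 (edge list from slide 46).
edges = [('c', 'd', 3), ('b', 'f', 5), ('b', 'a', 6), ('f', 'e', 7), ('b', 'd', 8),
         ('f', 'g', 9), ('d', 'e', 10), ('a', 'f', 12), ('b', 'c', 14), ('e', 'h', 15)]
adj = {v: [] for e in edges for v in e[:2]}
for u, v, w in edges:
    adj[u].append((w, v))
    adj[v].append((w, u))
print(prim_total_weight(adj, 'a'))       # 53, as on slide 42
```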

  35.–41. Example • Prim’s algorithm run step by step. [Figure: a graph on vertices a–h with weighted edges a-b 6, a-f 12, b-c 14, b-d 8, b-f 5, c-d 3, d-e 10, e-h 15, f-e 7, f-g 9; each slide highlights the tree after the next closest vertex is added.]

  42. Example • [Figure: the finished minimum spanning tree highlighted in the same graph.] Total weight = 3 + 8 + 6 + 5 + 7 + 9 + 15 = 53

  43. Kruskal’s algorithm • Basic idea: • Grow many small trees • Repeatedly find the two trees that are closest (i.e., connected by the lightest edge) and join them with that edge • Terminate when a single tree forms

  44. Claim • If edge (u, v) is the lightest among all edges, then (u, v) is in an MST • Proof by contradiction: • Suppose that (u, v) is not in any MST • Given an MST T, adding (u, v) creates a cycle • Remove another edge of the cycle to obtain a new tree T’ • w(T’) < w(T). Contradiction. • By the same argument, the second, third, …, lightest edges, if they do not create a cycle, must be in the MST [Figure: the cycle created by adding (u, v) to T.]

  45. Kruskal’s algorithm in words • Procedure: • Sort all edges into non-decreasing order of weight • Initially each node is in its own tree • For each edge in the sorted list: • If the edge connects two separate trees, then join the two trees together with that edge • (A code sketch follows below.)
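
A sketch of Kruskal’s algorithm in Python, using a simple union–find structure to test whether an edge connects two separate trees (an illustrative implementation, not the course’s):

```python
def kruskal_total_weight(vertices, edges):
    """edges: (weight, u, v) tuples for a connected undirected graph."""
    parent = {v: v for v in vertices}      # each vertex starts in its own tree

    def find(v):                           # walk up to the tree's representative
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving keeps trees shallow
            v = parent[v]
        return v

    total = 0
    for w, u, v in sorted(edges):          # non-decreasing weight order
        ru, rv = find(u), find(v)
        if ru != rv:                       # different trees: the edge joins them
            parent[ru] = rv
            total += w
    return total

# Same example graph; Kruskal also finds a spanning tree of total weight 53.
edges = [(3, 'c', 'd'), (5, 'b', 'f'), (6, 'b', 'a'), (7, 'f', 'e'), (8, 'b', 'd'),
         (9, 'f', 'g'), (10, 'd', 'e'), (12, 'a', 'f'), (14, 'b', 'c'), (15, 'e', 'h')]
print(kruskal_total_weight('abcdefgh', edges))   # 53
```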

  46.–50. Example • Kruskal’s algorithm run on the same graph, considering edges in sorted order: c-d: 3, b-f: 5, b-a: 6, f-e: 7, b-d: 8, f-g: 9, d-e: 10, a-f: 12, b-c: 14, e-h: 15 [Figure: each slide highlights the edges accepted so far; edges that would create a cycle are skipped.]
