
Data Structures



Presentation Transcript


  1. Data Structures Part 4

  2. Graph • A graph G consists of two things. • A set V of elements, called nodes (or vertices) • A set E of edges/arcs • G = (V, E)

  3. Terminologies • Nodes: Nodes are the objects that are connected in a graph. • Arc: A connection between one node and another node. Each arc in a graph is specified by a pair of nodes. • Set of nodes = {A, B, C, D, E, F, G, H} • Set of arcs = {(A, B), (A, C), (A, D), (C, F), (E, G), (A, A)} • End Points: A and B are the end points of the arc (A, B).

  4. Terminologies (Contd.) • Adjacent Nodes/Neighbors: A and B are neighbors. If we are considering a node ‘A’, then all the nodes connected to ‘A’ are the neighbors of ‘A’. • Degree: The number of arcs connected to a node in an undirected graph is called the node’s degree, written deg(u). In the example, deg(A) = 3. • Path: A sequence of arcs starting at one node and terminating on another node. • Length: The number of arcs traversed in a path. • Closed Path: A path that begins and ends at the same node. • Simple Path: A sequence of arcs in which all the nodes are distinct.

  5. Terminologies (Contd.) • Cycle: A path of length 3 or more with the same starting and ending node, in which no other node is visited more than once. • Connected Graph: A graph is said to be connected if there is a path between any two of its nodes. • Complete Graph: A graph G is said to be complete if every node ‘u’ in G is adjacent to every other node ‘v’ in G. • A complete graph with ‘n’ nodes has n(n-1)/2 edges; for example, a complete graph with 5 nodes has 5·4/2 = 10 edges. • Tree Graph: A connected graph without any cycles is called a tree graph or a free tree; equivalently, a free tree is a connected, acyclic, undirected graph.

  6. Terminologies (Contd.) • Weighted Graph: A graph is said to be weighted if each edge ‘e’ in G is assigned a non-negative numerical value ‘w(e)’, called the weight or length of ‘e’. • Multi-graph: A graph that may contain multiple edges and/or loops. • Multiple Edges (parallel edges): Distinct edges are called multiple edges if they connect the same end points, for example e = [u, v] and e′ = [u, v]. • Loops: An edge is called a loop if it has identical end points, for example e = [u, u].

  7. Directed Graph/Digraph • If the arcs are directed (indicated with an arrow at one end of the arc), the graph is called a digraph. Each edge ‘e’ is identified with an ordered pair <u, v> of nodes in G. • Edge ‘e’ begins at node ‘u’ and ends at node ‘v’. • ‘u’ is the origin of ‘e’, and ‘v’ is the destination. • ‘u’ is a predecessor of ‘v’, and ‘v’ is a successor or neighbor of ‘u’. • ‘u’ is adjacent to ‘v’, and ‘v’ is adjacent to ‘u’. [Fig. 1 and Fig. 2]

  8. Directed Graph/Digraph (Contd.) • Incident: A node ‘u’ is incident to an edge ‘e’ if ‘u’ is one of the two nodes in the ordered pair that constitutes ‘e’. • Outdegree: The number of edges beginning at a node (e.g. in fig. 2, outdegree(D) = 2). • Indegree: The number of edges entering a node (e.g. in fig. 2, indegree(D) = 1). • Source: A node with positive outdegree and zero indegree. • Sink: A node with zero outdegree and positive indegree (e.g. node ‘C’ in fig. 2).

  9. Strongly Connected Directed Graph • A digraph is strongly connected if, for every pair of nodes ‘u’ and ‘v’, there is a path from ‘u’ to ‘v’ and also a path from ‘v’ to ‘u’. • The graph in fig. 2 is not strongly connected, as there is no path from C to any other node. • Unilaterally Connected Directed Graph: A digraph is unilaterally connected if, for every pair of nodes ‘u’ and ‘v’, there is a path from ‘u’ to ‘v’ or a path from ‘v’ to ‘u’. • Simple Directed Graph: A directed graph G is said to be simple if G has no parallel edges. A simple digraph may have loops, but cannot have more than one loop at a given node. • Rooted Tree: A rooted tree is a tree in which one node, called the root, is the starting point for reaching all other nodes.

  10. Representation of graphs in memory • There are two standard ways of representing graphs in memory. • Sequential representation by means of adjacency matrix • Linked representation by means of linked lists

  11. Adjacency Matrix • Let G be a graph with ‘n’ vertices, where n > 0. • Let V(G) = {v1, v2, ……, vn}. • The adjacency matrix A is a two-dimensional n×n matrix such that the (i, j)th entry of A is 1 if there is an edge from vi to vj, and zero otherwise. • Data: X, Y, Z, W

  A = | 0 0 0 1 |
      | 1 0 1 1 |
      | 1 0 0 1 |
      | 0 0 1 0 |

  • The number of 1’s is equal to the number of edges.
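
  To make the construction concrete, here is a minimal Python sketch (the function name adjacency_matrix and the sample edge list are my own, chosen so that they reproduce the matrix above under the row/column order X, Y, Z, W):

  ```python
  def adjacency_matrix(nodes, edges):
      """Build an n x n adjacency matrix for a directed graph.

      nodes: list of node labels, e.g. ['X', 'Y', 'Z', 'W']
      edges: list of (u, v) pairs, one per directed edge
      """
      index = {node: i for i, node in enumerate(nodes)}   # label -> row/column
      n = len(nodes)
      a = [[0] * n for _ in range(n)]                     # start with all zeros
      for u, v in edges:
          a[index[u]][index[v]] = 1                       # 1 marks an edge u -> v
      return a

  # One edge list that reproduces the matrix on the slide (rows X, Y, Z, W):
  A = adjacency_matrix(['X', 'Y', 'Z', 'W'],
                       [('X', 'W'), ('Y', 'X'), ('Y', 'Z'), ('Y', 'W'),
                        ('Z', 'X'), ('Z', 'W'), ('W', 'Z')])
  assert sum(map(sum, A)) == 7   # the number of 1's equals the number of edges
  ```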

  12. Linked Representation of a Graph • The adjacency matrix representation has several drawbacks. • Insertion and deletion are difficult, because the size of A may need to be changed and the nodes may need to be reordered. • The matrix may contain many zeros, so a great deal of space is wasted. • The linked representation contains two lists: a node list NODE and an adjacency list ADJ. • Node List: Each element in this list corresponds to a node in G and has three fields: • NODE: Info in the node • NEXT: Pointer to the next node in the list • ARC: Pointer to the first element in the adjacency list of the node • Node structure: [ NODE | NEXT | ARC ]

  13. Linked Representation of a Graph (Contd.) • The node list has a pointer variable START to the beginning of the list. • Example adjacency lists:

  Node | Adjacency List
  A    | B, C, D
  B    | C
  C    | —
  D    | C, E
  E    | C

  14. Linked Representation of a Graph (Contd.) • Adjacency List: Each element in this list corresponds to an edge of E. • Each element has a field DEST (the location of the edge’s destination node in the node list) and a field LINK (pointer to the next element in the same adjacency list); these fields are used by FINDEDGE below.
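
  For comparison with the pointer-based layout, here is a simplified Python sketch of the same graph (the one tabulated on slide 13) using a dictionary of neighbour lists in place of the NODE/NEXT/ARC and DEST/LINK fields; add_edge and remove_edge are illustrative helpers, not part of the slides:

  ```python
  # The graph tabulated on slide 13, as a dictionary of adjacency lists.
  adj = {
      'A': ['B', 'C', 'D'],
      'B': ['C'],
      'C': [],
      'D': ['C', 'E'],
      'E': ['C'],
  }

  def add_edge(adj, u, v):
      """Insert edge (u, v): append v to u's adjacency list."""
      adj.setdefault(u, []).append(v)
      adj.setdefault(v, [])            # make sure v also appears as a node

  def remove_edge(adj, u, v):
      """Delete edge (u, v) if it is present."""
      if v in adj.get(u, []):
          adj[u].remove(v)
  ```

  Insertion and deletion of edges here touch only the lists involved, which is the advantage over the adjacency matrix noted on slide 12.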

  15. Searching a Node FIND (INFO, LINK, START, ITEM, LOC) [Algo. 5.2] Finds the location LOC of the first node containing ITEM, or sets LOC = NULL. • Set PTR = START. • Repeat while PTR ≠ NULL: If ITEM = INFO[PTR], then: Set LOC = PTR and Return. Else: Set PTR = LINK[PTR]. [End of loop] • Set LOC = NULL and Return.
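
  A rough Python transcription of FIND, under the assumption that the linked list is stored in parallel arrays INFO and LINK, with None playing the role of NULL (the sample data at the bottom is illustrative only):

  ```python
  def find(info, link, start, item):
      """Return the location of the first node containing item, or None."""
      ptr = start
      while ptr is not None:
          if info[ptr] == item:
              return ptr               # found: LOC = PTR
          ptr = link[ptr]              # advance to the next node
      return None                      # not found: LOC = NULL

  # Illustrative storage: the list A -> B -> C starts at index 0.
  info = ['A', 'B', 'C']
  link = [1, 2, None]
  assert find(info, link, 0, 'C') == 2
  ```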

  16. Finding location of an edge FINDEDGE (NODE, NEXT, ADJ, START, DEST, LINK, A, B, LOC) This procedure finds the location LOC of an edge (A, B) in the graph G, or sets LOC = NULL. • Call FIND (NODE, NEXT, START, A, LOCA) • Call FIND (NODE, NEXT, START, B, LOCB) • If LOCA = NULL or LOCB = NULL, then: Set LOC = NULL. Else: Call FIND (DEST, LINK, ADJ[LOCA], LOCB, LOC) • Return.
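
  A matching sketch of FINDEDGE under the same parallel-array assumption; it reuses the find function from the previous sketch, so the two belong together:

  ```python
  def find_edge(node, nxt, adj, start, dest, link, a, b):
      """Return the location of edge (a, b) in G's adjacency lists, or None."""
      loc_a = find(node, nxt, start, a)        # locate node a in the node list
      loc_b = find(node, nxt, start, b)        # locate node b in the node list
      if loc_a is None or loc_b is None:
          return None
      # scan a's adjacency list for an element whose DEST is node b
      return find(dest, link, adj[loc_a], loc_b)
  ```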

  17. Traversing a Graph • Many graph algorithms require one to systematically examine the nodes and edges of a graph G. There are two standard ways of doing this. • One way is called a breadth-first search, and the other is called a depth-first search. • The breadth-first search will use a queue as an auxiliary structure to hold nodes for future processing; analogously, the depth-first search will use a stack. • During the execution of our algorithms, each node N of G will be in one of three states, called the status of N, as follows: • Status = 1: (Ready state) The initial state of the node N. • Status = 2: (Waiting state) The node N is on the queue or stack, waiting to be processed. • Status = 3: (Processed state) The node N has been processed.

  18. Breadth-First Search • Here, we first examine the starting node A. Then we examine all the neighbors of A. Then we examine all the neighbors of the neighbors of A, and so on. • We need to keep track of the neighbors of a node, and we need to guarantee that no node is processed more than once. • This is accomplished by using a queue to hold the nodes that are waiting to be processed, and by using a field STATUS which tells us the current status of any node.

  19. Breadth-First Search Algorithm • This algorithm executes a breadth-first search on a graph G beginning at a start node A. • 1. Initialize all nodes to the ready state (STATUS = 1). • 2. Put the starting node A in QUEUE and change its status to the waiting state (STATUS = 2). • 3. Repeat steps 4 and 5 until QUEUE is empty: • 4. Remove the front node N of QUEUE. Process N and change the status of N to the processed state (STATUS = 3). • 5. Add to the rear of QUEUE all the neighbors of N that are in the ready state (STATUS = 1), and change their status to the waiting state (STATUS = 2). [End of step 3 loop] • 6. Exit.
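
  A minimal Python rendering of the algorithm, assuming the graph is given as a dictionary adj that maps each node to the list of its neighbors (as in the earlier adjacency-list sketch):

  ```python
  from collections import deque

  READY, WAITING, PROCESSED = 1, 2, 3          # the three states from slide 17

  def bfs(adj, start):
      """Breadth-first search; returns nodes in the order they are processed."""
      status = {node: READY for node in adj}   # step 1: all nodes ready
      queue = deque([start])
      status[start] = WAITING                  # step 2: start node is waiting
      order = []
      while queue:                             # step 3
          n = queue.popleft()                  # step 4: remove the front node
          order.append(n)                      # "process" N
          status[n] = PROCESSED
          for m in adj[n]:                     # step 5: enqueue ready neighbors
              if status[m] == READY:
                  status[m] = WAITING
                  queue.append(m)
      return order
  ```

  With the adjacency lists from slide 13, bfs(adj, 'A') returns ['A', 'B', 'C', 'D', 'E'].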

  20. Depth-First Search • Here, we first examine the starting node A. Then we examine each node N along a path P which begins at A; that is, we process a neighbor of A, then a neighbor of a neighbor of A, and so on. After coming to a “dead end”, that is, to the end of the path P, we backtrack on P until we can continue along another path P′, and so on. • Here we use a stack instead of a queue. Again, a field STATUS is used to tell us the current status of a node.

  21. Depth-First Search (Algorithm) • This algorithm executes a depth-first search on a graph G beginning at a start node A. • 1. Initialize all nodes to the ready state (STATUS = 1). • 2. Push the starting node A onto STACK and change its status to the waiting state (STATUS = 2). • 3. Repeat steps 4 and 5 until STACK is empty: • 4. Pop the top node N of STACK. Process N and change the status of N to the processed state (STATUS = 3). • 5. Push onto STACK all the neighbors of N that are in the ready state (STATUS = 1), and change their status to the waiting state (STATUS = 2). [End of step 3 loop] • 6. Exit.
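
  The depth-first version only swaps the queue for a Python list used as a stack (same adjacency-dictionary assumption as the BFS sketch):

  ```python
  READY, WAITING, PROCESSED = 1, 2, 3          # same node states as before

  def dfs(adj, start):
      """Depth-first search with an explicit stack; returns the processing order."""
      status = {node: READY for node in adj}   # step 1: all nodes ready
      stack = [start]
      status[start] = WAITING                  # step 2: start node is waiting
      order = []
      while stack:                             # step 3
          n = stack.pop()                      # step 4: pop the top node
          order.append(n)                      # "process" N
          status[n] = PROCESSED
          for m in adj[n]:                     # step 5: push ready neighbors
              if status[m] == READY:
                  status[m] = WAITING
                  stack.append(m)
      return order
  ```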

  22. Example

  23. Sorting and Searching

  24. Insertion Sort • Suppose an array A with n elements A[1], A[2], ……., A[N] is in memory. The insertion sort algorithm scans A from A[1] to A[N], inserting each element A[K] into its proper position in the previously sorted subarray A[1], A[2], …., A[K-1]. That is: • Pass 1: A[1] by itself is trivially sorted. • Pass 2: A[2] is inserted either before or after A[1] so that: A[1], A[2] is sorted. • Pass 3: A[3] is inserted into its proper place in A[1], A[2], that is, before A[1], between A[1] and A[2], or after A[2], so that: A[1], A[2], A[3] is sorted. • Pass 4: A[4] is inserted into its proper place in A[1], A[2], A[3] so that: A[1], A[2], A[3], A[4] is sorted. • ………………………………………………………………………………………………………………. • Pass N: A[N] is inserted into its proper place in A[1], A[2], ….., A[N-1] so that: A[1], A[2], ……., A[N] is sorted. • This sorting algorithm is frequently used when n is small.

  25. Insertion Sort (Contd.)

  26. Insertion Sort (Contd.) INSERTION (A, N) This algorithm sorts the array A with N elements. • 1. Set A[0] = -∞. [Initializes sentinel element] • 2. Repeat steps 3 to 5 for K = 2, 3, ……., N. • 3. Set TEMP = A[K] and PTR = K – 1. • 4. Repeat while TEMP < A[PTR]: • Set A[PTR + 1] = A[PTR]. [Moves element forward] • Set PTR = PTR – 1. [End of loop] • 5. Set A[PTR + 1] = TEMP. [Inserts element in proper place] [End of step 2 loop] • 6. Return.
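
  A short Python version of INSERTION; because the inner loop can be bounds-checked directly, the -∞ sentinel A[0] is not needed (the sample data is the 8-element array from slide 28):

  ```python
  def insertion_sort(a):
      """Sort list a in place by inserting each element into the sorted prefix."""
      for k in range(1, len(a)):
          temp = a[k]                          # element to insert
          ptr = k - 1
          while ptr >= 0 and temp < a[ptr]:
              a[ptr + 1] = a[ptr]              # move element forward
              ptr -= 1
          a[ptr + 1] = temp                    # insert element in proper place
      return a

  assert insertion_sort([77, 33, 44, 11, 88, 22, 66, 55]) == \
         [11, 22, 33, 44, 55, 66, 77, 88]
  ```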

  27. Selection Sort • Suppose an array A with n elements A[1], A[2], ……, A[N] is in memory. The selection sort algorithm for sorting A works as follows. First find the smallest element in the list and put it in the first position. Then find the second smallest element in the list and put it in the second position, and so on. More precisely: • Pass 1: Find the location LOC of the smallest element in the list of N elements A[1], A[2], ……, A[N], and then interchange A[LOC] and A[1]. Then A[1] is sorted. • Pass 2: Find the location LOC of the smallest element in the sublist of N-1 elements A[2], A[3], ……, A[N], and then interchange A[LOC] and A[2]. Then A[1], A[2] is sorted, since A[1] ≤ A[2]. • ………………………………………………………………………………………………………………. • Pass N-1: Find the location LOC of the smaller of the elements A[N-1], A[N], and then interchange A[LOC] and A[N-1]. Then A[1], A[2], ….., A[N] is sorted, since A[N-1] ≤ A[N]. • Thus A is sorted after N-1 passes.

  28. Selection Sort (Contd.) • Suppose an array contains 8 elements as follows: 77, 33, 44, 11, 88, 22, 66, 55

  29. Selection Sort (Contd.) MIN (A, K, N, LOC) An array A is in memory. This procedure finds the location LOC of the smallest element among A[K], A[K+1], ……, A[N]. • Set MIN = A[K] and LOC = K. [Initialize pointers] • Repeat for J = K+1, K+2, ….., N: If MIN > A[J], then: Set MIN = A[J] and LOC = J. [End of loop] • Return.

  30. Selection Sort (Contd.) (Selection Sort) SELECTION (A, N) This algorithm sorts the array A with N elements. • Repeat steps 2 and 3 for K = 1, 2, …., N-1. • Call MIN (A, K, N, LOC). • [Interchange A[K] and A[LOC]] Set TEMP = A[K], A[K] = A[LOC] and A[LOC] = TEMP. [End of step 1 loop] • Exit
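
  Putting MIN and SELECTION together in Python (0-based indices; the sample data is the array from slide 28):

  ```python
  def find_min(a, k):
      """Return the index of the smallest element among a[k:], as MIN does."""
      loc = k
      for j in range(k + 1, len(a)):
          if a[j] < a[loc]:
              loc = j
      return loc

  def selection_sort(a):
      """Sort a in place: on pass k, swap the smallest remaining element into position k."""
      for k in range(len(a) - 1):
          loc = find_min(a, k)
          a[k], a[loc] = a[loc], a[k]          # interchange A[K] and A[LOC]
      return a

  assert selection_sort([77, 33, 44, 11, 88, 22, 66, 55]) == \
         [11, 22, 33, 44, 55, 66, 77, 88]
  ```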

  31. Merging • Suppose A is a sorted list with r elements and B is a sorted list with s elements. The operation that combines the elements of A and B into a single sorted list C with n = r + s elements is called merging.

  32. Merge Sort MERGE (A, p, q, r)
      n1 = q – p + 1
      n2 = r – q
      Create arrays L[1…n1 + 1] and R[1…n2 + 1]
      For i = 1 to n1 do
          L[i] = A[p + i – 1]
      For j = 1 to n2 do
          R[j] = A[q + j]
      L[n1 + 1] = ∞
      R[n2 + 1] = ∞
      i = 1
      j = 1
      For k = p to r do
          if L[i] ≤ R[j] then
              A[k] = L[i]
              i = i + 1
          else
              A[k] = R[j]
              j = j + 1

  33. Merge Sort (Contd.) MERGE-SORT (A, p, r)
      If p < r then
          q = ⌊(p + r)/2⌋
          MERGE-SORT (A, p, q)
          MERGE-SORT (A, q + 1, r)
          MERGE (A, p, q, r)
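
  A Python sketch of MERGE and MERGE-SORT; the pseudocode above is 1-based with inclusive bounds, while this version is 0-based but keeps the same structure, including the ∞ sentinels:

  ```python
  import math

  def merge(a, p, q, r):
      """Merge the sorted subarrays a[p..q] and a[q+1..r] (inclusive indices)."""
      left = a[p:q + 1] + [math.inf]           # copies with an infinity sentinel
      right = a[q + 1:r + 1] + [math.inf]
      i = j = 0
      for k in range(p, r + 1):
          if left[i] <= right[j]:
              a[k] = left[i]
              i += 1
          else:
              a[k] = right[j]
              j += 1

  def merge_sort(a, p, r):
      """Recursively sort a[p..r] (inclusive indices)."""
      if p < r:
          q = (p + r) // 2                     # split point, rounded down
          merge_sort(a, p, q)
          merge_sort(a, q + 1, r)
          merge(a, p, q, r)

  data = [38, 27, 43, 3, 9, 82, 10]
  merge_sort(data, 0, len(data) - 1)
  assert data == [3, 9, 10, 27, 38, 43, 82]
  ```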

  34. The End
