Binary Heap
• Note: this presentation only discusses maximum binary heaps
• [Figure: an example max-heap with keys 25, 20, 13, 18, 8, 12, 7, 11, 17, 6]
Binary Heap
• A binary heap is a binary tree with a key in each node such that:
• All leaves are on at most two adjacent levels.
• All leaves on the lowest level occur to the left, and all levels except the lowest one are completely filled.
• The key in the root is greater than or equal to the keys of its children, and the left and right subtrees are again binary heaps.
Binary Heap
• A heap will often be implemented using an array
• This implementation requires an additional field, the heap size, which is at most the array size
• It enables quick and easy access to a node's parent, and to its left and right sons
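To make this concrete, here is a minimal Python sketch of such a representation (the names MaxHeap, arr and heap_size are illustrative choices, not taken from the slides); slot 0 of the list is left unused so that the 1-based index arithmetic of the following slides carries over directly:

class MaxHeap:
    """Array-backed max-heap. arr[0] is unused, so the valid indices are
    1 .. heap_size; heap_size may be smaller than the allocated array."""
    def __init__(self, keys=()):
        self.arr = [None] + list(keys)      # keys are stored as given
        self.heap_size = len(self.arr) - 1

# The example heap from the previous slide (already in heap order):
h = MaxHeap([25, 20, 13, 18, 8, 12, 7, 11, 17, 6])
print(h.heap_size, h.arr[1])                # 10 25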
Properties of the Heap
• The largest value in the heap is always at the root
• The height of a heap with n elements is ⌊log₂ n⌋, i.e. O(log n)
• The keys along every path from the root to a leaf are in non-increasing order (from larger to smaller)
Direct Access to Parent and to Children
• If a node is at index i, then its parent is at index ⌊i / 2⌋
• If a node is at index i, then its left son is at index 2i
• If a node is at index i, then its right son is at index 2i + 1
Direct Access to Parent and to Children – Implementation

Parent (i)
  return i / 2

Left (i)
  return 2i

Right (i)
  return 2i + 1
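The same index arithmetic in Python, as a sketch under the 1-based convention above (integer division takes care of the floor):

def parent(i):
    return i // 2          # floor of i / 2

def left(i):
    return 2 * i

def right(i):
    return 2 * i + 1

# In the example heap, the node at index 2 (key 20) has its parent at
# index 1 and its sons at indices 4 and 5:
print(parent(2), left(2), right(2))          # 1 4 5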
Binary Heap
• [Figure: the example heap drawn as a tree with node indices 1 through 10, and the same heap stored in an array: positions 1 through 10 hold the keys 25, 20, 13, 18, 8, 12, 7, 11, 17, 6]
Perc-Down

Perc-Down (A, i)
  l ← Left(i)
  r ← Right(i)
  if l ≤ heap-size[A] and A[l] > A[i]
    then largest ← l
    else largest ← i
  if r ≤ heap-size[A] and A[r] > A[largest]
    then largest ← r
  if largest ≠ i
    then exchange A[i] ↔ A[largest]
         Perc-Down (A, largest)
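A Python rendering of the same procedure, written against the MaxHeap sketch introduced earlier (an illustrative sketch, not the slides' own code):

def perc_down(h, i):
    """Sift the key at index i down until the subtree rooted at i is a
    max-heap again; assumes both subtrees of i are already heaps."""
    l, r = 2 * i, 2 * i + 1
    largest = l if l <= h.heap_size and h.arr[l] > h.arr[i] else i
    if r <= h.heap_size and h.arr[r] > h.arr[largest]:
        largest = r
    if largest != i:
        h.arr[i], h.arr[largest] = h.arr[largest], h.arr[i]
        perc_down(h, largest)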
Perc-Down (A, 1)
• [Figure: a tree whose root holds 8 while the rest is a heap (array 8, 25, 13, 18, 20, 12, 7, 11, 17, 6); Perc-Down (A, 1) percolates the 8 down to restore the heap property]
The Heap Property
• The ancestor relation defines a partial order on the heap elements:
• Reflexive: x is an ancestor of itself
• Anti-symmetric: if x is an ancestor of y and y is an ancestor of x, then x = y
• Transitive: if x is an ancestor of y and y is an ancestor of z, then x is an ancestor of z
The Heap Property
• The partial order defined by the heap structure is weaker than a total order, therefore:
• A heap is easier to build and maintain than a sorted data structure
• A heap is less powerful than a sorted data structure
Constructing a Heap
• Heaps can be constructed incrementally, by inserting new elements into the left-most open spot in the array
• If the new element is greater than its parent, swap their positions and recur (percolate up)
Constructing a Heap
• Correctness
• Since at each step we replace the root of a subtree by a larger value, we preserve the heap property
• Complexity
• Doing n such insertions takes O(n log n) time, since each of the last n/2 insertions requires O(log n) time
Constructing a Heap
• Another way to construct a heap (assuming that all items are known in advance) is to use Build-Heap:

Build-Heap (A)
  heap-size[A] ← length[A]
  for i ← ⌊length[A] / 2⌋ down to 1
    do Perc-Down (A, i)
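In Python, building on the perc_down sketch above (illustrative, not the slides' code):

def build_heap(h):
    """Turn an arbitrary array into a max-heap by running perc_down on
    every internal node, from the last internal node up to the root."""
    for i in range(h.heap_size // 2, 0, -1):
        perc_down(h, i)

# The example worked out on the next slides:
h = MaxHeap([4, 1, 3, 2, 16, 9, 10, 14, 8, 7])
build_heap(h)
print(h.arr[1:])                             # [16, 14, 10, 8, 7, 9, 3, 2, 4, 1]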
Build-Heap {4, 1, 3, 2, 16, 9, 10, 14, 8, 7}
• [Figures: the array after each call to Perc-Down in the loop of Build-Heap:
  i = 5: 4, 1, 3, 2, 16, 9, 10, 14, 8, 7 (no change)
  i = 4: 4, 1, 3, 14, 16, 9, 10, 2, 8, 7
  i = 3: 4, 1, 10, 14, 16, 9, 3, 2, 8, 7
  i = 2: 4, 16, 10, 14, 7, 9, 3, 2, 8, 1
  i = 1: 16, 14, 10, 8, 7, 9, 3, 2, 4, 1]
Constructing a Heap
• The Build-Heap algorithm can be seen as a merge algorithm: whenever Perc-Down is called for a node, both of its subtrees are already heaps
• A rough analysis of Build-Heap gives a running time of O(n log n)
• However, a careful analysis shows that it is in fact a linear-time algorithm
Build-Heap Analysis
• A heap of size n has at most ⌈n / 2^(h+1)⌉ nodes at height h
• So there are about n/2 nodes which are leaves (height 0), about n/4 nodes of height 1, and finally a single node of height ⌊log n⌋ (the root)
Build-Heap Analysis
• Perc-Down runs faster for lower nodes: at height h, it runs in O(h) time
• Since most nodes are low, most runs of Perc-Down are fast
• The total running time of Build-Heap is therefore:
  Σ(h = 0 .. ⌊log n⌋) ⌈n / 2^(h+1)⌉ · O(h) = O(n · Σ(h = 0 .. ⌊log n⌋) h / 2^h)
Build-Heap Analysis
• From the formula Σ(h = 0 .. ∞) h·x^h = x / (1 - x)², we assign x = ½ and get Σ(h = 0 .. ∞) h / 2^h = 2
• And finally: the total running time of Build-Heap is O(n · 2) = O(n)
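As a quick numerical sanity check (not from the slides; the comparison against 2n just reflects the small constant hidden in the O(n)), the sum above can be evaluated directly:

import math

def build_heap_work_bound(n):
    """Evaluate sum over h of ceil(n / 2**(h+1)) * h, an upper bound on the
    number of swaps Build-Heap performs on a heap of n elements."""
    max_height = int(math.log2(n))
    return sum(math.ceil(n / 2 ** (h + 1)) * h for h in range(max_height + 1))

for n in (10, 1000, 10**6):
    print(n, build_heap_work_bound(n), build_heap_work_bound(n) <= 2 * n)
# Each value stays below 2n, in line with the O(n) result.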
Priority Queue
• A priority queue is a data structure that supports the following operations:
• Insert (S, x) – insert x into set S
• Find-Max (S) – return the largest key in S
• Delete-Max (S) – return the largest key in S, and remove it
• These operations can be supported naturally using a binary heap
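In Python these operations are available out of the box: the standard heapq module maintains a binary min-heap in a list, and a max-priority queue can be obtained by the common trick of negating the keys (a sketch; the class name is an illustrative choice):

import heapq

class MaxPriorityQueue:
    """Max-priority queue built on heapq (a min-heap) by storing negated keys."""
    def __init__(self):
        self._data = []

    def insert(self, x):
        heapq.heappush(self._data, -x)

    def find_max(self):
        return -self._data[0]

    def delete_max(self):
        return -heapq.heappop(self._data)

q = MaxPriorityQueue()
for x in (25, 20, 13, 18):
    q.insert(x)
print(q.find_max(), q.delete_max(), q.find_max())   # 25 25 20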
Find-Max
• The largest element in the heap is at the root; therefore this operation simply returns A[1]
Delete-Max
• Replace the root with the last leaf, decrease the heap size by 1, and Perc-Down from the root:

Delete-Max (A)
  max ← A[1]
  A[1] ← A[heap-size[A]]
  heap-size[A] ← heap-size[A] – 1
  Perc-Down (A, 1)
  return max
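The corresponding Python sketch, reusing perc_down from before (illustrative):

def delete_max(h):
    """Remove and return the largest key: move the last leaf to the root,
    shrink the heap by one, and restore the heap property from the root."""
    maximum = h.arr[1]
    h.arr[1] = h.arr[h.heap_size]
    h.heap_size -= 1
    perc_down(h, 1)
    return maximum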
Insert (A, key)

Insert (A, key)
  heap-size[A] ← heap-size[A] + 1
  i ← heap-size[A]
  while i > 1 and A[Parent(i)] < key
    do A[i] ← A[Parent(i)]
       i ← Parent(i)
  A[i] ← key
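A Python counterpart, again against the earlier MaxHeap sketch (illustrative); it grows the underlying list when needed and percolates the new key up:

def insert(h, key):
    """Place key in the next free slot and percolate it up to its position."""
    h.heap_size += 1
    if h.heap_size >= len(h.arr):
        h.arr.append(None)                  # grow the underlying list
    i = h.heap_size
    while i > 1 and h.arr[i // 2] < key:
        h.arr[i] = h.arr[i // 2]            # pull the smaller parent down
        i //= 2
    h.arr[i] = key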
Question 1
• Insert the element 101 into the heap {100, 15, 99, 7, 3, 98, 1, 4, 5, 1, 2, 70}
• Delete the maximum element from the resulting heap
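With the sketches above, the exercise can be replayed mechanically (the commented arrays were obtained by tracing the insert and delete-max rules on the given heap):

h = MaxHeap([100, 15, 99, 7, 3, 98, 1, 4, 5, 1, 2, 70])
insert(h, 101)
print(h.arr[1:h.heap_size + 1])
# [101, 15, 100, 7, 3, 99, 1, 4, 5, 1, 2, 70, 98]
print(delete_max(h))
# 101
print(h.arr[1:h.heap_size + 1])
# [100, 15, 99, 7, 3, 98, 1, 4, 5, 1, 2, 70]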
Question 2
• Given an array-based binary heap of size n, suggest an algorithm for each of the following operations:
• Finding the second largest element
• Finding the third largest element
• What is the complexity of each of these algorithms?
Question 2 – Finding the Second Largest Element
• The second largest element in the heap must be one of the two sons of the root
• If it were located any deeper, the heap property of one of the subtrees would be violated
• It can therefore be found in constant time
• The algorithm: return max (A[2], A[3])
Question 2 – Finding the Third Largest Element
• The third largest element is either the smaller son of the root, or one of the sons of the greater son:

  if A[2] > A[3]
    then return max (A[3], A[4], A[5])
    else return max (A[2], A[6], A[7])

• A simpler solution: traverse the first 7 elements, and return the third largest among them
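The same answers in Python (a sketch over the 1-based layout; it assumes distinct keys and at least 7 elements so that the fixed indices exist):

def second_largest(h):
    # The second largest key must be one of the root's two sons.
    return max(h.arr[2], h.arr[3])

def third_largest(h):
    # Either the smaller son of the root, or a son of the greater son.
    if h.arr[2] > h.arr[3]:
        return max(h.arr[3], h.arr[4], h.arr[5])
    return max(h.arr[2], h.arr[6], h.arr[7])

def third_largest_simple(h):
    # Simpler variant: the third largest lies among the first 7 slots.
    return sorted(h.arr[1:8], reverse=True)[2]

h = MaxHeap([25, 20, 13, 18, 8, 12, 7, 11, 17, 6])
print(second_largest(h), third_largest(h), third_largest_simple(h))   # 20 18 18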
Question 2 – Complexity
• In both cases, the number of operations performed by the algorithm is constant
• Therefore: O(1)
Question 3 – From Exam
• An array-based binary heap of size n is given
• The key of the root is log n
• The key of each node is smaller than its parent's key by 1
• What is the sum of all keys in the tree?
• O(log n), O(n), O(n log n), O(n²)?
Question 3 – From Exam
• [Figure: the tree, with key log n at the root, (log n) - 1 at its two children, (log n) - 2 at the next level, and so on down to the leaves]
Question 3 – From Exam
• The key of each node is (up to a constant) the cost of running Perc-Down on that node during Build-Heap
• As we have seen, the total running time of Build-Heap is O(n)
• Therefore the sum of all keys in the tree is also O(n)
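A small numeric check of this claim (illustrative, not from the slides): build the described key assignment for the array layout of a heap of n nodes and sum the keys.

def sum_of_keys(n):
    """The root (index 1) holds floor(log2 n); a node at depth d holds
    floor(log2 n) - d. In the array layout, node i sits at depth
    i.bit_length() - 1."""
    log_n = n.bit_length() - 1
    return sum(log_n - (i.bit_length() - 1) for i in range(1, n + 1))

for n in (10, 1000, 10**6):
    print(n, sum_of_keys(n), sum_of_keys(n) / n)
# The ratio stays close to 1 for each n, so the sum of all keys is O(n).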