CSE 326: Data Structures Trees Lecture 7: Wednesday, Jan 23, 2003
Outline • Finish discussion on random binary search trees (BST) • AVL trees • Reading assignment for this week: Weiss 4.3, 4.4, 4.5, and 4.7
The Average Depth of a BST • Insert the elements 1 < 2 < ... < n in some order, starting with the empty tree • For each permutation σ: T_σ = the BST after inserting σ(1), σ(2), ..., σ(n) • The Average Depth: H(n) = (1/n!) Σ_σ height(T_σ)
The Average Depth of a BST • The average depth of a BST is H(n) = Θ(log n) • For some σ, height(T_σ) = O(log n) • For other σ, height(T_σ) = O(n) • But the average is O(log n) • Please read the proof in the book and/or slides!
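To make the averaging concrete, here is a small sketch (the class and helper names are my own, not from the lecture) that builds the BST for every insertion order of 1..n and reports the average height; even for small n the average is visibly closer to log n than to n.

    import java.util.*;

    public class AvgDepth {
        static class Node { int key; Node left, right; Node(int k) { key = k; } }

        static Node insert(Node root, int x) {
            if (root == null) return new Node(x);
            if (x < root.key) root.left = insert(root.left, x);
            else if (x > root.key) root.right = insert(root.right, x);
            return root;
        }

        static int height(Node t) {               // empty tree has height -1
            return t == null ? -1 : 1 + Math.max(height(t.left), height(t.right));
        }

        static long totalHeight = 0, trees = 0;

        // Enumerate every insertion order and accumulate the resulting tree heights.
        static void permute(List<Integer> remaining, List<Integer> order) {
            if (remaining.isEmpty()) {
                Node root = null;
                for (int x : order) root = insert(root, x);
                totalHeight += height(root);
                trees++;
                return;
            }
            for (int i = 0; i < remaining.size(); i++) {
                List<Integer> rest = new ArrayList<>(remaining);
                int x = rest.remove(i);
                order.add(x);
                permute(rest, order);
                order.remove(order.size() - 1);
            }
        }

        public static void main(String[] args) {
            int n = 8;                             // n! trees, so keep n small
            List<Integer> keys = new ArrayList<>();
            for (int i = 1; i <= n; i++) keys.add(i);
            permute(keys, new ArrayList<>());
            System.out.printf("n=%d  average height over all %d orders: %.3f%n",
                              n, trees, (double) totalHeight / trees);
        }
    }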
Random Input vs. Random Trees • Inputs 1,2,3 and 3,2,1 give the two chain-shaped trees; 1,3,2 and 3,1,2 give the two zig-zag trees; 2,1,3 and 2,3,1 both give the balanced tree with 2 at the root • For three items, the shallowest tree is twice as likely as any other – effect grows as n increases • For n = 4, probability of getting a shallow tree > 50%
Average cost • The average, amortized cost of n insert/find operations is O(log n) • But the average, amortized cost of n insert/find/delete operations can be as bad as Θ(√n) • Deletions make life harder (recall stretchy arrays) • Need guaranteed cost O(log n) – next
Beauty is Only Θ(log n) Deep • Binary Search Trees are fast if they're shallow, e.g., complete • Problems occur when one branch is much longer than the other • How to capture the notion of a "sort of" complete tree?
Balance • balance(t) = height(left subtree) − height(right subtree) • convention: height of a "null" subtree is −1 • zero everywhere ⇒ perfectly balanced • small everywhere ⇒ balanced enough: height is Θ(log n) • Precisely: maximum depth is 1.44 log n
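A minimal sketch of this definition (the helper names are mine): compute heights recursively with the height-of-null = −1 convention, and check that every node's balance stays in {−1, 0, 1}.

    // Balance as defined above: height(left) - height(right), with height(null) = -1.
    class Node { int key; Node left, right; }

    class BalanceCheck {
        static int height(Node t) {
            return t == null ? -1 : 1 + Math.max(height(t.left), height(t.right));
        }

        static int balance(Node t) {
            return height(t.left) - height(t.right);
        }

        // "Balanced enough": every node's balance is -1, 0, or +1.
        static boolean isBalanced(Node t) {
            return t == null
                || (Math.abs(balance(t)) <= 1 && isBalanced(t.left) && isBalanced(t.right));
        }
    }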
AVL Tree (Adelson-Velskii & Landis) • Binary search tree properties • Balance of every node satisfies −1 ≤ b ≤ 1 • Tree re-balances itself after every insert or delete • [Figure: example AVL tree with root 8 and keys 2–15] What is the balance of each node in this tree?
AVL Tree Data Structure • Each node stores its data, its height, and pointers to its two children • [Figure: AVL tree of height 3 with root 10 (children 5 and 15); 5 over 2 and 9, 15 over 12 and 20, 20 over 17 and 30; each node labelled with its height]
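A sketch of the node layout in this picture, with names of my own choosing: each node carries its data, a cached height (so balance checks are O(1)), and two child links.

    // Node layout from the slide: data, cached height, two children.
    class AvlNode {
        Comparable data;
        int height;                 // cached height; a new leaf has height 0
        AvlNode left, right;

        AvlNode(Comparable data) { this.data = data; this.height = 0; }

        // Height of a possibly-null node, using the "null subtree has height -1" convention.
        static int height(AvlNode t) { return t == null ? -1 : t.height; }
    }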
Not An AVL Tree • [Figure: the same tree with 18 added below 17; node 15 now has height 3 while its sibling 5 has height 1, so both 15 and the root have balance −2, violating the AVL property]
Bad Case #1 • Insert(small), Insert(middle), Insert(tall) • [Figure: the result is a chain S → M → T hanging to the right, with heights 2, 1, 0]
Single Rotation • [Figure: rotating M up makes it the new root with children S and T; the heights drop from 2, 1, 0 to 1, 0, 0] • Basic operation used in AVL trees: a right child could legally have its parent as its left child.
General Case: Insert Unbalances • [Figure: before the insert, a (height h+1) has left subtree X (height h−1) and right child b (height h); b has subtrees Y and Z (each height h−1). Inserting into Z raises Z to h, b to h+1, and a to h+2, breaking the balance at a. A single rotation makes b the root (height h+1) with left child a (height h, over X and Y) and right subtree Z.]
Properties of General Insert + Single Rotation • Restores balance to a lowest point in tree where imbalance occurs • After rotation, height of the subtree (in the example, h+1) is the same as it was before the insert that imbalanced it • Thus, no further rotations are needed anywhere in the tree!
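A sketch of the single rotation as code, assuming it lives in the same tree class as the AvlNode sketch above (the names follow Weiss's rotateWith*Child convention but are otherwise my own). After relinking, the two affected heights are recomputed, so the subtree returns to its pre-insert height.

    // Outside right-right case (e.g. S, M, T inserted in increasing order):
    // pull the right child k2 up; the old root k1 becomes its left child.
    static AvlNode rotateWithRightChild(AvlNode k1) {
        AvlNode k2 = k1.right;
        k1.right = k2.left;                   // k2's left subtree moves under k1
        k2.left = k1;
        k1.height = Math.max(AvlNode.height(k1.left), AvlNode.height(k1.right)) + 1;
        k2.height = Math.max(AvlNode.height(k2.right), k1.height) + 1;
        return k2;                            // new root of this subtree
    }

    // Mirror image for the outside left-left case.
    static AvlNode rotateWithLeftChild(AvlNode k2) {
        AvlNode k1 = k2.left;
        k2.left = k1.right;
        k1.right = k2;
        k2.height = Math.max(AvlNode.height(k2.left), AvlNode.height(k2.right)) + 1;
        k1.height = Math.max(AvlNode.height(k1.left), k2.height) + 1;
        return k1;
    }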
Bad Case #2 • Insert(small), Insert(tall), Insert(middle) • [Figure: S (height 2) has right child T (height 1), whose left child is M (height 0)] • Why won't a single rotation (bringing T up to the top) fix this?
Double Rotation • [Figure: first rotate M up past T (S still has height 2, now with right child M over T), then rotate M up past S – M ends at the root with children S and T, all heights 1, 0, 0]
General Double Rotation • [Figure: a (height h+3 after the insert) has right subtree Z (height h) and left child b (height h+2); b has left subtree W (height h) and right child c (height h+1), whose subtrees are X and Y. After the double rotation, c is the root (height h+2) with children b (over W and X) and a (over Y and Z).] • Initially: insert into X unbalances tree (root height goes to h+3) • "Zig zag" to pull up c – restores root height to h+2, left subtree height to h
Another Double Rotation Case • [Figure: same shape as before, but the insert goes into Y rather than X; the double rotation again makes c the root (height h+2) with children b (over W and X) and a (over Y and Z).] • Initially: insert into Y unbalances tree (root height goes to h+3) • "Zig zag" to pull up c – restores root height to h+2, left subtree height to h
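The "zig-zag" expressed in terms of the single rotations sketched earlier (again, the method names are illustrative): for a left-right imbalance, first rotate the left child with its right child to pull c up one level, then rotate the root to bring c to the top. The right-left case (Bad Case #2) is the mirror image.

    // Inside left-right case: two single rotations pull c up past b and then past a.
    static AvlNode doubleWithLeftChild(AvlNode a) {
        a.left = rotateWithRightChild(a.left);    // step 1: c rises above b
        return rotateWithLeftChild(a);            // step 2: c rises above a
    }

    // Mirror image for the inside right-left case (e.g. S, T, M from Bad Case #2).
    static AvlNode doubleWithRightChild(AvlNode a) {
        a.right = rotateWithLeftChild(a.right);
        return rotateWithRightChild(a);
    }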
Insert Algorithm • Find spot for value • Hang new node • Search back up looking for imbalance • If there is an imbalance: “outside”: Perform single rotation and exit “inside”: Perform double rotation and exit
AVL Insert Algorithm

    Node insert(Comparable x, Node root){
        if ( root == NULL ) return new Node(x);
        if (x == root.key) return root;             // no duplicates
        if (x < root.key){
            root.left = insert( x, root.left );
            if (root unbalanced) { rotate... }
        } else {                                    // x > root.key
            root.right = insert( x, root.right );
            if (root unbalanced) { rotate... }
        }
        // height(null) is -1, so use a helper rather than reading a null child's field
        root.height = max(height(root.left), height(root.right)) + 1;
        return root;
    }
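One way the "rotate..." placeholder in the left branch might be filled in, using the balance convention and the rotation helpers sketched earlier (this expansion is mine, written in the same pseudocode style as the slide); the right branch is symmetric.

    // After root.left = insert(x, root.left):
    if (height(root.left) - height(root.right) == 2) {   // left subtree became too tall
        if (x < root.left.key)
            root = rotateWithLeftChild(root);             // outside (left-left): single rotation
        else
            root = doubleWithLeftChild(root);             // inside (left-right): double rotation
    }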
Deletion (Really Easy Case) • Delete(17) • [Figure: AVL tree with root 10, children 5 and 15; 5 over 2 and 9, 2 over 3; 15 over 12 and 20, 20 over 17 and 30. 17 is a leaf, so removing it changes no heights.]
Deletion (Pretty Easy Case) • Delete(15) • [Figure: the same tree; 15 is an internal node with children 12 and 20]
Deletion (Pretty Easy Case cont.) • Delete(15) • [Figure: 15 is replaced by its successor 17; root 10 now has children 5 and 17, with 17 over 12 and 20, and 20 over 30. All balances remain legal, so no rotation is needed.]
Deletion (Hard Case #1) • Delete(12) • [Figure: the tree from the previous slide; removing the leaf 12 leaves node 17 with only its right child 20 (over 30), so 17's balance becomes −2]
Single Rotation on Deletion • [Figure: after deleting 12, a single rotation at 17 brings 20 up: 10's right child becomes 20, with children 17 and 30, and the heights along the path are updated] • What is different about deletion compared to insertion?
Deletion (Hard Case) • Delete(9) • [Figure: larger AVL tree with root 10 (height 4): left child 5 over 2 and 9, 2 over 3; right child 17 over 12 and 20, 12 over 11 and 15 (15 over 13), 20 over 18 and 30 (30 over 33). Removing the leaf 9 leaves node 5 with balance +2.]
Double Rotation on Deletion • Not finished! • [Figure: the double rotation at node 5 pulls 3 up, so 10's left child is now 3 with children 2 and 5 – but the root 10 now has balance −2 (left height 1, right height 3), so rebalancing must continue up the tree]
Deletion with Propagation • [Figure: the tree after the double rotation at node 5 – root 10 has left child 3 (height 1, over 2 and 5) and right child 17 (height 3), so the root itself is now out of balance] • What is different about this case? • We get to choose whether to single or double rotate!
Propagated Single Rotation • [Figure: a single rotation at the root brings 17 up: 17 becomes the root with children 10 and 20; 10 keeps 3 (over 2 and 5) and 12 (over 11 and 15, 15 over 13); 20 keeps 18 and 30 (over 33)]
Propagated Double Rotation • [Figure: a double rotation at the root brings 12 up instead: 12 becomes the root with children 10 and 17; 10 keeps 3 (over 2 and 5) and gains 11; 17 gets 15 (over 13) and 20 (over 18 and 30, 30 over 33)]
AVL Deletion Algorithm
Recursive: 1. If at node, delete it 2. Otherwise recurse to find it in the correct subtree 3. Correct heights a. If imbalance #1, single rotate b. If imbalance #2 (or don't care), double rotate
Iterative: 1. Search downward for node, stacking parent nodes 2. Delete node 3. Unwind stack, correcting heights a. If imbalance #1, single rotate b. If imbalance #2 (or don't care), double rotate
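A recursive sketch of the outline above, reusing the AvlNode layout and rotation helpers from the earlier sketches (helper names like rebalance and findMin are my own, not the course's reference code): delete as in an ordinary BST, replacing a two-child node by the minimum of its right subtree, then fix heights and rotate on the way back up. Unlike insertion, a rotation may be needed at every level; on the tie case this sketch follows Weiss and uses a single rotation, which is always safe there.

    static AvlNode delete(Comparable x, AvlNode root) {
        if (root == null) return null;                         // not found: nothing to do
        int cmp = x.compareTo(root.data);
        if (cmp < 0) {
            root.left = delete(x, root.left);
        } else if (cmp > 0) {
            root.right = delete(x, root.right);
        } else if (root.left != null && root.right != null) {  // two children:
            root.data = findMin(root.right).data;              // copy successor's data here,
            root.right = delete(root.data, root.right);        // then delete the successor
        } else {                                               // zero or one child
            root = (root.left != null) ? root.left : root.right;
        }
        return (root == null) ? null : rebalance(root);
    }

    static AvlNode findMin(AvlNode t) {
        while (t.left != null) t = t.left;
        return t;
    }

    // Fix the cached height, then apply the same single/double rotation cases as insertion.
    static AvlNode rebalance(AvlNode t) {
        t.height = Math.max(AvlNode.height(t.left), AvlNode.height(t.right)) + 1;
        int b = AvlNode.height(t.left) - AvlNode.height(t.right);
        if (b == 2) {                                          // left side too tall
            if (AvlNode.height(t.left.left) >= AvlNode.height(t.left.right))
                t = rotateWithLeftChild(t);                    // outside or tie: single rotation
            else
                t = doubleWithLeftChild(t);                    // inside: double rotation
        } else if (b == -2) {                                  // right side too tall (mirror)
            if (AvlNode.height(t.right.right) >= AvlNode.height(t.right.left))
                t = rotateWithRightChild(t);
            else
                t = doubleWithRightChild(t);
        }
        return t;
    }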
Pros and Cons of AVL Trees • Pro: • All operations guaranteed O(log N) • The height balancing adds no more than a constant factor to the speed of insertion • Con: • Space consumed by height field in each node • Slower than ordinary BST on random data • Can we guarantee O(log N) performance with less overhead? Splay trees next time