Chapter 7 (Part 2) Sorting Algorithms Merge Sort
Merge Sort • Basic Idea • Example • Analysis http://www.geocities.com/SiliconValley/Program/2864/File/Merge1/mergesort.html
Idea Two sorted arrays can be merged in linear time with N comparisons only. Given an array to be sorted, consider separately its left half and its right half, sort them and then merge them.
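To illustrate the linear-time merge of two sorted arrays, here is a small self-contained sketch (not from the slides; the class and method names are illustrative):

public class MergeTwoSorted {
    // merge two already-sorted arrays with a single left-to-right pass
    static int[] merge(int[] left, int[] right) {
        int[] result = new int[left.length + right.length];
        int i = 0, j = 0, k = 0;
        while (i < left.length && j < right.length)
            result[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];  // one comparison per element placed
        while (i < left.length) result[k++] = left[i++];    // copy any leftovers from the left array
        while (j < right.length) result[k++] = right[j++];  // copy any leftovers from the right array
        return result;
    }
    public static void main(String[] args) {
        // prints [1, 2, 3, 4, 7, 9]
        System.out.println(java.util.Arrays.toString(
            merge(new int[]{1, 4, 7}, new int[]{2, 3, 9})));
    }
}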
Characteristics • Recursive algorithm. • Runs in O(NlogN) worst-case running time. Where is the recursion? • Each half is an array that can be sorted using the same algorithm - divide into two, sort separately the left and the right halves, and then merge them.
Merge Sort Code

void merge_sort(int[] a, int left, int right) {
    if (left < right) {
        int center = (left + right) / 2;
        merge_sort(a, left, center);          // sort the left half
        merge_sort(a, center + 1, right);     // sort the right half
        merge(a, left, center + 1, right);    // merge the two sorted halves
    }
}
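The merge helper called above is not shown on the slide. A minimal sketch, assuming the signature merge(a, left, rightStart, rightEnd) implied by the call, where a[left..rightStart-1] and a[rightStart..rightEnd] are already sorted and a temporary array holds the merged run:

void merge(int[] a, int left, int rightStart, int rightEnd) {
    int[] tmp = new int[rightEnd - left + 1];
    int i = left, j = rightStart, k = 0;
    while (i < rightStart && j <= rightEnd)         // one comparison per element placed
        tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    while (i < rightStart) tmp[k++] = a[i++];       // copy what remains of the left half
    while (j <= rightEnd) tmp[k++] = a[j++];        // copy what remains of the right half
    System.arraycopy(tmp, 0, a, left, tmp.length);  // copy the merged run back into a
}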
Analysis of Merge Sort
Assumption: N is a power of two.
For N = 1 the time is constant (denoted by 1).
Otherwise: the time to mergesort N elements = twice the time to mergesort N/2 elements + the time to merge two sorted arrays of N/2 elements each.
Recurrence Relation
The time to merge two arrays of N/2 elements each is linear, i.e. O(N).
Thus we have:
(a) T(1) = 1
(b) T(N) = 2T(N/2) + N
Solving the Recurrence Relation
T(N) = 2T(N/2) + N
Divide by N:
(1) T(N) / N = T(N/2) / (N/2) + 1
Telescoping: N is a power of two, so we can write
(2) T(N/2) / (N/2) = T(N/4) / (N/4) + 1
(3) T(N/4) / (N/4) = T(N/8) / (N/8) + 1
…
T(2) / 2 = T(1) / 1 + 1
Adding the Equations
The sum of the left-hand sides will be equal to the sum of the right-hand sides:
T(N) / N + T(N/2) / (N/2) + T(N/4) / (N/4) + … + T(2) / 2 =
T(N/2) / (N/2) + T(N/4) / (N/4) + … + T(2) / 2 + T(1) / 1 + logN
(logN is the sum of the 1's in the right-hand sides)
Crossing Equal Terms, Final Formula
After crossing out the equal terms, we get:
T(N) / N = T(1) / 1 + logN
T(1) is 1, hence we obtain:
T(N) = N + NlogN = Θ(NlogN)
Hence the complexity of the Merge Sort algorithm is Θ(NlogN).
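As a quick sanity check (not part of the slides; the class name is illustrative), the closed form can be compared with the recurrence for small powers of two:

public class MergeSortRecurrenceCheck {
    // T(1) = 1, T(N) = 2T(N/2) + N
    static long t(long n) {
        return (n == 1) ? 1 : 2 * t(n / 2) + n;
    }
    public static void main(String[] args) {
        for (long n = 1; n <= 1024; n *= 2) {
            long logN = Long.numberOfTrailingZeros(n);   // log2(n) for a power of two
            System.out.println("N = " + n + ": recurrence gives " + t(n)
                    + ", N + N*logN gives " + (n + n * logN));
        }
    }
}

Both columns agree, e.g. T(8) = 2·T(4) + 8 = 32 = 8 + 8·3.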
Quick Sort • Basic Idea • Code • Analysis • Advantages and Disadvantages • Applications • Comparison with Heap sort and Merge sort http://math.hws.edu/TMCM/java/xSortLab/
Basic Idea
• Pick one element in the array, which will be the pivot.
• Make one pass through the array, called a partition step, re-arranging the entries so that:
• entries smaller than the pivot are to the left of the pivot.
• entries larger than the pivot are to the right of the pivot.
Basic Idea
• Recursively apply quicksort to the part of the array that is to the left of the pivot, and to the part on its right.
• No merge step is needed: at the end all the elements are in the proper order.
Choosing the Pivot
Some fixed element: e.g. the first, the last, or the one in the middle. This is a bad choice: the pivot may turn out to be the smallest or the largest element (e.g. the first element of an already sorted array), and then one of the partitions will be empty.
Randomly chosen (by a random number generator): still a bad choice.
Choosing the Pivot
The median of the array (if the array has N numbers, the median is the ⌈N/2⌉-th largest number). This is difficult to compute and increases the complexity.
Choosing the Pivot The median-of-three choice: take the first, the last and the middle element. Choose the median of these three elements.
Find the Pivot – Java Code

int median3(int[] a, int left, int right) {
    int center = (left + right) / 2;
    // order a[left], a[center], a[right]
    if (a[left] > a[center]) swap(a, left, center);
    if (a[center] > a[right]) swap(a, center, right);
    if (a[left] > a[center]) swap(a, left, center);
    // hide the pivot just before the right end
    swap(a, center, right - 1);
    return a[right - 1];
}
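The swap helper used above is not shown on the slides; a minimal sketch of such a helper (an assumption) that exchanges two positions in the array:

void swap(int[] a, int i, int j) {
    int tmp = a[i];
    a[i] = a[j];
    a[j] = tmp;
}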
Quick Sort – Java Code

if (left + 10 <= right) {
    // do quick sort
}
else
    insertionSort(a, left, right);
Quick Sort – Java Code

{
    int i = left, j = right - 1;
    for ( ; ; ) {
        while (a[++i] < pivot) { }
        while (pivot < a[--j]) { }
        if (i < j) swap(a, i, j);
        else break;
    }
    swap(a, i, right - 1);          // restore the pivot
    quickSort(a, left, i - 1);
    quickSort(a, i + 1, right);
}
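Putting the fragments together, a sketch of the complete method (assuming the median3 and swap helpers above, an insertionSort for small sub-arrays, and the cutoff of 10 from the previous slide):

void quickSort(int[] a, int left, int right) {
    if (left + 10 <= right) {
        int pivot = median3(a, left, right);    // also places the pivot at a[right - 1]
        int i = left, j = right - 1;
        for ( ; ; ) {
            while (a[++i] < pivot) { }          // find an element >= pivot on the left
            while (pivot < a[--j]) { }          // find an element <= pivot on the right
            if (i < j) swap(a, i, j);
            else break;
        }
        swap(a, i, right - 1);                  // restore the pivot to its final position
        quickSort(a, left, i - 1);              // sort the small elements
        quickSort(a, i + 1, right);             // sort the large elements
    } else {
        insertionSort(a, left, right);          // small sub-array: insertion sort is faster
    }
}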
Implementation Notes
Compare the two versions:

A.
while (a[++i] < pivot) { }
while (pivot < a[--j]) { }
if (i < j) swap(a, i, j);
else break;

B.
while (a[i] < pivot) { i++; }
while (pivot < a[j]) { j--; }
if (i < j) swap(a, i, j);
else break;
Implementation Notes
If we have an array of equal elements, the second version (B) will never increment i or decrement j, and will keep swapping the same pair forever: i and j will never cross. Version (A) always advances i and j past equal elements, so the indices cross near the middle and the partitions stay balanced.
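A small sketch (not from the slides; names are illustrative) that runs version A's loops on an array of equal elements, showing that i and j keep advancing and cross near the middle:

public class EqualKeysPartitionDemo {
    public static void main(String[] args) {
        int[] a = {5, 5, 5, 5, 5, 5, 5, 5};
        int pivot = 5;
        int i = -1, j = a.length;      // so that ++i and --j start at the ends
        int swaps = 0;
        for ( ; ; ) {
            while (a[++i] < pivot) { } // stops at once: a[i] is never smaller than pivot
            while (pivot < a[--j]) { } // stops at once: a[j] is never larger than pivot
            if (i < j) {
                int t = a[i]; a[i] = a[j]; a[j] = t;
                swaps++;
            } else break;
        }
        // prints i = 4, j = 3, swaps = 4: the indices cross near the middle,
        // so the two partitions have roughly equal size
        System.out.println("i = " + i + ", j = " + j + ", swaps = " + swaps);
    }
}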
Complexity of Quick Sort
Average case: O(NlogN)
Worst case: O(N²)
The worst case happens when the pivot is the smallest (or the largest) element. Then one of the partitions is empty, and we repeat the procedure recursively for N-1 elements.
Complexity of Quick Sort
Best case: O(NlogN)
The pivot is the median of the array, so the left and the right parts have the same size. There are logN levels of partitioning, and at each level we do N comparisons (and not more than N/2 swaps). Hence the complexity is O(NlogN).
Analysis
• T(N) = T(i) + T(N - i - 1) + cN
• The time to sort the file is equal to
• the time to sort the left partition with i elements, plus
• the time to sort the right partition with N - i - 1 elements, plus
• the time to build the partitions.
Worst-Case Analysis
The pivot is the smallest (or the largest) element.
T(N) = T(N-1) + cN, N > 1
Telescoping:
T(N-1) = T(N-2) + c(N-1)
T(N-2) = T(N-3) + c(N-2)
T(N-3) = T(N-4) + c(N-3)
…
T(2) = T(1) + c·2
Worst-Case Analysis
Adding the equations, the sum of the left-hand sides equals the sum of the right-hand sides:
T(N) + T(N-1) + T(N-2) + … + T(2) =
T(N-1) + T(N-2) + … + T(2) + T(1) + cN + c(N-1) + c(N-2) + … + c·2
After crossing out the equal terms:
T(N) = T(1) + c(2 + 3 + … + N) = T(1) + c(N(N+1)/2 - 1) = O(N²)
Best-Case Analysis
The pivot is in the middle:
T(N) = 2T(N/2) + cN
Divide by N:
T(N) / N = T(N/2) / (N/2) + c
Best-Case Analysis
Telescoping:
T(N) / N = T(N/2) / (N/2) + c
T(N/2) / (N/2) = T(N/4) / (N/4) + c
T(N/4) / (N/4) = T(N/8) / (N/8) + c
…
T(2) / 2 = T(1) / 1 + c
Best-Case Analysis
Add all equations:
T(N) / N + T(N/2) / (N/2) + T(N/4) / (N/4) + … + T(2) / 2 =
T(N/2) / (N/2) + T(N/4) / (N/4) + … + T(1) / 1 + c·logN
After crossing out the equal terms:
T(N) / N = T(1) / 1 + c·logN
T(N) = N + cNlogN = O(NlogN)
Advantages and Disadvantages
• Advantages:
• One of the fastest algorithms on average
• Does not need additional memory (the sorting takes place in the array - this is called in-place processing)
• Disadvantages:
• The worst-case complexity is O(N²)
Applications
Commercial applications
• QuickSort generally runs fast
• No additional memory is needed
• The above advantages compensate for the rare occasions when it runs in O(N²) time
Applications
Warning: Never use in applications which require a guarantee of response time:
• Life-critical (medical monitoring, life support in aircraft, spacecraft)
• Mission-critical (industrial monitoring and control in handling dangerous materials, control for aircraft, defense, etc.)
unless you assume the worst-case response time.
Comparison with Heap Sort
• Both have O(NlogN) complexity
• Quick sort usually runs faster (it does not have to build and maintain a heap)
• But the speed of quick sort is not guaranteed
Comparison with Merge Sort
• Merge sort guarantees O(NlogN) time
• Merge sort requires additional memory of size N
• Usually merge sort is not used for main-memory sorting, only for external-memory sorting