Functional Design and Programming Lecture 4: Sorting
Literature (Pensum) • Paulson, chap. 3: • Sorting (3.18, 3.19)
Exercises • Paulson, chap. 3: • 3.38-3.42 (obligatory: 3.38, 3.41)
Overview • Sorting • Insertion sort • Merge sort • Quick sort • Algorithmic complexity analysis (time) • Asymptotic notation • Worst-case complexity
Sorting • Problem: Given a list l of n numbers, return a list containing the elements of l in min-sorted order (least element first). • Algorithmic approaches: • incremental (insertion sort): • process one element at a time • divide-and-conquer (mergesort, quicksort): • divide problem into smaller subproblems • conquer subproblems (solve them recursively) • combine solutions to subproblems
[Diagram: Insertion sort (isort) — split l = [5,8,4,2,1,4,9,7,8] into head x = 5 and tail xs; recursively sort xs to get [1,2,4,4,7,8,8,9], then insert x into (sort xs), giving [1,2,4,4,5,7,8,8,9].]
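A minimal sketch of insertion sort in Standard ML, following the diagram. The names insert and isort echo the slide; the exact code in Paulson, chap. 3, may differ in details.

fun insert (x, [])    = [x]
  | insert (x, y::ys) =
      if x <= y then x :: y :: ys      (* x belongs before y *)
      else y :: insert (x, ys);        (* keep y, insert x into the rest *)

fun isort []      = []
  | isort (x::xs) = insert (x, isort xs);   (* sort the tail, then insert the head *)

(* isort [5,8,4,2,1,4,9,7,8] evaluates to [1,2,4,4,5,7,8,8,9] *)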
[Diagram: Mergesort — split l = [5,8,4,2,1,4,9,7,8] into a left and a right half; recursively sort them to [2,4,5,8] and [1,4,7,8,9], then merge the two sorted halves into [1,2,4,4,5,7,8,8,9].]
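A sketch of mergesort in Standard ML along the lines of the diagram. The split uses List.take and List.drop from the Basis Library; the name msort and this particular way of splitting are choices made here, and Paulson's formulation may differ.

fun merge ([], ys) = ys
  | merge (xs, []) = xs
  | merge (x::xs, y::ys) =
      if x <= y then x :: merge (xs, y::ys)   (* smaller head goes first *)
      else y :: merge (x::xs, ys);

fun msort []  = []
  | msort [x] = [x]
  | msort l   =
      let val k = length l div 2              (* split point: half the list *)
      in  merge (msort (List.take (l, k)),
                 msort (List.drop (l, k)))
      end;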
[Diagram: Quicksort — take the head 5 of l = [5,8,4,2,1,4,9,7,8] as pivot; partition the remaining elements into those <= 5 ([4,2,1,4], without the pivot!) and those > 5 ([8,9,7,8]); sort each part recursively and append, with the pivot in the middle: [1,2,4,4] followed by 5 followed by [7,8,8,9] gives [1,2,4,4,5,7,8,8,9].]
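A sketch of quicksort in Standard ML matching the diagram: the head is the pivot, and the rest is partitioned into elements <= pivot and elements > pivot. Using List.filter and the name quick are choices made here for brevity; an explicit partition function works just as well.

fun quick []      = []
  | quick (p::xs) =
      let val small = List.filter (fn x => x <= p) xs   (* elements <= pivot *)
          val big   = List.filter (fn x => x >  p) xs   (* elements >  pivot *)
      in  quick small @ (p :: quick big)                 (* append around the pivot *)
      end;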
Performance • Programs require resources to execute • time (execution time) • space (memory) • Performance of a program depends on many factors: • algorithm and data structures used • compiler, host machine, implementation tuning, etc.
Algorithmic Complexity • Goal: • We would like to measure the performance of algorithms independently of implementation particulars • Measures: • time (execution time): time complexity • space (memory): space complexity
Asymptotic analysis • Performance depends on inputs: • larger inputs require more time and space • Asymptotic complexity: • measures scalability: how much more time and space do we need to process yet bigger inputs? • worst-case analysis: worst case over all inputs of a given size • average-case analysis: average case ... • best-case analysis: best possible case ...
Constant-time operations: Definition • Operations that take at most a fixed (“constant”) number of machine instructions to execute, independent of the size of the data they operate on.
Constant-time operations: Examples • Looking up the value of an identifier (including the definition of a function) • Binding a value to an identifier: val x = 5 • Executing a primitive operation on fixed-size data (bool, int, real, char, but not string, list) • Applying a constructor; e.g. nil or :: for lists. • Performing a pattern match against patterns defined in a function definition or case expression
Constant-time operations: Examples... • Passing an argument value to a function (independent of the size of the argument value!) • Returning the result from a function (independent of the size of the result value!)
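A few illustrative declarations (assumed, not from the slides); each step below is constant time, no matter how long the list ys may become later:

val x  = 5                     (* binding a value to an identifier *)
val xs = [8, 4, 2, 1]          (* a small, fixed-size list *)
val ys = x :: xs               (* applying the constructor :: *)
val first = case ys of         (* matching only the outermost pattern *)
              y :: _ => y
            | []     => 0;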
Asymptotic notation (“Big-Oh notation”) • Let f and g be functions from Nat to Nat (the nonnegative integers). • We write f(n) = O(g(n)): • if there exist positive constants c, n0 such that • f(n) <= c g(n) for all n >= n0. • We write f(n) = Θ(g(n)): • if there exist positive constants c1, c2, n0 such that • c1 g(n) <= f(n) <= c2 g(n) for all n >= n0.
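Worked example: f(n) = 3n + 7 is O(n). Choose c = 4 and n0 = 7; then 3n + 7 <= 3n + n = 4n for all n >= 7. It is also Θ(n), since 3n <= 3n + 7 for all n >= 1 gives the matching lower bound.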
[Diagram: Θ(n log n), labeled with log n and n]
[Diagram: Θ(n²), labeled with n ... n]