Thought for the Day “Years wrinkle the skin, but to give up enthusiasm wrinkles the soul.” – Douglas MacArthur
Reminder • Test tomorrow • 21 April • 7:45am • Geology C11 (i.e. here!) • Covering: • Up to section 8.2, p. 149
Dominant term Big-O Notation • Or order notation • Assume we can find a function giving the number of operations: f(n) = 3.5n² + 120n + 45 • The dominant term gives order n², or more simply: O(n²)
Order Notation • Formal Definition We say that an algorithm is O(f(n)), or that the algorithm is of the order of f(n), if there exist positive constants c and n₀ such that the time t required to execute the algorithm satisfies t ≤ c·f(n) for all n > n₀
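As a sketch of how the definition applies, the code below checks the earlier example function f(n) = 3.5n² + 120n + 45 against the bound t ≤ c·f(n) with f(n) = n². The witnesses c = 4 and n₀ = 240 are one possible choice of constants (my own, not from the slides); any larger c or n₀ would also work.

```java
// Sketch: checking the Big-O definition for t(n) = 3.5n^2 + 120n + 45.
// The constants c = 4 and n0 = 240 are assumed witnesses, not unique ones.
public class BigOCheck {
    // Operation count from the slides' example function
    static double t(int n) {
        return 3.5 * n * n + 120.0 * n + 45.0;
    }

    // Does t(n) <= c * f(n) hold, with f(n) = n^2 ?
    static boolean bounded(int n, double c) {
        return t(n) <= c * (double) n * n;
    }

    public static void main(String[] args) {
        // The bound holds for every n > n0 = 240, so t(n) is O(n^2).
        for (int n = 241; n <= 100_000; n++) {
            assert bounded(n, 4.0) : "bound fails at n = " + n;
        }
    }
}
```

Note that the bound fails for small n (e.g. n = 100), which is exactly why the definition only demands it for n > n₀.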
Ranking of Common Functions • 1 – Constant Complexity • Unlikely • log n – Logarithmic Complexity • Very good • n – Linear Complexity • Directly related to n • n log n • Quite common, not bad
Ranking (cont.) • n² – Quadratic Complexity • Getting impractical for large n • nᵐ – Polynomial Complexity • Hierarchy of their own • 2ⁿ – Exponential Complexity • Useless except for very small n • n! – Factorial Complexity • Even worse! (e.g. Cramer’s Rule)
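To make the ranking concrete, the sketch below evaluates each growth function at a single input size. The choice n = 20 is an arbitrary illustrative value, and the class and method names are my own.

```java
// Sketch: evaluating each complexity function at n = 20 to show the
// ranking log n < n < n log n < n^2 < 2^n < n!  (n = 20 is arbitrary).
public class GrowthRanking {
    static double logN(int n)   { return Math.log(n) / Math.log(2); } // log base 2
    static double nLogN(int n)  { return n * logN(n); }
    static double square(int n) { return (double) n * n; }
    static double expo(int n)   { return Math.pow(2, n); }
    static double fact(int n)   {               // n! as a double
        double f = 1;
        for (int i = 2; i <= n; i++) f *= i;
        return f;
    }

    public static void main(String[] args) {
        int n = 20;
        System.out.printf("log n=%.1f  n=%d  n log n=%.1f  n^2=%.0f  2^n=%.0f  n!=%.2e%n",
            logN(n), n, nLogN(n), square(n), expo(n), fact(n));
    }
}
```

Even at n = 20 the gap between n² (400) and 2ⁿ (about a million) is dramatic, and n! is larger still by twelve orders of magnitude.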
A Perspective • Assume computers get 1 000 000 times faster/more powerful • i.e. roughly 30 years from now! • How much bigger can we make the dataset? • O(n): 1 000 000 times bigger • O(n²): 1 000 times bigger • O(2ⁿ): only about 20 elements bigger (since 2²⁰ ≈ 1 000 000)
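The arithmetic behind those three figures can be sketched as follows; the 1 000 000× speedup factor is the slide's assumption, and the names here are illustrative.

```java
// Sketch: how much larger an input each complexity class can handle
// given a machine that is 1 000 000 times faster (assumed factor).
public class SpeedupPerspective {
    static final double SPEEDUP = 1_000_000.0;

    // O(n): work scales linearly, so n grows by the full speedup factor.
    static double linearGrowth() { return SPEEDUP; }

    // O(n^2): (k*n)^2 = SPEEDUP * n^2  =>  k = sqrt(SPEEDUP) = 1000.
    static double quadraticGrowth() { return Math.sqrt(SPEEDUP); }

    // O(2^n): 2^(n+d) = SPEEDUP * 2^n  =>  d = log2(SPEEDUP), about 20.
    static double exponentialGain() { return Math.log(SPEEDUP) / Math.log(2); }
}
```

The key contrast: for O(n) and O(n²) the speedup multiplies the feasible input size, while for O(2ⁿ) it only adds a constant to it.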
Be Aware • Order notation loses information • O(n²) might really be 0.0005n² + 3000n − 6 • If the constants have extreme values and n is small, we may be better off with a simpler, less efficient algorithm
Simple Examples • Very few O(1) algorithms • single sequence of statements, no loops or recursion for (int k = 0; k < 10; k++) // do something • Also O(1) • the loop runs a fixed 10 times, with no dependence on the input data
Simple Examples (cont.) for (int k = 0; k < DATA_LENGTH; k++) // do something to the k'th data element • O(n) for (int k = 0; k < DATA_LENGTH; k++) for (int j = 0; j < DATA_LENGTH; j++) // do something to the k'th element and // the j'th element • O(n²)
Simple Examples (cont.) • O(n), where n is the total number of elements • Data is simply arranged in a matrix, so each element is visited exactly once for (int r = 0; r < NUMBER_OF_ROWS; r++) for (int c = 0; c < NUMBER_OF_COLS; c++) // do something to the (r,c)'th element
Simple Examples (cont.) for (int k = 0; k < DATA_LENGTH; k++) for (int j = 0; j < (DATA_LENGTH-k); j++) // do something to the k'th element and // the j'th element • First time around the outer loop: n iterations • Second time: n-1 • Third time: n-2 • ... • Last time: 1
Example for (int k = 0; k < DATA_LENGTH; k++) for (int j = 0; j < (DATA_LENGTH-k); j++) // do something to the k'th element and // the j'th element • Total number: n + (n−1) + (n−2) + … + 2 + 1 = n(n+1)/2 = n²/2 + n/2 • The dominant term is n²/2, so this is O(n²)
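The closed form n(n+1)/2 can be verified by actually counting the inner-loop iterations of the triangular loop above; the method name below is my own.

```java
// Sketch: counting the inner-loop iterations of the triangular loop
// and checking the total against the closed form n(n+1)/2.
public class TriangularCount {
    static long iterations(int n) {
        long count = 0;
        for (int k = 0; k < n; k++)
            for (int j = 0; j < n - k; j++)
                count++;               // stand-in for "do something"
        return count;
    }

    public static void main(String[] args) {
        int n = 1000;
        // n + (n-1) + ... + 1 = n(n+1)/2
        assert iterations(n) == (long) n * (n + 1) / 2;
        System.out.println(iterations(n));  // 500500 for n = 1000
    }
}
```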
Complexity: A Real Example • Searching through a list for a value • n is the number of elements in the list • Worst case? • The value cannot be found • Need exactly n comparisons: O(n)
Searching (cont.) • Best case? • It’s the first element! • Unlikely! • Needs only 1 comparison: O(1)
Searching (cont.) • Average case? • Need some assumptions about what “average” means • Assume the value is always found • sometimes near the start • sometimes near the end • On average: halfway through the list • Needs n/2 comparisons: O(n)
Searching (cont.) • Assumption: • The value is found only half of the time • Half the time we need n comparisons • Half the time we need n/2 (on average) • Overall: (n + n/2)/2 = 3n/4 • Still O(n)
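The cases analysed above can be observed directly with a linear search that counts its comparisons; this is a sketch, and the class, method, and field names are illustrative rather than from the slides.

```java
// Sketch: linear search instrumented with a comparison counter, so the
// best case (1 comparison), worst case (n), and not-found case (n) are
// all observable. Names here are illustrative.
public class CountingSearch {
    static int comparisons;            // set by the most recent find()

    static int find(int[] data, int value) {
        comparisons = 0;
        for (int i = 0; i < data.length; i++) {
            comparisons++;             // one comparison per element examined
            if (data[i] == value) return i;
        }
        return -1;                     // value not found: exactly n comparisons
    }
}
```

For example, on the list {5, 3, 8, 1}, searching for 5 takes 1 comparison (best case), while searching for 9 takes all 4 (the not-found worst case).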
Conclusion • Objective, compiler- and computer-independent ways of comparing algorithms are essential • Order notation provides this