Defining Efficiency • Asymptotic complexity: how the running time scales as a function of the size of the input • Two problems: • What is the "input size"? • How do we express the running time? (the Big-O notation and its associates)
Input Size • Usually: length (in characters) of the input • Sometimes: number of "items" (e.g., numbers in an array, nodes in a graph) • Input size is denoted n • Which inputs (there are many inputs of a given size)? • Worst case: look at the input of size n that causes the longest running time; this tells us how well the algorithm is guaranteed to perform • Best case: look at the input on which the algorithm is fastest; it is rarely relevant • Average case: useful in practice, but there are technical problems here (next)
Average Case Analysis • Assume inputs are randomly distributed according to some "realistic" distribution • Compute the expected running time • Drawbacks: • It is often hard to define a realistic random distribution • The math is usually hard to carry out
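Example (added for illustration): for a linear search in which the key is equally likely to be at any of the n positions, the expected number of comparisons is (1 + 2 + … + n)/n = (n + 1)/2, which is Θ(n), the same order as the worst case.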
Input Size: Example • The function find(x, v, n) finds v in the array x[1..n] • Input size: n (the length of the array) • T(n) = "running time for size n" • But T(n) needs clarification: • Worst case T(n): the algorithm runs in at most T(n) time for any x, v • Best case T(n): it takes at least T(n) time for any x, v • Average case T(n): the average time over all v and x
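A minimal C++ sketch of find (an assumption: the slides never show its body, and this version uses 0-based indexing x[0..n-1] instead of x[1..n]):

    // Linear search: returns the index of v in x[0..n-1], or -1 if absent.
    // Worst case: v is missing (or last), so all n elements are examined.
    // Best case: v sits at x[0], so a single comparison suffices.
    int find(const int x[], int v, int n) {
        for (int i = 0; i < n; i++)
            if (x[i] == v)
                return i;   // best case triggers here immediately
        return -1;          // worst case: the whole array was scanned
    }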
Amortized Analysis • Instead of a single execution of the algorithm, consider a sequence of consecutive executions • This is interesting when the running time of one execution depends on the results of previous ones • Perform worst-case analysis over the whole sequence of executions • Determine the average running time per execution in that sequence • We will illustrate this later in the course
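The classic illustration of amortized analysis (not taken from these slides) is appending to a dynamic array that doubles its capacity when full: a single append is occasionally O(n), but any sequence of n consecutive appends costs O(n) in total, so the amortized cost per append is O(1). A minimal C++ sketch; the class name DynArray is invented for this example:

    #include <cstddef>

    class DynArray {
        int*   data  = nullptr;
        size_t count = 0, cap = 0;
    public:
        void push(int v) {
            if (count == cap) {                  // rare, expensive execution:
                cap = (cap == 0) ? 1 : 2 * cap;  // double the capacity and
                int* bigger = new int[cap];      // copy count elements, O(count)
                for (size_t i = 0; i < count; i++)
                    bigger[i] = data[i];
                delete[] data;
                data = bigger;
            }
            data[count++] = v;                   // common, cheap execution: O(1)
        }
        ~DynArray() { delete[] data; }
    };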
Definition of Order Notation • Upper bound: T(n) = O(f(n)) ("Big-O") There exist constants c and n' such that T(n) ≤ c·f(n) for all n ≥ n' • Lower bound: T(n) = Ω(g(n)) ("Omega") There exist constants c and n' such that T(n) ≥ c·g(n) for all n ≥ n' • Tight bound: T(n) = Θ(f(n)) ("Theta") When both hold: T(n) = O(f(n)) and T(n) = Ω(f(n)) • Other notations: o(f), ω(f) (covered later)
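Worked example (added for illustration): to verify that 3n^2 + 5n = O(n^2), take c = 4 and n' = 5; for all n ≥ 5 we have 5n ≤ n^2, hence 3n^2 + 5n ≤ 4n^2 = c·n^2.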
Which Function Dominates?

    f(n) =                   g(n) =
    n^3 + 2n^2               100n^2 + 1000
    n^0.1                    log n
    n + 100n^0.1             2n + 10 log n
    5n^5                     n!
    n^-15 · 2^(n/100)        1000n^15
    8^(2 log n)              3n^7 + 7n

Question to class: is f = O(g)? Is g = O(f)?
Race I: f(n) = n^3 + 2n^2 vs. g(n) = 100n^2 + 1000 (f eventually dominates: the cubic term outgrows any quadratic, so g = O(f))
Race II: n^0.1 vs. log n (n^0.1 dominates: any positive power of n grows faster than log n)
Race III: n + 100n^0.1 vs. 2n + 10 log n (a tie: both are Θ(n))
Race IV: 5n^5 vs. n! (n! dominates: factorial growth beats any polynomial)
Race V: n^-15 · 2^(n/100) vs. 1000n^15 (the exponential dominates: 2^(n/100) eventually overwhelms any polynomial factor)
Race VI: 8^(2 log n) vs. 3n^7 + 7n (since 8^(2 log₂ n) = n^6, the polynomial 3n^7 + 7n dominates)
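To make a race concrete, here is a small experiment (not part of the original slides) that tabulates Race I; g is ahead for small n because of its large constants, but f overtakes it near n ≈ 98:

    #include <iostream>

    int main() {
        // Race I: f(n) = n^3 + 2n^2 vs. g(n) = 100n^2 + 1000.
        for (long long n = 1; n <= 1024; n *= 2) {
            long long f = n * n * n + 2 * n * n;
            long long g = 100 * n * n + 1000;
            std::cout << "n = " << n << ":  f = " << f << ",  g = " << g
                      << (f > g ? "  (f ahead)" : "  (g ahead)") << '\n';
        }
        return 0;
    }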
The moral: • Eliminate low-order terms • Eliminate constant coefficients
Common Names (from slowest to fastest growth):
• constant: O(1)
• logarithmic: O(log n)
• linear: O(n)
• log-linear: O(n log n)
• quadratic: O(n^2)
• exponential: O(c^n) (c is a constant > 1)
• hyperexponential: a tower of n exponentials, 2^(2^(…^2))
Other names:
• superlinear: Ω(n^c) (c is a constant > 1)
• polynomial: O(n^c) (c is a constant > 0)
Execution Time Comparison
O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(2^n) < O(n!) < O(2^(2n))

Running time (in seconds unless noted) for input sizes n = 10, 30, 50:

                n = 10      n = 30         n = 50
    O(log n)    0.00004     0.00009        0.00012
    O(n)        0.0001      0.0003         0.0005
    O(n^2)      0.01        0.09           0.25
    O(n^5)      0.1         24.3           5.2 minutes
    O(2^n)      0.001       17.9 minutes   35.7 years
    O(3^n)      0.059       6.5 years      2·10^8 centuries

An algorithm is infeasible if its running time is super-polynomial: it simply takes too long to compute.
Big-O Arithmetic
Rule 1 (transitivity): if T(n) = O(f(n)) and f(n) = O(g(n)), then T(n) = O(g(n))
Proof: we are given
    T(n) ≤ C1·f(n) for all n > n0'
    f(n) ≤ C2·g(n) for all n > n0''
Substituting C2·g(n) for f(n) gives
    T(n) ≤ C1·C2·g(n) = C3·g(n) for all n > max(n0', n0''), where C3 = C1·C2
Conclusion: T(n) = O(g(n))
Big-O Arithmetic (cont.)
Rule 3 (max rule: in a sum, drop the smaller term):
(a) If T1(n) = O(f(n)) and T2(n) = O(g(n)), then T1(n) + T2(n) = O(max(f(n), g(n)))
    Example: T1(n) = 2n^2 + 4n, T2(n) = 4n^3 + 2n
    T1(n) + T2(n) = O(4n^3 + 2n) = O(n^3)
(b) T1(n) · T2(n) = O(f(n) · g(n))
    Example: T1(n) = 3n^2, so f(n) = n^2; T2(n) = 4n^5, so g(n) = n^5
    T1(n) · T2(n) = 3n^2 · 4n^5 = 12n^7 = O(n^7)
Big-O Rules (cont.)
Rule 4: if T(n) is a polynomial of degree k (n is the variable, k the highest power), then T(n) = O(n^k)
    Examples: n^5 + 4n^4 + 8n^2 + 9 = O(n^5); 7n^3 + 2n + 19 = O(n^3)
Remarks:
• 2n^2 = O(n^3) and 2n^2 = O(n^2) are both correct, but O(n^2) is more accurate (tighter)
• Notations to avoid: O(2n^2), O(5), O(n^2 + 10n); drop constant factors and low-order terms inside the O
• The big-O notation is an upper bound
Big-O by Taking Limits
Determine relative growth rates by computing lim_{n→∞} f(n)/g(n):
(1) if the limit is 0, then f(n) = O(g(n)); more precisely, f(n) = o(g(n))
(2) if the limit is a constant C ≠ 0, then f(n) = O(g(n)) and g(n) = O(f(n)); more precisely, f(n) = Θ(g(n))
(3) if the limit is ∞, then f(n) = Ω(g(n)); more precisely, f(n) = ω(g(n))
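Worked example (added for illustration): lim_{n→∞} (n log n)/n^2 = lim_{n→∞} (log n)/n = 0, so n log n = o(n^2); this matches the ordering O(n log n) < O(n^2) in the comparison table above.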
General Rules for Computing Time Complexity
(1) O(1) if the execution time does not depend on the input data
(2) Apply addition if the code segments are placed consecutively
(3) Apply multiplication if the code segments are nested
(4) O(log n) if the number of remaining executions is repeatedly cut in half
(5) O(2^n) if the number of executions is repeatedly doubled
(6) Always consider the worst case
Big-O Examples (1)
(1) cin >> n;
    for (k = 1; k <= n; k++)
        for (j = 1; j <= k; j++)
            for (m = 1; m <= n + 10; m++)
                a = 10;
Time complexity: O(n^3)

(2) cin >> n;
    for (j = 0; j < myfunc(n); j++) {
        cout << j;
        for (k = 1; k <= n; k++)
            cout << j << k;
    }
Time complexity:
(a) O(n^2) if myfunc(n) returns n
(b) O(n log n) if myfunc(n) returns log n
Big-O Examples (2)
(3) cin >> n;
    total = 0;
    for (j = 1; j < n; j++) {
        cin >> x;
        total = x + total;
        if (total % 2 == 1)
            cout << total;
        else
            for (k = 1; k <= j; k++)
                cout << k;
    }
Time complexity: O(n^2) (worst case: the else branch runs up to j times on each of the n - 1 iterations)
Big-O Examples (3)
(4) float f(float n) {
        sum = 0;
        for (j = 1; j < n; j++)
            sum = sum + j;
        return sum;
    }
Time complexity: O(n)

(5) float g(float n) {
        sum = 0;
        for (j = 0; j < n; j++)
            sum = sum + f(j);
        return sum;
    }
Time complexity: O(n^2) (n calls to f, each costing O(n))
Big-O Examples (4)
(6) void h(int n) {
        for (j = 0; j < n; j++)
            cout << g(n) + f(n);
    }
Time complexity: O(n^3) (n iterations, each calling g(n), which is O(n^2))

(7) int j, n, k;
    cin >> n;
    power = 1;
    for (j = 1; j < n; j++)
        power = power * 2;
    for (k = 1; k < power; k++)
        cout << k;
Time complexity: O(2^n) (after the first loop, power = 2^(n-1), so the second loop runs about 2^(n-1) times)
Big-O Examples (5)
(8) cin >> n;
    x = 1;
    while (x < n)
        x = 2 * x;
Time complexity: O(log n) (x doubles on each iteration, so after k iterations x = 2^k; the loop stops once 2^k ≥ n, i.e., after about log2 n iterations)
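The same halving pattern gives the classic O(log n) algorithm; a minimal binary-search sketch, added here for illustration (it is not in the original slides):

    // Binary search in a sorted array x[0..n-1]: each iteration discards
    // half of the remaining range, so at most about log2(n) iterations run.
    int binarySearch(const int x[], int n, int v) {
        int lo = 0, hi = n - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;   // avoids overflow of (lo + hi)
            if (x[mid] == v) return mid;
            if (x[mid] < v)  lo = mid + 1;  // discard the left half
            else             hi = mid - 1;  // discard the right half
        }
        return -1;                          // v is not in the array
    }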