Analysis of Algorithms
Two Approaches for measuring running time of a program • Benchmarking • A small collection of typical inputs that can serve as performance standards • Analysis • Determining the general time a program takes as a function of input size
Running Time • T(n) • a function to represent the units of time taken by a program with input of size n • T(n) ≈ # statements executed
Running Time • T_w(n) • worst-case running time • maximum running time among all inputs of size n • T_avg(n) • average running time • average running time over all inputs of size n • more realistic; harder to compute
Calculate T(n)
small = i;
for (j = i+1; j <= n; j++) {
  if (A[j] < A[small])
    small = j;
}
• Count 1 unit of time for each assignment and comparison
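One possible tally for the fragment above, counting 1 unit for the initial assignment and, per pass of the loop, 1 for the comparison plus (in the worst case) 1 for the assignment, and ignoring loop-control costs: the loop body runs n - i times, so T(n) ≤ 1 + 2(n - i), which grows linearly, i.e. O(n).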
Big-O notation • A tool for analyzing a program's efficiency • An approximation of the work an algorithm performs, as a function of the size of its input
Work • A measure of the effort expended by the computer in performing a computation
Big-O • f(x) is O(g(x)) • if there are constants C and k such that |f(x)| ≤ C|g(x)| whenever x > k
Show f(x) = x² + 2x + 1 is O(x²) • Prove: • |x² + 2x + 1| ≤ C|x²| • Choose k > 0, so f(x) is always positive • Let C be the sum of the coefficients of f(x) • x² + 2x + 1 ≤ 1x² + 2x² + 1x² = 4x² • This inequality holds beginning at x = 1 • Choosing k = 1 and C = 4, x² + 2x + 1 is O(x²)
Show f(x) = 7x³ is not O(x²) • Attempt to prove: • |7x³| ≤ C|x²| • Assuming k > 0, divide both sides by x² • 7x ≤ C • No such C exists, since x is arbitrarily large • (Choose any C, and there is an x > C) • ∴ 7x³ is not O(x²)
Commonly used Big-O values (orders of magnitude)
O(1)        Constant time      (lower)
O(log n)    Logarithmic time
O(n)        Linear time
O(n log n)
O(n²)       Quadratic time
O(n³)       Cubic time
O(nᵏ)       Polynomial time
O(2ⁿ)       Exponential time
O(n!)       Factorial time     (higher)
A lower order is always O(a higher order) for n ≥ 1
Big-O Example algorithm • O(1) Assigning a value to the ith array element; always the same number of steps; not necessarily short: a program that takes 320 steps regardless of its input values is still O(1) • O(log n) Binary search; "What power of 2 is greater than an input number?"; a loop whose index is successively halved or doubled (see the sketch below)
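A minimal binary-search sketch of the O(log n) case, assuming a sorted int array (the name binarySearch and its parameters are illustrative, not from the slides); each pass halves the remaining range, so the loop executes about log₂ n times:
int binarySearch(const int A[], int n, int key) {
  int lo = 0, hi = n - 1;
  while (lo <= hi) {               // range shrinks by half each pass
    int mid = lo + (hi - lo) / 2;
    if (A[mid] == key)
      return mid;                  // found
    else if (A[mid] < key)
      lo = mid + 1;                // discard lower half
    else
      hi = mid - 1;                // discard upper half
  }
  return -1;                       // not found
}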
Big-O Example algorithm • O(n) Printing all elements of an array; searching an unordered array; a loop that executes 1 to N times (where N is the input size, the number of data values being processed) • O(n log n) Faster sorts; a loop whose index is halved or doubled, nested inside a loop executing from 1 to N (see the sketch below)
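A loop-skeleton sketch of the O(n log n) pattern just described (illustrative only): the outer loop runs n times, and the inner loop halves its index, so it runs about log₂ n times, giving roughly n * log n increments of x:
for (int i = 0; i < n; i++)        // n iterations
  for (int j = n; j > 0; j /= 2)   // about log n iterations each
    x++;                           // total: O(n log n)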
Big-O Example algorithm • O(n²) Slower sorts; a loop that executes from 1 to N times inside a loop executing from 1 to N times • O(n³) Incrementing all the elements in an N×N×N array; a 1..N loop inside a 1..N loop inside a 1..N loop (see the sketch below) • O(nᵏ) k levels of loops inside loops
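A sketch of the cubic case, assuming A is an n x n x n array of ints declared elsewhere (illustrative):
for (int i = 0; i < n; i++)        // n iterations
  for (int j = 0; j < n; j++)      //  x n iterations
    for (int k = 0; k < n; k++)    //   x n iterations
      A[i][j][k]++;                // n*n*n increments = O(n³)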
Big-O Example algorithm • O(2ⁿ) Listing all the subsets of a set with n elements (see the sketch below); practical only with small values of n; n = 64 takes 5 years on a supercomputer • O(n!) Listing all the possible arrangements of a set with n elements
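A minimal sketch of listing all subsets with bitmasks, assuming n is small enough that the masks fit in a long (the name listSubsets is illustrative, not from the slides); there are 2ⁿ masks, so the loop alone is O(2ⁿ):
void listSubsets(int n) {
  for (long mask = 0; mask < (1L << n); mask++) {   // one mask per subset: 2^n of them
    cout << "{ ";
    for (int i = 0; i < n; i++)
      if (mask & (1L << i))                         // element i belongs to this subset
        cout << i << " ";
    cout << "}" << endl;
  }
}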
[Growth-rate chart comparing the n log n, n², 2ⁿ, and n! curves]
Computing order of complexity of an algorithm • a.) Multiplicative constants do not matter. • O(3n) = O(n) • b.) Addition is done by taking the max. • O(n2) + O(n) = O(max(n2, n)) = O(n2) • O(n3) + O(log n) = O(max(n3, log n)) = O(n3) • c.) Multiplication remains multiplication. • O(n) * O(log n) = O(n log n) • O(n) * O(n2) = O(n3)
Example Complexities • 4n³ + 20n + 30 = O(n³) • n + 10000 = O(n) • 4n⁴ + 20n + 30 = O(n⁴) • 2ⁿ + n³ = O(2ⁿ) • 200 = O(1)
What is the Big-O of each of the following? • output • input • assignment • if • if-else • a block • a while loop • for loops • nested for loops
Running Time of Statements • Simple statements (e.g. initialization of variables) are O(1) • Loops are O(g(n) · f(n)) • g(n) is an upper bound on the number of loop iterations • f(n) is an upper bound on the running time of the loop body (if g(n) and f(n) are both constant, the loop runs in constant time)
Running Time of Statements (continued) • Conditional statements are O(max(f(n), g(n))), where f(n) is an upper bound on the then part and g(n) is an upper bound on the else part • Blocks of statements with complexities f₁(n), f₂(n), ..., fₖ(n) have complexity O(f₁(n) + f₂(n) + ... + fₖ(n))
Simple Analysis
cin >> n; // 1
if (n > 20) // 1
  cout << "Number is > 20" << endl; // 1
else
  cout << "Number is <= 20" << endl; // 1
T(n) = 2 + max(1, 1) = 3 = O(1)
Example Analysis
cin >> n; // 1
factorial = 1; // 1
for (i = 2; i <= n; i++) // 1 init + 1 test + (1 test + 1 inc) per iteration
  factorial *= i; // 1
cout << factorial; // 1
T(n) = 5 + 3*(n-1) = O(n)
Another example
cin >> n; // 1
if (n > 0) { // 1
  factorial = 1; // 1
  for (i = 2; i <= n; i++) // 1 init + 1 test + (1 test + 1 inc) per iteration
    factorial *= i; // 1
  cout << factorial; // 1
}
else
  cout << "Can't do factorial of " << n; // 1
T(n) = 2 + max(4 + 3*(n-1), 1) = O(n)
Analysis of simple function calls
int factorial(int n) {
  int fact = 1;
  for (int i = 2; i <= n; i++)
    fact *= i;
  return fact;
}
int main() {
  int n; // 1
  cin >> n; // 1
  cout << factorial(n) << endl; // O(n)
}
main is O(n)
Analysis of Nested Loops
EXAMPLE 1:
for (int i = 0; i < n; i++) // n*O(n) = O(n²)
  for (int j = 0; j < n; j++) // n*O(1) = O(n)
    x++; // O(1)
EXAMPLE 2:
for (int i = 0; i < n; i++)
  for (int j = i; j < n; j++)
    x++; // O(1)
T(n) = n + (n-1) + (n-2) + … + 1 = n(n+1)/2 = O(n²)
Summing blocks of statements
for (int i = 0; i < n; i++) // O(n²)
  for (int j = 0; j < n; j++)
    x++;
// Next block follows in same program
for (int i = 0; i < n; i++) // O(n)
  x++;
T(n) = O(n²) + O(n) = O(n² + n) = O(n²)
Loops with different limits
for (int i = 0; i < n; i++) // n*O(m) = O(mn)
  for (int j = 0; j < m; j++) // m*O(1) = O(m)
    x++; // O(1)
Recursive Algorithms Complexity of a recursive algorithm is the product of • Amount of work done on any one level • Number of recursive calls
Recursive Factorial
int factorial(int n) {
  if (n <= 1)
    return 1;
  else
    return n * factorial(n - 1);
}
int main() {
  int n; // 1
  cin >> n; // 1
  cout << factorial(n) << endl; // O(n)
}
main is O(n)
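One way to see why this is O(n), assuming each call to factorial does a constant amount c of work besides the recursive call: T(n) = T(n - 1) + c with T(1) = c, which unrolls to T(n) = c·n, i.e. O(n): n levels of recursion times O(1) work per level.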
More complex loops
for (i = 1; i < n; i *= 2)
  x++;
for (i = n; i > 0; i /= 2)
  x++;
Repetitive halving or doubling results in logarithmic complexity. Both loops are O(log n)
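To see why: in the doubling loop, after k passes i = 2ᵏ, and the loop stops once 2ᵏ ≥ n, so it runs about log₂ n times; the halving loop behaves symmetrically.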
Worst-Case and Average-Case Analysis • Worst-case running time is a guarantee that holds over all inputs of a given size • Average-case running time is the running time averaged over all possible inputs of a given size
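For example, a linear search of an unordered n-element array for a value known to be present examines all n elements in the worst case but only about n/2 on average over all positions; both are O(n), but the average-case constant is smaller.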
What is the Big-O of each of the following? • output • input • assignment • if • if-else • a block • a while loop • for loops • nested for loops