
Performance analysis of algorithms


Presentation Transcript


  1. Performance analysis of algorithms

  2. What is Programming? • Programming is to represent data and solve the problem using the data. • Data Structures • Algorithms

  3. What is Data Structure? • Definition • It is a way of collecting and organizing data in a computer. • An aggregation of atomic and composite data into a set with defined relationships. • Classification: Data Structures divide into Primitive (Integer, Float, Character) and Non-Primitive (Arrays, Files, plus the Linear structures Lists, Stacks, Queues and the Non-Linear structures Trees, Graphs).

  4. What is Algorithm? • Definition: a finite set of instructions that should satisfy the following: • Input: zero or more inputs • Output: at least one output • Definiteness: clear and unambiguous instructions • Finiteness: terminating after a finite number of steps • Effectiveness (machine-executable): simple enough to be carried out • In computational theory, algorithm and program are different • A program need not satisfy Finiteness; e.g., an OS runs until it is shut down

  5. Algorithm Specification • How to express algorithms • High-level description • Natural language • Graphic representation, e.g., flowcharts • Pseudocode: informal language-independent description • Implementation description • C, C++, Java, etc.

  6. Natural Language vs. Graphic Chart • Example: Selection Sort From those integers that are currently unsorted, find the smallest value. Place it next in the sorted list.

  7. Pseudocode (C-like Language) • Example: Selection Sort for (i=0; i<n; i++) { Examine numbers in list[i] to list[n-1]. Suppose that the smallest integer is at list[min]. Interchange list[i] and list[min]. }

  8. Implementation in C • Example: Selection Sort void sort(int list[], int n) { int i, j, min; for (i = 0; i < n - 1; i++) { min = i; for (j = i + 1; j < n; j++) if (list[j] < list[min]) min = j; SWAP(list[i], list[min]); } }

  9. Algorithm Analysis • How do we evaluate algorithms? • Does the algorithm use the storage efficiently? • Is the running time of the algorithm acceptable for the task? • Performance analysis • Estimating machine-independent time and space int search(int arr[], int len, int target) { for (int i = 0; i < len; i++) { if (arr[i] == target) return i; } return -1; } Space complexity: the amount of memory needed Time complexity: the amount of time taken for an algorithm to finish

  10. Space Complexity • Definition • (machine-independent) space required by an algorithm • Example int abc(int a, int b, int c) { return a + b + b*c + 4.0; } char* give_me_memory(int n) { char *p = malloc(n); return p; }

  11. Space Complexity • What is better for space complexity? float sum(float *list, int n) { float tempsum = 0; for (int i = 0; i < n; i++) tempsum += *(list + i); return tempsum; } float rsum(float *list, int n) { if (n > 0) return rsum(list, n - 1) + list[n - 1]; else return 0; }

  12. Time Complexity • Definition • (machine-independent) time required by an algorithm • Time is not easy to estimate! • Alternative • Count the number of program steps instead of time. 10 additions, 10 subtractions, 10 multiplications ⇒ 10·Ca + 10·Cs + 10·Cm (Ca: time for one addition, Cs: time for one subtraction, Cm: time for one multiplication) 10 additions, 10 subtractions, 10 multiplications ⇒ 30 steps

  13. Time Complexity • Program steps • Syntactically or semantically meaningful program segment whose execution time is independent of the number of inputs • Any one basic operation ⇒ one step • +, -, *, /, assignment, jump, comparison, etc. • Any combination of basic operations ⇒ one step • +=, *=, /=, (a+c*d), etc.

  14. Time Complexity • Example int abc(int a, int b, int c) { return a + b + b*c + 4.0; } void add(int a[][MAX_SIZE], ...) { int i, j; for (i = 0; i < rows; i++) for (j = 0; j < cols; j++) c[i][j] = a[i][j] + b[i][j]; }

  15. Time Complexity • What is better for time complexity? float sum(float *list, int n) { float tempsum = 0; for (int i = 0; i < n; i++) tempsum += *(list + i); return tempsum; } float rsum(float *list, int n) { if (n > 0) return rsum(list, n - 1) + list[n - 1]; else return 0; }

  16. Asymptotic Notation • Do we need to calculate exact numbers? • What matters is the growth rate as n increases. • The highest-order term is enough to represent the complexity. • Constants are not important. • We have three algorithms, A, B, and C, for the same problem. • The time complexity of A: n²+n+1 • The time complexity of B: n² • The time complexity of C: 200·n·log(n)

  17. Asymptotic Notation • What is better? • (10n+10) vs. (0.01n²+10) • (2000n+3) vs. (n·log n+1000) • (n³) vs. (10n²+1000000n) Simple rule: 1. Ignore any constants. 2. Compare only the term of the highest order.

  18. Asymptotic Notation • Three notations for complexity • Big-O notation: O(f(n)) • The complexity is not increasing faster than f(n). • f(n) is an upper bound of the complexity. • Big-Ω notation: Ω(f(n)) • The complexity is not increasing slower than f(n). • f(n) is a lower bound of the complexity. • Big-Θ notation: Θ(f(n)) • The complexity is increasing equally to f(n).

  19. Big-O Notation • Definition • f(n) = O(g(n)) ⇔ f(n) is not increasing faster than g(n) ⇔ for a large number n₀, c·g(n) will be larger than f(n) ⇔ there exist positive constants c and n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀ • Example • 3n + log n + 2 = O(n), because 3n + log n + 2 ≤ 5n for n ≥ 2 • 10n² + 4n + 2 = O(n⁴), because 10n² + 4n + 2 ≤ 10n⁴ for n ≥ 2

  20. Example: Asymptotic Notation • Three asymptotic notations for space complexity float sum(float *list, int n) { float tempsum = 0; for (int i = 0; i < n; i++) tempsum += *(list + i); return tempsum; } → O(1), Ω(1), Θ(1): 16 bytes float rsum(float *list, int n) { if (n > 0) return rsum(list, n - 1) + list[n - 1]; else return 0; } → O(n), Ω(n), Θ(n): 8·n bytes

  21. Example: Asymptotic Notation • Three asymptotic notations for time complexity float sum(float *list, int n) { float tempsum = 0; for (int i = 0; i < n; i++) tempsum += *(list + i); return tempsum; } → O(n), Ω(n), Θ(n): 2n+3 steps void add(int a[][MAX_SIZE], ...) { int i, j; for (i = 0; i < r; i++) for (j = 0; j < c; j++) c[i][j] = a[i][j] + b[i][j]; } → O(r·c), Ω(r·c), Θ(r·c): 2·r·c + 2·r + 1 steps

  22. Discussion: Asymptotic Notation • Big-O notation is widely used. • Big-Θ notation is the most informative, but the exact value is hard to know. • Big-Ω notation is the least informative. Why? • Big-O notation is good for rough description. There is algorithm A. The exact complexity is very hard to evaluate. However, we know that n² ≤ complexity ≤ n³. Then, we can say the complexity is O(n³).

  23. Comparison: Asymptotic Notation • Which is more costly? • O(1), O(log n), O(n log n), O(n²), O(n³), O(2ⁿ), O(n!), etc. O(1): constant O(log₂ n): logarithmic O(n): linear O(n·log₂ n): log-linear O(n²): quadratic O(n³): cubic O(2ⁿ): exponential O(n!): factorial

  24. Guideline for Asymptotic Analysis • Loops • The number of iterations * the running time of the statements inside the loop // executes n times for (int i = 0; i < n; i++) m = m + 1; // constant time, c // Total time = c * n = O(n) // outer loop executed n times for (int i = 0; i < n; i++) // inner loop executed n times for (int j = 0; j < n; j++) m = m + 1; // constant time, c // Total time = c * n * n = O(n²)

  25. Guideline for Asymptotic Analysis • Consecutive statements • Add the time complexities of each statement. // executes n times for (int i = 0; i < n; i++) m = m + 1; // constant time, c1 // outer loop executed n times for (int i = 0; i < n; i++) // inner loop executed n times for (int j = 0; j < n; j++) k = k + 1; // constant time, c2 // Total time = c1 * n + c2 * n² = O(n²)

  26. Guideline for Asymptotic Analysis • If-then-else statements • Consider the worst-case running time among the if, else-if, and else parts (whichever is larger). if (len > 0) // executes n times for (int i = 0; i < n; i++) m = m + 1; // constant time, c1 else { // outer loop executed n times for (int i = 0; i < n; i++) if (i > 0) k = k + 2; // constant time, c2 else // inner loop executed n times for (int j = 0; j < n; j++) k = k + 1; // constant time, c3 } // Total time = n * n * c3 = O(n²)

  27. Guideline for Asymptotic Analysis • Logarithmic complexity • An algorithm is O(log n) if it takes a constant time to cut the problem size by a fraction (usually by ½). // At the kth step, 2^k = n and we come out of the loop. for (int i = 1; i < n; i *= 2) m = m + 1; // constant time, c // Because k = log₂ n, total time = O(log n) // The same holds for a decreasing sequence. for (int i = n; i > 0; i /= 2) m = m + 1; // constant time, c // Because k = log₂ n, total time = O(log n)

  28. Recursion

  29. What is Recursion? • Definition • A repetitive process in which an algorithm calls itself #include <stdio.h> void Recursive(int n) { // Base case: termination condition! if (n == 0) return; printf("Recursive call: %d\n", n); Recursive(n - 1); }

  30. Example: Summing from 1 to n • Iterative vs. recursive programming int sum(int n) { int sum = 0; for (int i = 1; i <= n; i++) sum = sum + i; return sum; } int rsum(int n) { if (n == 0) return 0; else return rsum(n - 1) + n; }

  31. Designing Recursive Programming • Two parts • Base case: Solve the smallest problem directly. • Recursive case: Simplify the problem into smaller ones and formulate a recurrence relation. int S(int n) { if (n == 0) return 0; else return S(n - 1) + n; }

  32. Function Call/Return • Is the first version correct code? • No: without a base case it recurses forever, and eventually halts with a stack overflow when it runs out of (stack) memory. void Recursive(int n) { printf("Recursive call: %d\n", n); Recursive(n - 1); } void Recursive(int n) { // Base case: termination condition! if (n == 0) return; else { printf("Recursive call: %d\n", n); Recursive(n - 1); } }

  33. Example: Summing from 1 to n • How does recursive programming work? • How many calls/returns happen? int rsum(int n) { if (n == 0) return 0; else return rsum(n - 1) + n; } • Trace: call S(3) = S(2)+3 → call S(2) = S(1)+2 → call S(1) = S(0)+1 → call S(0) = 0, then return 0 → return 1 → return 3 → return 6 (four calls and four returns).

  34. Function Call/Return • How is stack memory changed? • When a function is called, its frame is pushed onto stack memory in sequence. • The return address is kept in the system stack. • All local variables are newly allocated in the system stack. • Calls push frames in order: call S(3) → call S(2) → call S(1) → call S(0), so the stack holds S(3), S(2), S(1), S(0).

  35. Function Call/Return • How is stack memory changed? • When a function returns, its frame is removed from stack memory. • All local variables are removed. • Control returns to the address kept in the stack. • Returns pop frames in reverse order: return 0 pops S(0), return 1 pops S(1), return 3 pops S(2), and return 6 pops S(3).

  36. Recursion vs. Iteration • Iteration • Terminate when a condition is proven to be false. • Each iteration does NOT require extra memory. • Some iterative solutions may not be as obvious as recursive solutions. • Recursion • Terminate when a base case is reached. • Each recursive call requires extra space on stack memory. • Some solutions are easier to formulate recursively.

  37. Summing Multiples of Three • Calculate the sum of all multiples of three from 0 to n. int sum(int n) { int sum = 0; for (int i = 0; i <= n; i += 3) sum = sum + i; return sum; } int rsum(int n) { if (n == 0) return 0; else if (n % 3 != 0) return rsum(n - n % 3); else return rsum(n - 3) + n; }

  38. Finding Maximum Number • Search for the maximum number in an array. int findMax(int* arr, int n) { int max = arr[0]; for (int i = 1; i < n; i++) { if (arr[i] > max) max = arr[i]; } return max; }

  39. Finding Maximum Number • Search for the maximum number in an array. int rfindMax(int* arr, int n) { if (n == 1) return arr[0]; else { int max = rfindMax(arr, n - 1); if (max < arr[n - 1]) return arr[n - 1]; else return max; } }

  40. Printing Reverse String • Print a string in a reverse manner. • rprint("abc") → cba, rprint("2a1bc") → cb1a2 void rprint(char* s, int n) { for (int i = n - 1; i >= 0; i--) printf("%c", s[i]); } void rrprint(char* s, int n) { if (n == 0) return; else { printf("%c", s[n - 1]); return rrprint(s, n - 1); } }

  41. Printing Binary Number • Print a binary number using recursion. • Note: Input a positive integer only. • binary(1) → 1, binary(3) → 11 • binary(10) → 1010, binary(109) → 1101101 void binary(int n) { if (n == 0) return; else { binary(n / 2); printf("%d", n % 2); } }

  42. Calculating the Power of X • Iterative vs. recursive programming int power(int x, int n) { int pow = 1; for (int i = 0; i < n; i++) pow *= x; return pow; } int rpower(int x, int n) { if (n == 0) return 1; else return x * rpower(x, n - 1); }

  43. Calculating the Power of X • How to implement recursion more efficiently? int rpower(int x, int n) { if (n == 0) return 1; else return x * rpower(x, n - 1); } int rpower2(int x, int n) { if (n == 0) return 1; else if (n % 2 == 0) { int m = rpower2(x, n / 2); return m * m; } else return x * rpower2(x, n - 1); }

  44. Calculating Fibonacci Numbers • Every number is the sum of the two preceding ones. • 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, … int fibo(int n) { if (n == 1 || n == 2) return 1; else { int prev = 1, cur = 1, next = 1; for (int i = 3; i <= n; i++) { prev = cur, cur = next; next = prev + cur; } return next; } } int rfibo(int n) { if (n == 1 || n == 2) return 1; else return rfibo(n - 1) + rfibo(n - 2); }

  45. Recursion for Fibonacci Numbers • How many calls happen? • [Call tree for fibo(7): each fibo(n) branches into fibo(n-1) and fibo(n-2), so fibo(7) triggers 25 calls in total, with small subproblems such as fibo(2) and fibo(3) recomputed many times.]

  46. Binary Search using Recursion • Compare the median value in the search space to the target value. • We can eliminate half of the search space at each step. • It will eventually be left with a search space consisting of a single element. int bsearch(int arr[], int low, int high, int target) { if (low > high) return -1; else { int mid = (low + high) / 2; if (target == arr[mid]) return mid; else if (target < arr[mid]) return bsearch(arr, low, mid - 1, target); else return bsearch(arr, mid + 1, high, target); } }

  47. Time Complexity for Recursion • How to calculate time complexity for recursion • T(n): the maximum amount of time taken on input of size n • Formulate a recurrence relation with sub-problems. int S(int n) { if (n == 0) return 0; else return S(n - 1) + n; } • Here T(n) = T(n - 1) + c with T(0) = c, which unfolds to T(n) = c·(n + 1) = O(n).

  48. Time Complexity for Recursion • Time complexity for binary search: each call does constant work and recurses on half the range, so T(n) = T(n/2) + c with T(1) = c, giving T(n) = O(log n). int bsearch(int arr[], int low, int high, int target) { if (low > high) return -1; else { int mid = (low + high) / 2; if (target == arr[mid]) return mid; else if (target < arr[mid]) return bsearch(arr, low, mid - 1, target); else return bsearch(arr, mid + 1, high, target); } }

  49. Recursion Tree • Visualizing how recurrences are iterated • The recursion tree expands a recurrence level by level, one node per recursive call. [Recursion-tree figure not included in the transcript.]

  50. Recursion Tree • In the recursion tree, • The depth of the tree does not really matter. • The amount of work at each level is decreasing so quickly that the total is only a constant factor more than the root.
