
Efficiency and Analysis of Algorithms in CS Design

Learn about the efficiency and analysis principles of algorithms in CS Design, including worst case and average time analysis, elementary operations, and algorithm types.



Presentation Transcript


  1. CS 615: Design & Analysis of Algorithms Chapter 2: Efficiency of Algorithms

  2. Course Content • Introduction, Algorithmic Notation and Flowcharts (Brassard & Bratley, Chap. 3) • Efficiency of Algorithms (Brassard & Bratley, Chap. 2) • Basic Data Structures (Brassard & Bratley, Chap. 5) • Sorting (Weiss, Chap. 7) • Searching (Brassard & Bratley, Chap. 9) • Graph Algorithms (Weiss, Chap. 9) • Randomized Algorithms (Weiss, Chap. 10) • String Searching (Sedgewick, Chap. 19) • NP Completeness (Sedgewick, Chap. 40) CS 615 Design & Analysis of Algorithms

  3. Definitions • Problem: • A situation to be solved by an algorithm • Example: • Multiply two integers • Instance: • A particular case of the problem • Example: • Multiply(981, 1234) • An algorithm must work correctly • on every instance of the problem it claims to solve • To prove an algorithm is not correct • find an instance on which the algorithm fails to give a correct answer CS 615 Design & Analysis of Algorithms

  4. Efficiency of Algorithms • To decide which algorithm to choose: • Empirical Approach • Program the competing algorithms • Try them on different instances • with the help of the computer(s) • Resources: • Computing time • Storage space • Number of processors (for parallel algorithms) CS 615 Design & Analysis of Algorithms

  5. Efficiency of Algorithms • To decide which algorithm to choose: • Theoretical Approach • Using formal methods to analyze the efficiency • Does not depend on the computer • No need to do any programming CS 615 Design & Analysis of Algorithms

  6. Efficiency of Algorithms • To decide which algorithm to choose: • Hybrid Approach • Describe the algorithm’s efficiency function theoretically • Empirically determine the numerical parameters • for a particular machine • Predict the time an actual implementation will take • to solve an instance CS 615 Design & Analysis of Algorithms

  7. Principle of Invariance • Two different implementations of an algorithm • will not differ in efficiency • by more than some multiplicative constant • Example • If the constant is 5: • if the first implementation • takes 1 second to solve an instance • then a second implementation (possibly on a different machine) • will not take more than 5 seconds CS 615 Design & Analysis of Algorithms

  8. Principle of Invariance • For an instance of size n • Implementation 1: • takes time t1(n) • Implementation 2: • takes time t2(n) • There always exist positive constants c and d such that • t1(n) ≤ c*t2(n) and • t2(n) ≤ d*t1(n) • whenever n is sufficiently large CS 615 Design & Analysis of Algorithms

  9. Results of the Principle of Invariance • A change in the implementation of the same algorithm • can only change the efficiency by a constant factor • The principle does not depend on • the computer we use • the compiler we use • the skill of the programmer • If we want a radical change in efficiency • we need to change the algorithm itself CS 615 Design & Analysis of Algorithms

  10. Theoretical Efficiency • For a given function t • an algorithm for some problem takes a time in the order of t(n) • if there exists a positive constant c such that • the algorithm is capable of solving every instance of size n • in not more than c*t(n) seconds (the time unit is irrelevant by the principle of invariance) • For numerical problems • n may sometimes be the value • rather than the size of the instance CS 615 Design & Analysis of Algorithms

  11. Algorithm Types • The time it takes to solve an instance of size n by a • Linear algorithm is • never greater than c*n • Quadratic algorithm is • never greater than c*n^2 • Cubic algorithm is • never greater than c*n^3 • Polynomial algorithm is • never greater than n^k • Exponential algorithm is • never greater than c^n where c & k are appropriate constants CS 615 Design & Analysis of Algorithms
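
To make these growth rates concrete, here is a small sketch (assumed Python, not part of the original slides) that tabulates n, n^2, n^3 and 2^n for a few instance sizes; the exponential column quickly dwarfs the others.

  # Illustrative only: compare how the bounds from the slide grow with n.
  for n in (10, 20, 40, 80):
      print(n, n**2, n**3, 2**n)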

  12. Worst Case Analysis • Analyse the algorithm • considering only the cases that take • the maximum amount of time • If the algorithm has a worst-case time in the order of t(n) • then no case of size n takes more than • c*t(n) • Useful if the algorithm is to be applied to cases where • an upper bound on its running time must be guaranteed • Example: • Response time for a nuclear power plant. CS 615 Design & Analysis of Algorithms

  13. Average Time Analysis • If the algorithm is going to be used many times • it is useful to know • the average execution time • on instances of size n • It is harder to analyse the average case • the distribution of the instances must be known • Insertion sort’s average time is • in the order of n^2 CS 615 Design & Analysis of Algorithms

  14. Elementary Operation • Is one whose execution time is • bounded above by a constant • The constant does not depend on • the size • or other parameters of the instance considered • Example • x = y + w*z: is it an elementary operation? • Suppose • ta: time to execute an addition (constant) • tm: time to execute a multiplication (constant) • ts: time to execute an assignment (constant) • t: time required to execute a additions, m multiplications, and s assignments • t ≤ a*ta + m*tm + s*ts where a, m, s are constants • t ≤ max(ta, tm, ts) * (a + m + s) CS 615 Design & Analysis of Algorithms
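
As a worked instance of the bound above (an illustration, not in the original slides): for x = y + w*z there is one addition, one multiplication and one assignment, so t ≤ ta + tm + ts ≤ 3 * max(ta, tm, ts), a constant that does not depend on the instance; this is why the statement counts as an elementary operation.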

  15. Elementary Operation • A single line of a program • may correspond to a variable number of elementary operations • x = min{T[i] | 1 ≤ i ≤ n} • The time required to compute the minimum • increases with n • min() is not an elementary operation! CS 615 Design & Analysis of Algorithms
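
A minimal sketch (assumed Python, not part of the original slides) of what that single line hides: a loop whose number of comparisons grows with n, the size of T.

  def array_min(T):
      # One comparison per remaining element, about n - 1 operations in total,
      # so the cost of this "single line" grows with the size of T.
      m = T[0]
      for x in T[1:]:
          if x < m:
              m = x
      return m

  x = array_min([7, 3, 9, 1, 5])   # x == 1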

  16. Elementary Operation • Addition, multiplication: • Normally • the time required to compute • an addition or multiplication • depends on the length of the operands • But • it is reasonable to assume • addition and multiplication are elementary operations • when the operands are of fixed length CS 615 Design & Analysis of Algorithms

  17. Some Algorithm Examples • Calculating Determinants • Sorting • Multiplication of Large Integers • Calculating the Greatest Common Divisor • Calculating Fibonacci Sequences • Fourier Transforms CS 615 Design & Analysis of Algorithms

  18. Calculating Determinants • Algorithm based on the recursive definition • To compute the determinant of an n x n matrix • takes time proportional to n! • Worse than taking exponential time • Experiments: • 5 x 5 matrix 20 sec. • 10 x 10 matrix 10 min. • Estimation • 20 x 20 matrix 10 million years. • Gauss-Jordan Elimination • To compute the determinant of an n x n matrix • takes time proportional to n^3 • Experiments: • 10 x 10 matrix 0.01 sec. • 20 x 20 matrix 0.05 sec. • 100 x 100 matrix 5.5 sec. CS 615 Design & Analysis of Algorithms
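
A compact sketch of both approaches (assumed Python, added for illustration): cofactor expansion along the first row does roughly n! work, while elimination does about n^3 operations.

  def det_cofactor(A):
      # Recursive cofactor expansion: each call makes n calls on (n-1) x (n-1)
      # minors, about n! steps in total.
      n = len(A)
      if n == 1:
          return A[0][0]
      total = 0.0
      for j in range(n):
          minor = [row[:j] + row[j+1:] for row in A[1:]]
          total += (-1) ** j * A[0][j] * det_cofactor(minor)
      return total

  def det_gauss(A):
      # Elimination without pivoting (assumes no zero pivot appears): about n^3 steps.
      A = [row[:] for row in A]
      n = len(A)
      det = 1.0
      for k in range(n):
          det *= A[k][k]
          for i in range(k + 1, n):
              factor = A[i][k] / A[k][k]
              for j in range(k, n):
                  A[i][j] -= factor * A[k][j]
      return det

  print(det_cofactor([[2, 1], [1, 3]]), det_gauss([[2, 1], [1, 3]]))   # 5.0 5.0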

  19. Sorting • Arranging n objects based on • the “ordering function” defined for these objects • No comparison-based sorting algorithm is • faster than order of n log n • Insertion sort • takes time proportional to n^2 • Experiment: • Sorting 1000 elements takes 3 sec. • Estimation: • Sorting 100 000 elements would take 9.5 hrs. • Selection sort • takes time proportional to n^2 CS 615 Design & Analysis of Algorithms
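
A short insertion sort sketch (assumed Python, added for illustration); the nested shifting loop is what makes the average and worst cases grow like n^2.

  def insertion_sort(a):
      # In-place insertion sort: element i is shifted left until it is in order.
      for i in range(1, len(a)):
          key = a[i]
          j = i - 1
          while j >= 0 and a[j] > key:
              a[j + 1] = a[j]   # shift larger elements one slot to the right
              j -= 1
          a[j + 1] = key
      return a

  print(insertion_sort([5, 2, 4, 6, 1, 3]))   # [1, 2, 3, 4, 5, 6]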

  20. Sorting • Heapsort • takes time proportional to n log n • even in the worst case • Mergesort • takes time proportional to n log n • even in the worst case • Quicksort • takes time proportional to n log n on average • (its worst case is n^2) • Experiment: • Sorting 1000 elements takes 0.2 sec. • Sorting 100 000 elements takes 30 sec. CS 615 Design & Analysis of Algorithms
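
A merge sort sketch (assumed Python, added for illustration): the array is halved about lg n times and each level costs linear work to merge, giving n log n even in the worst case.

  def merge_sort(a):
      if len(a) <= 1:
          return a
      mid = len(a) // 2
      left = merge_sort(a[:mid])
      right = merge_sort(a[mid:])
      # Merge the two sorted halves in linear time.
      merged, i, j = [], 0, 0
      while i < len(left) and j < len(right):
          if left[i] <= right[j]:
              merged.append(left[i]); i += 1
          else:
              merged.append(right[j]); j += 1
      return merged + left[i:] + right[j:]

  print(merge_sort([5, 2, 4, 6, 1, 3]))   # [1, 2, 3, 4, 5, 6]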

  21. Multiplication of Large Integers • When multiplying large integers • the operands might become too large • to hold in a single word • Assume two large integers • of sizes m and n are to be multiplied • Classic algorithm: multiply each digit of one integer • by each digit of the other • takes time proportional to m x n • More efficient algorithms: • Divide-and-conquer: • takes time proportional to n x m^lg(3/2) • ≈ n x m^0.59 • where m is the size of the smaller integer CS 615 Design & Analysis of Algorithms
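
A divide-and-conquer multiplication sketch in the Karatsuba style (assumed Python, added for illustration): replacing four half-size products with three is what yields the m^lg(3/2) ≈ m^0.59 factor instead of m.

  def karatsuba(x, y):
      # Divide-and-conquer integer multiplication with three recursive products.
      if x < 10 or y < 10:
          return x * y
      half = max(len(str(x)), len(str(y))) // 2
      base = 10 ** half
      xh, xl = divmod(x, base)
      yh, yl = divmod(y, base)
      hi = karatsuba(xh, yh)
      lo = karatsuba(xl, yl)
      mid = karatsuba(xh + xl, yh + yl) - hi - lo
      return hi * base * base + mid * base + lo

  print(karatsuba(981, 1234))   # 1210554, the instance from the Definitions slide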

  22. Calculating the Greatest Common Divisor • Denoted by gcd(m, n) • Finding the largest integer that • divides both m and n exactly • gcd(6, 15) = 3 • gcd(10, 21) = 1 • The naive gcd algorithm takes time of order n • Euclid’s algorithm takes time of order log n

  function gcd(m, n)
    i = min(m, n) + 1
    repeat i = i - 1
    until i divides both m and n exactly
    return i

  function Euclid(m, n)
    while m > 0 do
      t = m
      m = n mod m
      n = t
    return n

  CS 615 Design & Analysis of Algorithms
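
The same two routines translated to runnable code (assumed Python, added for illustration):

  def gcd_naive(m, n):
      # Try every candidate downwards from min(m, n): order of n divisions.
      i = min(m, n) + 1
      while True:
          i -= 1
          if m % i == 0 and n % i == 0:
              return i

  def gcd_euclid(m, n):
      # Euclid's algorithm: the first argument shrinks quickly, order of log n steps.
      while m > 0:
          m, n = n % m, m
      return n

  print(gcd_naive(6, 15), gcd_euclid(10, 21))   # 3 1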

  23. Calculating Fibonacci Sequences • Fibonacci Sequence: • f0 = 0 • f1 = 1 • fn = fn-1 + fn-2 • The running time of Fibrec is in the order of fn itself (exponential) • The running time of Fibiter is in the order of n

  function Fibrec(n)
    if n < 2 then return n
    else return Fibrec(n - 1) + Fibrec(n - 2)

  function Fibiter(n)
    i = 1; j = 0
    for k = 1 to n do
      j = i + j
      i = j - i
    return j

  CS 615 Design & Analysis of Algorithms
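
The same two functions in runnable code (assumed Python, added for illustration):

  def fib_rec(n):
      # Direct recursion: the call tree has about fib(n) leaves, exponential time.
      if n < 2:
          return n
      return fib_rec(n - 1) + fib_rec(n - 2)

  def fib_iter(n):
      # Iterative version: n additions, linear time.
      i, j = 1, 0
      for _ in range(n):
          j = i + j
          i = j - i
      return j

  print(fib_rec(10), fib_iter(10))   # 55 55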

  24. Fourier Transforms • One of the most useful algorithms in history • Used in • Optics • Acoustics • Quantum physics • Telecommunications • System theory • Signal processing • Speech processing • Example • Used to analyze data from • the 1964 Alaska earthquake • The classic algorithm took 26 minutes of computation • The new algorithm (the fast Fourier transform, FFT) needed less than 2.5 seconds CS 615 Design & Analysis of Algorithms
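
A minimal recursive radix-2 FFT sketch (assumed Python, added for illustration): the classic transform performs about n^2 multiplications, while this divide-and-conquer version performs about n log n.

  import cmath

  def fft(a):
      # Recursive Cooley-Tukey FFT; the length of a must be a power of two.
      n = len(a)
      if n == 1:
          return list(a)
      even = fft(a[0::2])
      odd = fft(a[1::2])
      out = [0j] * n
      for k in range(n // 2):
          w = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
          out[k] = even[k] + w
          out[k + n // 2] = even[k] - w
      return out

  print(fft([1, 1, 1, 1, 0, 0, 0, 0])[0])   # ≈ (4+0j), the sum of the inputs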

  25. End of Chapter 2: Efficiency of Algorithms CS 615 Design & Analysis of Algorithms
