
DATA STRUCTURE



Presentation Transcript


  1. DATA STRUCTURE Instructor: Dai Min Office: XNA602 Fall 2006

  2. CHAPTER 1 Algorithms & Algorithm Analysis • What is an algorithm • How to analyze algorithms • Time Complexity • Space Complexity • Asymptotic Analysis: Big-oh

  3. 1.1 Algorithm 1) Definition • An algorithm is a finite set of simple instructions to be followed to solve a problem. • An algorithm takes the input to a problem (a function) and transforms it into the output. (A mapping of input to output.) • A problem can have many algorithms. • A computer program is an instance, or concrete representation, of an algorithm in some programming language.

  4. 2) Algorithm Properties • An algorithm possesses the following properties: • It must be correct. • It must be composed of a series of concrete steps. • There can be no ambiguity as to which step will be performed next. • It must be composed of a finite number of steps. • It must terminate.

  5. Criteria • Input: zero or more inputs • Output: at least one output • Definiteness: each instruction is clear and unambiguous • Finiteness: the algorithm terminates after a finite number of steps • Feasibility: it must be possible to perform each instruction.

  6. 1.2 Algorithm Analysis • There are often many approaches (algorithms) to solve a problem. How do we choose between them? • At the heart of computer program design are two (sometimes conflicting) goals. (1) To design an algorithm that is easy to understand, code, and debug —— the concern of Software Engineering. (2) To design an algorithm that makes efficient use of the computer’s resources —— the concern of data structures and algorithm analysis.

  7. When goal (2) is important, how do we measure an algorithm’s cost? • Performance Measurement (machine dependent) • Empirical comparison (run programs) • Performance Analysis (machine independent) • Time complexity: running time • Space complexity: storage requirement

  8. 1.2.1 Time Complexity 1) Time Complexity • Running time analysis • Factors affecting running time: compile time, machine speed, compiler, … • Ignoring the machine-dependent factors: Running time = Σ (steps per execution × frequency of execution) • For most algorithms, running time depends on the “size” of the input. Running time is expressed as T(n) for some function T on input size n —— Time Complexity

  9. 2) Computing program steps • A program step is a syntactically or semantically meaningful program segment whose execution time is independent of the instance characteristics. • Methods to compute the step count • Introduce a variable count into the program • Tabular method • Determine the total number of steps contributed by each statement: steps per execution × frequency • Add up the contributions of all statements

  10. Example: Iterative function for summing a list of numbers

  float sum(float list[], int n) {
      int i;
      float tempsum = 0;
      for (i = 0; i < n; i++)
          tempsum += list[i];
      return tempsum;
  }

  11. Example: the same program with count statements added (total: 2n + 3 steps)

  float sum(float list[], int n) {
      int i;
      float tempsum = 0;
      count++;          /* for assignment */
      for (i = 0; i < n; i++) {
          count++;      /* for the for loop */
          tempsum += list[i];
          count++;      /* for assignment */
      }
      count++;          /* last execution of for */
      count++;          /* for return */
      return tempsum;
  }

  12. Example: compute the program steps with the Tabular Method

  Statement                        Steps/execution   Frequency   Total steps
  float sum(float list[], int n)         0               0            0
    float tempsum = 0;                   1               1            1
    for (i = 0; i < n; i++)              1             n + 1        n + 1
      tempsum += list[i];                1               n            n
    return tempsum;                      1               1            1
  Total                                                            2n + 3

  13. 1.2.2 Asymptotic Time Complexity 1) Some mathematical definitions • T(n) = O(f(n)) if there exist positive constants c and n0 such that T(n) ≤ c·f(n) when n ≥ n0. (the Asymptotic Upper Bound) • T(n) = Ω(g(n)) if there exist positive constants c and n0 such that T(n) ≥ c·g(n) when n ≥ n0. (the Asymptotic Lower Bound) • T(n) = Θ(k(n)) if and only if T(n) = O(k(n)) and T(n) = Ω(k(n)). (the Asymptotic Tight Bound)

  14. Big-Oh Defined T(n) = O(f(n)) if there are constants c and n0 such that T(n) ≤ c·f(n) when n ≥ n0 • c·f(n) is an upper bound for T(n) [Figure: the curve c·f(n) stays above T(n) for all n ≥ n0]

  15. 2) Asymptotic Analysis • When analyzing an algorithm, we do not care about the behavior of each statement • We focus our analysis on the part of the algorithm where the greatest amount of its time is spent——critical section. • A critical section has the following characteristics: • It is an operation central to the functioning of the algorithm, and its behavior typifies the overall behavior of the algorithm • It is contained inside the most deeply nested loops of the algorithm and is executed as often as any other section of the algorithm.

  16. The critical section can be said to be at the "heart" of the algorithm • We can characterize the overall efficiency of an algorithm by counting how many times this critical section is executed as a function of the problem size

  17. Asymptotic Analysis: Big-oh • Ignoring constants in T(n) • Analyzing T(n) as n "gets large" Example: the big-oh (O) notation

  18. General Rules • Strategy for analysis • analyze from inside out • analyze function calls first • if recursion behaves like a for-loop, analysis is trivial; otherwise, use a recurrence relation to solve it • For paratactic (consecutive) statements T(n, m) = T1(n) + T2(m) = O(max(f(n), g(m))) • For nested statements T(n, m) = T1(n) × T2(m) = O(f(n) × g(m))

  19. Examples • Example 1: a = b; This assignment takes constant time, so it is O(1). • Example 2: sum = 0; for (i = 1; i <= n; i++) sum += n; O(n) • Example 3: for (i = 0; i < n; i++) for (j = 0; j <= i; j++) sum++; The inner statement executes 1 + 2 + … + n = n(n+1)/2 times, so this is O(n²).

  20. Example 4: sum = 0; for (i = 1; i <= n; i++) for (j = 1; j <= n; j++) sum++; for (i = 0; i < n; i++) x = sum + i; max(O(n²), O(n)) = O(n²)

  21. Comparison of Growth Rates • O(1): constant • O(n): linear • O(n²): quadratic • O(n³): cubic • O(2ⁿ): exponential • O(log n) • O(n log n) O(c) < O(log₂n) < O(n) < O(n log₂n) < O(n²) < O(n³) < O(2ⁿ) < O(3ⁿ) < O(n!)

  22. 1.2.3 Space Complexity 1) Space Requirements • Space for the algorithm itself —— e.g. instruction space, space for simple variables, etc (fixed space) • Space for inputs and outputs • Temporary space requirements —— e.g. recursion stack space (formal parameters, local variables, return address), etc

  23. 2) Space Complexity • Space complexity is measured by the temporary (variable) space requirement. • Space bounds can also be analyzed with asymptotic complexity analysis: S(n) = O(f(n))

  24. Space/Time Tradeoff • One can often reduce time if one is willing to sacrifice space, or vice versa. • Space/Time Tradeoff Principle: spending extra space (e.g. on a precomputed table) can make a program run faster; conversely, saving space usually costs extra computation time.

  25. Thank You!
