Analysis of Algorithms Chapter - 01 Introduction
This Chapter Contains the following Topics: • Syllabus & Marks Distribution • Study of Algorithms • Algorithm • Pseudocode Conventions • Recursive Algorithms • Sorting Problem • Growth of Functions • Objective • Example of Function Growth • Complexity of Algorithm • Asymptotic Notations
Syllabus & Marks Distribution
Tentative Syllabus & Text Books • Introduction, Asymptotic Notations • Recurrences • Sorting Algorithms • Divide & Conquer • Binary Search Tree • Dynamic Programming • Greedy Graph Algorithms • Minimum Spanning Tree • Data Compression • Binomial Heap • Fibonacci Heap Text Books • Introduction to Algorithms, 2nd edition, by Cormen, Leiserson, Rivest, and Stein. • Computer Algorithms: Introduction to Design and Analysis, by Sara Baase, 3rd edition, Addison-Wesley Publishing Company
Grading Note: This is only a guide; percentages and rules may be changed during the semester as needed. Percentages: The final grade will be composed of the following four components: ⋆ 50%: Final exam. ⋆ 20%: First exam. ⋆ 20%: Second exam. ⋆ 10%: Assignments and quizzes.
Study of Algorithms
Algorithm • The word ‘algorithm’ comes from the name of a Persian author, Abu Ja’far Mohammad Ibn Musa Al Khowarizmi (825 AD). • Definition: An algorithm is a finite set of instructions that, if followed, carries out a particular task. In addition, all algorithms must satisfy the following criteria: • Input: Zero or more quantities are externally supplied. • Output: At least one quantity is produced. • Definiteness: Each instruction is clear and unambiguous. • Finiteness: If we trace out the instructions of an algorithm, then for all cases, the algorithm terminates after a finite number of steps. • Effectiveness: Every instruction must be very basic so that it can be carried out, in principle, by a person using only pen and paper. It must be feasible.
Study of Algorithms • The study of algorithms includes many important and active areas of research. • Given a problem, how do we find an efficient algorithm for its solution? • Once we have found an algorithm, how can we compare this algorithm with other algorithms that solve the same problem? • How should we judge the goodness of an algorithm? • About the study of algorithms, most researchers agree that there are four distinct areas of study one can identify. • How to devise algorithms? • How to validate algorithms? • How to test a program? • How to analyze algorithms?
Study of Algorithms (Contd.) • How to devise algorithms? • Creating an algorithm is an art which may never be fully automated. • How to validate algorithms? • Once an algorithm is devised, it is necessary to show that it computes the correct answer for all possible legal inputs. • A program can be written and then verified. • How to test a program? • Debugging is the process of executing programs on sample data sets to determine whether faulty results occur and, if so, to correct them. • Profiling is the process of executing a correct program on data sets and measuring the time and space it takes to compute the results.
Study of Algorithms (Contd.) • How to analyze algorithms? • Analysis of algorithms refers to the task of determining how much computing time and storage an algorithm requires. • It allows you to make quantitative judgments about the value of one algorithm over another. • It allows you to predict whether the software will meet any efficiency constraints that exist. • We ask how well an algorithm performs in the best, worst, and average cases. • Theoretical analysis: • Considers all possible inputs. • Independent of the hardware/software implementation. • Experimental study: • Considers some typical inputs. • Depends on the hardware/software implementation. • Requires a real program.
Pseudocode Conventions • Comments begin with // and continue until the end of line. • Blocks are indicated with matching braces: { and }. • An identifier begins with a letter: max. • Assignment of values to variables is done using the assignment statement: variable := expression. • Logical operators and, or, and not are provided. • Relational operators <, ≤, =, ≠, ≥, and > are provided. • Elements of arrays are accessed using [ and ]. • While loop: while (condition) do { (statement 1) ……………… (statement n) } • For loop: for variable := value1 to value2 step step do { (statement 1) ……………… (statement n) }
Pseudocode Conventions (Contd.) • Repeat-until loop: repeat { (statement 1) ………… (statement n) } until (condition) • Conditional statement: • if (condition) then (statement) • if (condition) then (statement 1) else (statement 2) • Case statement: case { : (condition 1): (statement 1) …………………. : (condition n): (statement n) : else: (statement n+1) } • Input and output are done using: read and write. • There is only one type of procedure: Algorithm. • An algorithm consists of a heading and a body. • The heading of an algorithm takes the form Algorithm Name ((parameter list))
Recursive Algorithms • A recursive function is a function that is defined in terms of itself. • Similarly, an algorithm is said to be recursive if the same algorithm is invoked in its body. • An algorithm that calls itself is direct recursive. • Algorithm A is said to be indirect recursive if it calls another algorithm which in turn calls A. Algorithm TowersOfHanoi (n, x, y, z) // Move the top n disks from tower x to tower y. { if (n ≥ 1) then { TowersOfHanoi (n-1, x, z, y); write (“Move top disk from tower”, x, “to top of tower”, y); TowersOfHanoi (n-1, z, y, x); } } • Illustrate the above algorithm with an example.
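The pseudocode above translates directly to a runnable sketch. Function and variable names here are mine, and moves are collected in a list rather than written out:

```python
def towers_of_hanoi(n, x, y, z, moves):
    # Move the top n disks from tower x to tower y, using tower z as auxiliary.
    if n >= 1:
        towers_of_hanoi(n - 1, x, z, y, moves)   # move n-1 disks out of the way
        moves.append(f"Move top disk from tower {x} to top of tower {y}")
        towers_of_hanoi(n - 1, z, y, x, moves)   # move them onto the target tower

moves = []
towers_of_hanoi(3, 'A', 'B', 'C', moves)
# For n disks the algorithm makes 2^n - 1 moves; here 2^3 - 1 = 7.
```

For n = 3 the first and last moves both take the top disk from A to B.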
Sorting Problem • Input: A sequence of n numbers (a1, a2, …, an). • Output: A permutation (reordering) (a'1, a'2, …, a'n) of the input sequence such that a'1 ≤ a'2 ≤ … ≤ a'n. • Solve the problem using insertion sort: insert each element into an already sorted prefix so that the prefix remains sorted. Algorithm InsertionSort (A, n) { for i:=2 to n do { key:=A[i]; // Insert A[i] into the sorted sequence A[1…i-1]. j:=i-1; while ( (j>0) and (A[j]>key) ) do { A[j+1]:=A[j]; j:=j-1; } A[j+1]:=key; } } • Illustrate the above algorithm with an example.
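As a minimal runnable sketch (names mine, indices shifted from the pseudocode's 1-based to Python's 0-based convention):

```python
def insertion_sort(a):
    # 0-based translation of InsertionSort(A, n); sorts the list a in place.
    for i in range(1, len(a)):
        key = a[i]           # insert a[i] into the sorted prefix a[0..i-1]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]  # shift larger elements one slot to the right
            j -= 1
        a[j + 1] = key
    return a

insertion_sort([5, 2, 4, 6, 1, 3])  # → [1, 2, 3, 4, 5, 6]
```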
Loop Invariant and Correctness • At the start of each iteration of the ‘for loop’, the subarray A[1…i-1] consists of the elements originally in A[1…i-1], but in sorted order. • We use loop invariants to help us understand why an algorithm is correct. We must show three things about a loop invariant: • Initialization: It is true prior to the first iteration of the loop. • Maintenance: If it is true before an iteration of the loop, it remains true before the next iteration. • Termination: When the loop terminates, the invariant gives us a useful property that helps show that the algorithm is correct. • Let us see how these properties hold for insertion sort: • Initialization: Before the first loop iteration, when i = 2, the subarray A[1…i-1] consists of the single element A[1], which is trivially sorted. • Maintenance: The body of the outer ‘for loop’ works by moving A[i-1], A[i-2], A[i-3], and so on one position to the right until the proper position for A[i] is found. • Termination: When i = n+1, the subarray A[1…n] consists of the elements originally in A[1…n], but in sorted order.
Analysis of Insertion Sort Let t_i be the number of times the while-loop test is executed for a given value of i. Then T(n) = c1·n + c2(n−1) + c3(n−1) + c4·Σ_{i=2..n} t_i + c5·Σ_{i=2..n} (t_i − 1) + c6·Σ_{i=2..n} (t_i − 1) + c7(n−1) Best Case: t_i = 1 for i = 2, 3, …, n. T(n) = c1·n + c2(n−1) + c3(n−1) + c4(n−1) + c7(n−1) = (c1+c2+c3+c4+c7)n − (c2+c3+c4+c7) So T(n) = an + b, a linear function of n. Worst Case: t_i = i for i = 2, 3, …, n. Using Σ_{i=2..n} i = n(n+1)/2 − 1 and Σ_{i=2..n} (i−1) = n(n−1)/2: T(n) = c1·n + c2(n−1) + c3(n−1) + c4(n(n+1)/2 − 1) + c5(n(n−1)/2) + c6(n(n−1)/2) + c7(n−1) = (c4/2 + c5/2 + c6/2)n² + (c1+c2+c3+c4/2−c5/2−c6/2+c7)n − (c2+c3+c4+c7) So T(n) = an² + bn + c, a quadratic function of n.
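One way to see the best/worst-case gap concretely is to count the inner-loop shifts, which correspond to the Σ(t_i − 1) terms in the formula. This instrumented version is my own sketch (names mine), not part of the original slide:

```python
def insertion_sort_shifts(a):
    # Insertion sort instrumented to count inner-loop shifts,
    # i.e. the sum of (t_i - 1) over i = 2..n in the analysis above.
    shifts = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
            shifts += 1
        a[j + 1] = key
    return shifts

n = 10
best = insertion_sort_shifts(list(range(n)))          # sorted input: 0 shifts
worst = insertion_sort_shifts(list(range(n, 0, -1)))  # reversed: n(n-1)/2 = 45
```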
Growth of Functions
Objective • Objective: A language to express that algorithm A is better than, worse than, or equivalent to algorithm B. • We need to define a “≤” relation between functions that measures the growth of functions. • Independence of the hardware/software environment: Turing machines, RAM machines, the classroom model, today’s computers, and future super-computers. • Ignore constants that can be affected by changing the environment.
Examples of Function Growth • The maximum size of a problem that can be solved in one second, one minute, and one hour, for various running times (shown in a table on the original slide, not reproduced here). • The increase in the maximum size of a problem when using a computer that is 256 times faster than the previous one. • Each entry is given as a function of m, the previous maximum problem size.
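Although the original table is not reproduced here, its entries can be recomputed: if a machine becomes 256 times faster, the new maximum size n satisfies T(n) ≤ 256·T(m). A small search sketch (function name mine, assuming T is an exact step count):

```python
def new_max_size(T, m, speedup=256):
    # Largest n with T(n) <= speedup * T(m), found by linear search upward.
    budget = speedup * T(m)
    n = m
    while T(n + 1) <= budget:
        n += 1
    return n

m = 20
linear = new_max_size(lambda n: n, m)            # T(n) = n:   256m = 5120
quadratic = new_max_size(lambda n: n * n, m)     # T(n) = n²:  16m  = 320
exponential = new_max_size(lambda n: 2 ** n, m)  # T(n) = 2ⁿ:  m+8  = 28
```

For linear time a 256× machine handles 256m; for quadratic time only √256 = 16 times more; for exponential time only m + log₂256 = m + 8.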
Complexity of Algorithm • How many resources does the algorithm require? • Usually: time and space (memory). • Complexity: a function of the input length n. • Usually an integer n > 0. • Usually a monotonic non-decreasing function T(n). • The limiting behavior of the complexity as the input size increases is called the asymptotic complexity of the algorithm. Worst Case and Average Case Complexity • T(n) is a worst-case complexity: • If for all inputs of length n the complexity is at most T(n). • T(n) is an average-case complexity: • If the average complexity over all inputs of length n is T(n). • Averaging follows some distribution of the inputs. • Usually the uniform distribution.
Asymptotic Notations • Big-Oh: f(n) = O(g(n)) if f(n) is asymptotically less than or equal to g(n). • Big-Omega: f(n) = Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n). • Big-Theta: f(n) = Θ(g(n)) if f(n) is asymptotically equal to g(n). • Little-oh: f(n) = o(g(n)) if f(n) is asymptotically strictly less than g(n). • Little-omega: f(n) = ω(g(n)) if f(n) is asymptotically strictly greater than g(n).
Big-Oh, Big-Omega, Big-Theta • Big-Oh: f(n) = O(g(n)) if • there exists a real constant c > 0 and an integer constant n0 > 0 such that f(n) ≤ c·g(n) for every integer n ≥ n0. • Big-Omega: f(n) = Ω(g(n)) if • there exists a real constant c > 0 and an integer constant n0 > 0 such that f(n) ≥ c·g(n) for every integer n ≥ n0. • Big-Theta: f(n) = Θ(g(n)) if • there exist two real constants c′, c′′ > 0 and an integer constant n0 > 0 such that c′·g(n) ≤ f(n) ≤ c′′·g(n) for every integer n ≥ n0.
Examples • 3n + 2 = O(n), since 3n + 2 ≤ 4n for all n ≥ 2. • 100n + 6 = O(n), since 100n + 6 ≤ 101n for all n ≥ 6. • 10n² + 4n + 2 = O(n²), since 10n² + 4n + 2 ≤ 11n² for all n ≥ 5. • 1000n² + 100n − 6 = O(n²), since 1000n² + 100n − 6 ≤ 1001n² for all n ≥ 100. • 6·2ⁿ + n² = O(2ⁿ), since 6·2ⁿ + n² ≤ 7·2ⁿ for all n ≥ 4. • 3n + 2 = O(n²), since 3n + 2 ≤ 3n² for all n ≥ 2. • But 3n + 2 ≠ O(1) and 10n² + 4n + 2 ≠ O(n). • 3n + 2 = Ω(n), since 3n + 2 ≥ 3n for all n ≥ 1. • 10n² + 4n + 2 = Ω(n²), since 10n² + 4n + 2 ≥ n² for all n ≥ 1. • 6·2ⁿ + n² = Ω(2ⁿ), since 6·2ⁿ + n² ≥ 2ⁿ for all n ≥ 1. • 3n + 2 = Θ(n), since 3n ≤ 3n + 2 ≤ 4n for all n ≥ 2. • 10n² + 4n + 2 = Θ(n²). • 6·2ⁿ + n² = Θ(2ⁿ).
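The witnesses c and n0 above can be sanity-checked numerically. This is a finite spot check over a range of n, not a proof; the proofs are the algebraic inequalities on the slide:

```python
# Each triple is (f, c*g, n0) for a claimed f(n) = O(g(n)) witness.
checks = [
    (lambda n: 3*n + 2,          lambda n: 4*n,     2),   # 3n + 2 = O(n)
    (lambda n: 10*n*n + 4*n + 2, lambda n: 11*n*n,  5),   # 10n² + 4n + 2 = O(n²)
    (lambda n: 6*2**n + n*n,     lambda n: 7*2**n,  4),   # 6·2ⁿ + n² = O(2ⁿ)
]
# Verify f(n) <= c*g(n) for the first 100 values of n at or beyond n0.
all_ok = all(f(n) <= cg(n)
             for f, cg, n0 in checks
             for n in range(n0, n0 + 100))
```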
Little-oh and Little-omega • f(n) = o(g(n)) if lim_{n→∞} f(n)/g(n) = 0: • for every constant c > 0 there exists an integer constant n0 > 0 such that f(n) < c·g(n) for every integer n ≥ n0. • f(n) = ω(g(n)) if lim_{n→∞} f(n)/g(n) = ∞: • for every constant c > 0 there exists an integer constant n0 > 0 such that f(n) > c·g(n) for every integer n ≥ n0.
Algorithm Analysis: Example • Alg.: MIN (a[1], …, a[n]) m ← a[1]; for i ← 2 to n if a[i] < m then m ← a[i]; • Running time: • the number of primitive operations (steps) executed before termination T(n) = 1 [initialization] + n [for-loop tests] + (n-1) [if comparisons] + (n-1) [assignments in the then-branch, worst case] = 3n - 1 • Order (rate) of growth: • The leading term of the formula • Expresses the asymptotic behavior of the algorithm
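A runnable 0-based translation of MIN (function name mine), with the slide's step counts noted as comments:

```python
def find_min(a):
    # Direct translation of MIN(a[1], ..., a[n]) to 0-based indexing.
    m = a[0]                      # m <- a[1]: 1 step
    for i in range(1, len(a)):    # loop control: n steps in total
        if a[i] < m:              # comparison: n - 1 steps
            m = a[i]              # at most n - 1 assignments (worst case)
    return m

find_min([7, 3, 9, 1, 4])  # → 1
```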
Extra Examples • 2n² = O(n³): need 2n² ≤ cn³, i.e. 2 ≤ cn; take c = 1 and n0 = 2. • n² = O(n²): need n² ≤ cn², i.e. c ≥ 1; take c = 1 and n0 = 1. • n = O(n²): need n ≤ cn², i.e. cn ≥ 1; take c = 1 and n0 = 1.
Examples • 100n + 5 ≠ Ω(n²) Suppose, to the contrary, that 100n + 5 = Ω(n²); then there exist constants c > 0 and n0 > 0 such that cn² ≤ 100n + 5 for all n ≥ n0. Trying n0 = 100 gives c × 10000 ≤ 100 × 100 + 5, i.e. c ≤ 10005/10000 = 1.0005. But does c = 1.0005 work for all n ≥ n0? Take n = 200 (remember, the inequality must hold for all n ≥ n0; here n = 200 and n0 = 100): cn² = 1.0005 × 40000 = 40020, while 100 × 200 + 5 = 20005, so the inequality fails. In fact, for any fixed c > 0, cn² eventually outgrows 100n + 5, so no choice of c and n0 works. Hence 100n + 5 ≠ Ω(n²).
Examples • Prove that 100n + 5 = O(n²). • 100n + 5 ≤ 100n + n = 101n ≤ 101n² for all n ≥ 5, so n0 = 5 and c = 101 is a solution. • 100n + 5 ≤ 100n + 5n = 105n ≤ 105n² for all n ≥ 1, so n0 = 1 and c = 105 is also a solution. • We must find SOME constants c and n0 that satisfy the asymptotic-notation relation.
Nested Dependent Loop for i := 1 to n do for j := i to n do sum := sum + 1 • For each i, the inner loop runs n − i + 1 times, so the statement sum := sum + 1 executes Σ_{i=1..n} (n − i + 1) = n(n+1)/2 times, which is Θ(n²).
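Counting the increments directly confirms the n(n+1)/2 total; a small sketch (function name mine):

```python
def nested_dependent_loop(n):
    # Returns how many times 'sum := sum + 1' executes in the nested loop.
    total = 0
    for i in range(1, n + 1):
        for j in range(i, n + 1):   # inner loop runs n - i + 1 times
            total += 1
    return total

nested_dependent_loop(10)  # → 55, i.e. 10·11/2
```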