
Iterative and Recursive Algorithms
Review: Asymptotic Notation
• Upper Bound Notation:
– f(n) is O(g(n)) if there exist positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0
– Formally, O(g(n)) = { f(n) : ∃ positive constants c and n0 such that f(n) ≤ c·g(n) ∀ n ≥ n0 }
• Big O fact:
– A polynomial of degree k is O(n^k)
Review: Asymptotic Notation
• Asymptotic lower bound:
– f(n) is Ω(g(n)) if ∃ positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) ∀ n ≥ n0
• Asymptotic tight bound:
– f(n) is Θ(g(n)) if ∃ positive constants c1, c2, and n0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) ∀ n ≥ n0
– f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) AND f(n) = Ω(g(n))
Other Asymptotic Notations
• A function f(n) is o(g(n)) if ∃ positive constants c and n0 such that f(n) < c·g(n) ∀ n ≥ n0
• A function f(n) is ω(g(n)) if ∃ positive constants c and n0 such that c·g(n) < f(n) ∀ n ≥ n0
• Intuitively:
– o() is like <, ω() is like >, Θ() is like =
– O() is like ≤, Ω() is like ≥
Iterative Algorithms
1. Write an algorithm for linearsearch(a[1..n],start,data) that searches for data in a
starting from position start. Let the function return the position of data if found; -1
otherwise.
2. Write an algorithm for insertmiddle(a[1..n], data,after) that inserts data at position
after+1.
3. Use these algorithms to implement insertafterdata(a[1..n], data1, data2) that
inserts data2 after every occurrence of data1 in a.
• E.g. Input:
– a[1..7]: 45 13 25 13 43 25 13
– data1: 13
– data2: 33
• Output:
– 45 13 33 25 13 33 43 25 13 33

• Time Complexity?
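The three routines above can be sketched in C as follows. This is a minimal sketch using 0-based arrays instead of the slides' a[1..n]; the exact signatures are assumptions, and the caller must provide an array large enough to hold the inserted elements.

```c
#include <assert.h>

/* Return the index of data in a[start..n-1], or -1 if not found. */
int linearsearch(int a[], int n, int start, int data) {
    for (int i = start; i < n; i++)
        if (a[i] == data) return i;
    return -1;
}

/* Insert data at position after+1, shifting a[after+1..n-1] one slot right.
   The array must have room for n+1 elements. */
void insertmiddle(int a[], int n, int data, int after) {
    for (int i = n; i > after + 1; i--)
        a[i] = a[i - 1];
    a[after + 1] = data;
}

/* Insert data2 after every occurrence of data1; returns the new length. */
int insertafterdata(int a[], int n, int data1, int data2) {
    int pos = linearsearch(a, n, 0, data1);
    while (pos != -1) {
        insertmiddle(a, n, data2, pos);
        n++;
        /* resume searching after the freshly inserted data2 */
        pos = linearsearch(a, n, pos + 2, data1);
    }
    return n;
}
```

With the slide's example (a = {45, 13, 25, 13, 43, 25, 13}, data1 = 13, data2 = 33) this produces 45 13 33 25 13 33 43 25 13 33. Each occurrence of data1 triggers an O(n) shift, so the worst case is O(n²).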
Recursive Algorithm: Factorial

int factorial(int n) {
    if (n <= 1) return 1;
    else return n * factorial(n-1);
}

factorial(n) = n · (n-1) · (n-2) · … · 1

T(n) = T(n-1) + d
     = T(n-2) + d + d
     = T(n-3) + d + d + d
     = …
     = T(1) + (n-1)·d
     = c + (n-1)·d
     = O(n)

Cost of each call in the chain:
T(n):   n * factorial(n-1)
T(n-1): (n-1) * factorial(n-2)
T(n-2): (n-2) * factorial(n-3)
…
T(2):   2 * factorial(1)
T(1):   base case
Iterative Algorithm: Factorial

int factorial1(int n) {
    if (n <= 1) return 1;          // O(1)
    else {
        int fact = 1;              // O(1)
        for (int k = 2; k <= n; k++)
            fact *= k;             // loop runs n-1 times: O(n)
        return fact;
    }
}

• Both algorithms are O(n).
Merge Sort
1. Divide the array A[1..n] into two subarrays A[1..m] and A[m+1..n], where m = ceil(n/2).
2. Recursively mergesort the subarrays A[1..m] and A[m+1..n].
3. Merge the newly-sorted subarrays A[1..m] and A[m+1..n] into a single sorted list.
Merge Sort
• The first step is completely trivial: we only need to compute the median index m. We can delegate the second step to recursive calls.
• All the real work is done in the final step; the two sorted subarrays A[1..m] and A[m+1..n] can be merged using a simple linear-time algorithm.
• For simplicity, we separate out the merge step as a subroutine.
Merge Sort
MERGE-SORT(A, first, last)
1. if first < last
       then mid = ((last - first)/2) + first
            MERGE-SORT(A, first, mid)
            MERGE-SORT(A, mid+1, last)
            MERGE(A, first, mid, last)
2. return
Merge Sort
MERGE(A, first, mid, last)
1. Lsize = mid - first + 1
2. Rsize = last - mid
   // temporary arrays L[0..Lsize-1] and R[0..Rsize-1]
3. For (i = 0; i < Lsize; i++)
       L[i] = A[first + i]
   EndFor
4. For (j = 0; j < Rsize; j++)
       R[j] = A[mid + 1 + j]
   EndFor
5. // Merge L[] and R[] back into A[first..last]
   i = j = 0; k = first
6. While (i < Lsize && j < Rsize)
       if (L[i] <= R[j])
           then A[k] = L[i]; i++
           else A[k] = R[j]; j++
       k++
   EndWhile
7. // Copy the remaining elements of L[]
   While (i < Lsize)
       A[k] = L[i]; i++; k++
   EndWhile
8. // Copy the remaining elements of R[]
   While (j < Rsize)
       A[k] = R[j]; j++; k++
   EndWhile
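The MERGE-SORT and MERGE pseudocode translate almost line-for-line into C. This is a sketch assuming 0-based inclusive indices first..last; the temporary arrays are heap-allocated.

```c
#include <assert.h>
#include <stdlib.h>

/* Merge the sorted runs A[first..mid] and A[mid+1..last] back into A. */
static void merge(int A[], int first, int mid, int last) {
    int Lsize = mid - first + 1;
    int Rsize = last - mid;
    int *L = malloc(Lsize * sizeof *L);
    int *R = malloc(Rsize * sizeof *R);
    for (int i = 0; i < Lsize; i++) L[i] = A[first + i];
    for (int j = 0; j < Rsize; j++) R[j] = A[mid + 1 + j];

    int i = 0, j = 0, k = first;          /* write position starts at first */
    while (i < Lsize && j < Rsize)
        A[k++] = (L[i] <= R[j]) ? L[i++] : R[j++];
    while (i < Lsize) A[k++] = L[i++];    /* leftover elements of L[] */
    while (j < Rsize) A[k++] = R[j++];    /* leftover elements of R[] */
    free(L);
    free(R);
}

/* Sort A[first..last] by recursive halving, as in MERGE-SORT. */
void merge_sort(int A[], int first, int last) {
    if (first < last) {
        int mid = (last - first) / 2 + first;
        merge_sort(A, first, mid);
        merge_sort(A, mid + 1, last);
        merge(A, first, mid, last);
    }
}
```

Calling merge_sort(a, 0, n-1) sorts a[0..n-1]; the merge step is linear in the subarray length, matching the analysis that follows.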
Trace for Merge Sort
A[1..11]: 25 19 35 2 73 34 62 89 17 10 23
• Trace? Stack frames?
Merge Sort – Running Time
T(k) = time taken to sort k elements
M(k) = time taken to merge k elements
T(N) = 2 · T(N/2) + M(N)
     = 2 · T(N/2) + constant · N
These N/2 elements are further divided into two halves, so
T(N) = 2 · [2 · T(N/4) + constant · N/2] + constant · N
     = 4 · T(N/4) + 2 · N · constant
     = …
     = 2^k · T(N/2^k) + k · N · constant
The array can be divided at most until one element is left:
N/2^k = 1, i.e. k = log2 N
T(N) = N · T(1) + N · log2 N · constant
     = N + N · log2 N
Therefore, the time complexity is O(N · log2 N).
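The derivation can be sanity-checked by evaluating the recurrence directly. This sketch takes both the constant and T(1) as 1 and assumes N is a power of 2, so the closed form above becomes exactly N + N·log2(N).

```c
#include <assert.h>

/* T(N) = 2T(N/2) + N with T(1) = 1, evaluated directly;
   the derivation above predicts T(N) = N + N*log2(N). */
long T(long N) {
    return (N == 1) ? 1 : 2 * T(N / 2) + N;
}
```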

Recurrences
• The expression

    T(n) = c              if n = 1
    T(n) = 2T(n/2) + cn   if n > 1

is a recurrence.
– Recurrence: an equation that describes a function in terms of its value on smaller inputs
Recurrence Examples

s(n) = 0              if n = 0
s(n) = c + s(n-1)     if n > 0

s(n) = 0              if n = 0
s(n) = n + s(n-1)     if n > 0

T(n) = c              if n = 1
T(n) = 2T(n/2) + c    if n > 1

T(n) = c              if n = 1
T(n) = aT(n/b) + cn   if n > 1
Solving Recurrences
• Iteration method
• Substitution method
• Master method
Solving Recurrences
• "Iteration method":
– Expand the recurrence
– Work some algebra to express as a summation
– Evaluate the summation
• We will show several examples

s(n) = 0 if n = 0; c + s(n-1) if n > 0

• s(n)
= c + s(n-1)
= c + c + s(n-2) = 2c + s(n-2)
= 2c + c + s(n-3) = 3c + s(n-3)
= …
= kc + s(n-k)
s(n) = 0 if n = 0; c + s(n-1) if n > 0

• So far, for n ≥ k, we have
– s(n) = ck + s(n-k)
• What if k = n?
– s(n) = cn + s(0) = cn + 0 = cn
• Thus in general
– s(n) = cn
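As a quick check of s(n) = cn, the recurrence can be evaluated directly. In this sketch the constant c is fixed at 1, so the closed form predicts s(n) = n.

```c
#include <assert.h>

/* s(n) = c + s(n-1), s(0) = 0, with c = 1; closed form predicts s(n) = cn = n. */
long s(long n) {
    return (n == 0) ? 0 : 1 + s(n - 1);
}
```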
s(n) = 0 if n = 0; n + s(n-1) if n > 0

• s(n)
= n + s(n-1)
= n + (n-1) + s(n-2)
= n + (n-1) + (n-2) + s(n-3)
= n + (n-1) + (n-2) + (n-3) + s(n-4)
= …
= n + (n-1) + (n-2) + … + (n-(k-1)) + s(n-k)
= Σ_{i = n-k+1}^{n} i + s(n-k)
s(n) = 0 if n = 0; n + s(n-1) if n > 0

• So far, for n ≥ k, we have
– s(n) = Σ_{i = n-k+1}^{n} i + s(n-k)
• What if k = n?
– s(n) = Σ_{i=1}^{n} i + s(0) = Σ_{i=1}^{n} i + 0 = n(n+1)/2
• Thus in general
– s(n) = n(n+1)/2
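A direct translation of the recurrence confirms the closed form n(n+1)/2 (a sketch):

```c
#include <assert.h>

/* s(n) = n + s(n-1), s(0) = 0; the closed form above predicts s(n) = n(n+1)/2. */
long s(long n) {
    return (n == 0) ? 0 : n + s(n - 1);
}
```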
T(n) = c if n = 1; 2T(n/2) + c if n > 1

• T(n)
= 2T(n/2) + c
= 2(2T(n/2/2) + c) + c
= 2^2 T(n/2^2) + 2c + c
= 2^2 (2T(n/2^2/2) + c) + 3c
= 2^3 T(n/2^3) + 4c + 3c
= 2^3 T(n/2^3) + 7c
= 2^3 (2T(n/2^3/2) + c) + 7c
= 2^4 T(n/2^4) + 15c
= …
= 2^k T(n/2^k) + (2^k - 1)c
T(n) = c if n = 1; 2T(n/2) + c if n > 1

• So far, for n ≥ 2^k, we have
– T(n) = 2^k T(n/2^k) + (2^k - 1)c
• What if k = lg n?
– T(n) = 2^(lg n) T(n/2^(lg n)) + (2^(lg n) - 1)c
= n T(n/n) + (n - 1)c
= n T(1) + (n - 1)c
= nc + (n - 1)c = (2n - 1)c
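Evaluating the recurrence directly confirms the (2n - 1)c result. This sketch takes c = 1 and assumes n is a power of 2, so it predicts T(n) = 2n - 1.

```c
#include <assert.h>

/* T(n) = 2T(n/2) + c with T(1) = c, here c = 1, n a power of 2;
   the expansion above predicts T(n) = (2n - 1)c. */
long T(long n) {
    return (n == 1) ? 1 : 2 * T(n / 2) + 1;
}
```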
Solving Recurrences
• The substitution method (CLR 4.1)
– A.k.a. the "making a good guess" method
– Guess the form of the answer, then use induction to find the constants and show that the solution works
– Examples:
• T(n) = 2T(n/2) + Θ(n) → T(n) = Θ(n lg n)
• T(n) = 2T(n/2) + n → T(n) = Θ(n lg n)
• T(n) = 2T(n/2 + 17) + n → Θ(n lg n)
The Master Theorem
• The Master Theorem gives us a cookbook for the running time of recurrences of the form T(n) = aT(n/b) + f(n):

– Case 1: if f(n) = O(n^(log_b a - ε)) for some constant ε > 0,
  then T(n) = Θ(n^(log_b a))
– Case 2: if f(n) = Θ(n^(log_b a)),
  then T(n) = Θ(n^(log_b a) · lg n)
– Case 3: if f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0,
  AND a·f(n/b) ≤ c·f(n) for some c < 1 and all sufficiently large n,
  then T(n) = Θ(f(n))
Using The Master Method
• T(n) = 9T(n/3) + n
– a = 9, b = 3, f(n) = n
– n^(log_b a) = n^(log_3 9) = Θ(n^2)
– Since f(n) = O(n^(log_3 9 - ε)), where ε = 1, case 1 applies:
  T(n) = Θ(n^(log_b a)) when f(n) = O(n^(log_b a - ε))
– Thus the solution is T(n) = Θ(n^2)
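The Θ(n^2) prediction can be checked numerically. In this sketch T(1) = 1 and n is a power of 3; since T(n) grows like n^2, the ratio T(3n)/T(n) should approach 9.

```c
#include <assert.h>

/* T(n) = 9T(n/3) + n, T(1) = 1, evaluated directly for n a power of 3. */
long T(long n) {
    return (n == 1) ? 1 : 9 * T(n / 3) + n;
}
```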
Summary
Time complexity analysis for
• Iterative algorithms
• Recursive algorithms
Review Questions
Consider the following pseudo-code.
    x = 0;
    for J = 1 to n
        for K = J+1 to 3*n
            x = x + 1;
Let T(n) be the total number of times the innermost statement (increment x) is executed. Derive the EXACT value of T(n). Then express the result in O( ) form.

Consider the following recurrence relation, where n is a power of 2.
    T(n) ≤ 0                 when n = 1
    T(n) ≤ 2T(n/2) + log n   when n > 1
Prove by induction that T(n) ≤ An + B log n + C and determine the constants A, B, C.
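For the first question, the derived count can be cross-checked by brute force (a sketch, not the required derivation): the inner loop runs 3n - J times for each J, which gives the closed form used in the comment below.

```c
#include <assert.h>

/* Brute-force count of how many times x = x + 1 runs in the nested loops;
   each J contributes 3n - J iterations, so T(n) = 3n^2 - n(n+1)/2 = O(n^2). */
long T(int n) {
    long x = 0;
    for (int J = 1; J <= n; J++)
        for (int K = J + 1; K <= 3 * n; K++)
            x = x + 1;
    return x;
}
```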
