Algorithms: Complexity of Recursive Algorithms


Algorithms

complexity of recursive algorithms

Jiří Vyskočil, Marko Genyk-Berezovskyj


2010-2014
Recurrences

 A recurrence is an equation or inequality that describes a function in
terms of its value on smaller inputs. For example (here the recurrence
describing merge sort):

T(n) = Θ(1)              for n = 1
T(n) = 2T(⌊n/2⌋) + Θ(n)  for n > 1

where T(n) is the overall complexity of the algorithm and the
complexities for the different ranges of n are listed on the right-hand
sides.
 Marginal cases (n < constant) can be neglected, because the
complexity of an algorithm in such cases is also constant. Often it
also happens that rounding the values does not change the result.
The given recurrence can therefore be simplified to:

T(n) = 2T(n/2) + Θ(n)
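As a concrete illustration (a minimal Python sketch, not part of the original
slides, assuming merge sort as the example algorithm), the running time of the
procedure below follows this simplified recurrence: two recursive calls on
halves plus linear work for merging.

    # A minimal sketch (assumed example): merge sort, whose running time
    # follows T(n) = 2*T(n/2) + Theta(n).
    def merge_sort(a):
        if len(a) <= 1:                       # marginal case: constant work
            return a
        mid = len(a) // 2
        left = merge_sort(a[:mid])            # T(n/2)
        right = merge_sort(a[mid:])           # T(n/2)
        merged, i, j = [], 0, 0               # merging is the Theta(n) term
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        return merged + left[i:] + right[j:]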

Simplifying recurrences
 We want to obtain a formula with no recurrence.
 Example: T(n) = Θ(log(n))

 Methods:
 Substitution method
 First, guess the solution and then prove its correctness by induction.
 Recursion-tree method
 Evaluate characteristics of the recursion tree.
 Use the "cookbook" - Master theorem
 Some common forms of recurrences are solved in general by the
Master theorem; we only have to evaluate the conditions of the
theorem.

Substitution method
 Two-step solution:
1. Guess the exact form of the solution.
Typically, one would precompute a set of numerical
results for different inputs n and derive a formula from them
(see the sketch below).
2. Prove the correctness of the guess.
 Mathematical induction is the tool of choice in most
cases.

 The method is very effective provided we can complete
step 1 correctly. On the other hand, there are no general
rules for how to do it.
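A possible shape of step 1 (a hedged sketch, not part of the original slides):
tabulate a recurrence, here T(n) = T(⌊n/2⌋) + 1, for several inputs and compare
the values with candidate formulas; the table suggests the guess
T(n) = Θ(log(n)) mentioned earlier.

    # Hypothetical step-1 sketch: tabulate T(n) = T(n//2) + 1 with T(1) = 1
    # and compare with log2(n); the values suggest T(n) = Theta(log n).
    import math
    from functools import lru_cache

    @lru_cache(maxsize=None)
    def T(n: int) -> int:
        return 1 if n <= 1 else T(n // 2) + 1

    for n in (2**k for k in range(1, 11)):
        print(n, T(n), math.floor(math.log2(n)) + 1)   # the columns agree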

Substitution method - Example
 Example:
T (n) = 2T (n/2) + n
 Suppose we come (somehow) to a guess:
T (n) = O(n log(n))
 Using the definition of upper bound "O" we want to prove:
T (n) ≤ cn log (n)
for some suitable c > 0.
 State an induction hypothesis (i.e. let the guess be true
for n/2 ): T (n/2) ≤ c (n/2) log (n/2)

Substitution method - Example
 Substitute the hypothesis into the original recurrence
T(n) = 2T(n/2) + n and complete the proof in the standard manner, as
you would in any other induction proof.
T(n) ≤ 2(c (n/2) log(n/2)) + n    // due to the induction hypothesis
     = cn log(n/2) + n
     = cn log(n) − cn log(2) + n
     = cn log(n) − cn + n         // log is base 2, so log(2) = 1
     ≤ cn log(n)

The last inequality holds for c ≥ 1.


 The induction base case holds trivially: all that is needed is to show
that the bound holds for some values of n0 and c > 0.
Choose for example n0 = 3 and c ≥ 2.
 The proof is complete; the substitution method has yielded the result.
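A quick numerical sanity check of the result (an assumed sketch, not part of
the original proof): evaluate the recurrence with T(1) = 1 and verify the
bound with the constants chosen above (c = 2, n0 = 3).

    # Hypothetical check: T(n) = 2*T(n//2) + n with T(1) = 1 stays below
    # c*n*log2(n) for c = 2 and every tested n >= 3 (the chosen n0).
    import math
    from functools import lru_cache

    @lru_cache(maxsize=None)
    def T(n: int) -> int:
        return 1 if n <= 1 else 2 * T(n // 2) + n

    assert all(T(n) <= 2 * n * math.log2(n) for n in range(3, 5000))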

Recursion-tree method - Example
 Recurrence:
T(n) = 3T(n/4) + cn²
 Iteratively build more and more complete recursion trees, expanding the
recursive calls level by level.

 A recursion tree visualizes the recursive process: a node represents
a subproblem and is labeled by its complexity. The sum of all labels
must be equal to the complexity T(n) specified in the given recurrence.
Recursion-tree method - Example
 The resulting tree has depth log_4(n): the root is labeled cn², each of
its 3 children is labeled c(n/4)², and in general the labels at depth i
sum to (3/16)^i · cn², while the leaves at the last level contribute
Θ(n^(log_4(3))) in total.
By adding the sums over all levels we obtain the resulting complexity: O(n²)
Recursion-tree method - Example
 Adding the sums in particular depths might be done as follows:

T(n) = cn² + (3/16)·cn² + (3/16)²·cn² + ... + (3/16)^(log_4(n)−1)·cn² + Θ(n^(log_4(3)))
     < cn² · (1 + 3/16 + (3/16)² + ...) + Θ(n^(log_4(3)))
     = cn² · 1/(1 − 3/16) + Θ(n^(log_4(3)))
     = (16/13)·cn² + Θ(n^(log_4(3))) ∈ O(n²)

using the formula 1 + x + x² + ... = 1/(1 − x) for |x| < 1.
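A small numeric illustration (assumed, not from the slides) of the same
summation: the per-level sums shrink geometrically and their total stays close
to (16/13)·cn².

    # Hypothetical illustration of the level sums for T(n) = 3*T(n/4) + c*n^2,
    # with c = 1 and n = 4**6; depth i contributes (3/16)^i * c*n^2.
    n, c = 4**6, 1.0
    total, size, count = 0.0, n, 1
    while size >= 1:
        level_sum = count * c * size**2          # all nodes at this depth
        total += level_sum
        print(f"size={size:>5}  nodes={count:>4}  level sum={level_sum:.1f}")
        size //= 4
        count *= 3
    print(total, (16 / 13) * c * n**2)           # the two numbers are close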

Using the "cookbook"
 The Master theorem is applicable to recurrences of the form:

T(n) = a·T(n/b) + f(n)

where a ≥ 1 and b > 1 are constants and f(n) is an asymptotically
positive function.

 Rounding the term T(n/b) to either T(⌊n/b⌋) or T(⌈n/b⌉) does not
affect the resulting complexity in this case.

Using the "cookbook"
 Master theorem
 Let a ≥ 1, b > 1 be constants, let f(n) be a function and let T(n) be
defined for non-negative integers by the recurrence
T(n) = a·T(n/b) + f(n)
where n/b means ⌊n/b⌋ or ⌈n/b⌉. Then the following holds.

1. If f(n) ∈ O(n^(log_b(a) − ε)) for some constant ε > 0, then
   T(n) ∈ Θ(n^(log_b(a))).

2. If f(n) ∈ Θ(n^(log_b(a))), then
   T(n) ∈ Θ(n^(log_b(a)) · log(n)).

3. If f(n) ∈ Ω(n^(log_b(a) + ε)) for some constant ε > 0 and if
   a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently big n, then
   T(n) ∈ Θ(f(n)).
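A simplified sketch (assumed, not part of the slides) of how the three cases
can be told apart mechanically when f(n) has the common form n^k · (log n)^p;
the regularity condition of case 3 is not verified here.

    # Hypothetical helper: pick the Master-theorem case for
    # T(n) = a*T(n/b) + f(n) with f(n) = n**k * (log n)**p.
    # p is only used to rule out case 2 when p != 0; case 3's regularity
    # condition (a*f(n/b) <= c*f(n) for some c < 1) is left to the reader.
    import math

    def master_case(a: float, b: float, k: float, p: int = 0) -> str:
        e = math.log(a, b)                       # critical exponent log_b(a)
        if k < e:
            return f"case 1: Theta(n^{e:.3f})"
        if k == e and p == 0:
            return f"case 2: Theta(n^{e:.3f} * log n)"
        if k > e:
            return "case 3: Theta(f(n)), if the regularity condition holds"
        return "the theorem (as stated here) does not apply"

    print(master_case(9, 3, 1))         # Example 1: T(n) = 9T(n/3) + n
    print(master_case(1, 1.5, 0))       # Example 2: T(n) = T(2n/3) + 1
    print(master_case(3, 4, 1, p=1))    # Example 3: T(n) = 3T(n/4) + n log n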

Using the "cookbook" Example 1
 Example 1:
T (n) = 9T(n/3) + n
 The parameters are: a = 9, b = 3, f(n) = n ∈ O(n^(log_3(9) − 1)).
This is case 1 of the Master theorem.
 The resulting complexity is thus:
T(n) ∈ Θ(n^(log_3(9))) = Θ(n²).
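A quick numeric cross-check (an assumed sketch): evaluating the recurrence
with integer division shows the ratio T(n)/n² settling near a constant, which
is consistent with Θ(n²).

    # Hypothetical check of Example 1: T(n) = 9*T(n//3) + n with T(1) = 1.
    from functools import lru_cache

    @lru_cache(maxsize=None)
    def T(n: int) -> int:
        return 1 if n <= 1 else 9 * T(n // 3) + n

    for n in (3**k for k in range(3, 9)):
        print(n, T(n) / n**2)           # the ratio settles near 1.5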

Using the "cookbook" Example 2
 Example 2:
T (n) = T(2n/3) + 1
 The parameters are: a = 1, b = 3/2,
 f(n) = 1 = n^(log_{3/2}(1)) ∈ Θ(n^(log_{3/2}(1))).
This is case 2 of the Master theorem.
 The resulting complexity is thus:
T(n) ∈ Θ(n^(log_{3/2}(1)) · log(n)) = Θ(log(n))
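The same kind of cross-check (assumed sketch) for this recurrence: T(n) grows
together with log_{3/2}(n), consistent with Θ(log(n)).

    # Hypothetical check of Example 2: T(n) = T(2*n//3) + 1 with T(1) = 1.
    import math
    from functools import lru_cache

    @lru_cache(maxsize=None)
    def T(n: int) -> int:
        return 1 if n <= 1 else T(2 * n // 3) + 1

    for n in (10**k for k in range(2, 7)):
        print(n, T(n), math.log(n, 1.5))    # both columns grow at the same rate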

Using the "cookbook" Example 3
 Example 3: T (n) = 3T(n/4) + n log(n)
 The parameters are: a = 3, b = 4,
f(n) = n·log(n) and we know that n^(log_4(3)) = O(n^0.793).
Therefore, we can state: f(n) ∈ Ω(n^(log_4(3) + 0.2)).
To satisfy the conditions of case 3 it must hold, for some constant c < 1
and all sufficiently big n, that a·f(n/b) ≤ c·f(n). It really does:
a·f(n/b) = 3·(n/4)·log(n/4) ≤ (3/4)·n·log(n) = c·f(n), for c = 3/4.
 The resulting complexity is thus:
T (n) ∈ Θ(n log(n))
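A small sketch (assumed) that checks the regularity condition of case 3
numerically for the chosen c = 3/4, and then looks at the ratio T(n)/(n·log(n)).

    # Hypothetical check of Example 3: regularity a*f(n/b) <= c*f(n) for
    # a = 3, b = 4, f(n) = n*log(n), c = 3/4; then the ratio T(n)/(n log n).
    import math
    from functools import lru_cache

    def f(x: float) -> float:
        return x * math.log(x)

    assert all(3 * f(n / 4) <= 0.75 * f(n) for n in range(8, 100_000))

    @lru_cache(maxsize=None)
    def T(n: int) -> float:
        return 1.0 if n <= 1 else 3 * T(n // 4) + n * math.log(n)

    for n in (4**k for k in range(4, 10)):
        print(n, T(n) / (n * math.log(n)))  # stays bounded, consistent with Theta(n log n)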

