Asymptotic notation
• Recursive: a function that calls itself.
• factorial(n) =
      1                      if n = 0
      n * factorial(n-1)     if n > 0
Recursion
• To see how the recursion works, let's break down the factorial function to solve factorial(3); a sketch of the breakdown follows.
Breakdown
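A minimal C sketch (mine, not from the slides; the main driver is illustrative) showing how factorial(3) unwinds:

#include <stdio.h>

/* Recursive factorial, mirroring the definition above.
 * factorial(3) = 3 * factorial(2)
 *              = 3 * 2 * factorial(1)
 *              = 3 * 2 * 1 * factorial(0)
 *              = 3 * 2 * 1 * 1 = 6
 */
unsigned long factorial(unsigned int n)
{
    if (n == 0)                       /* base case: n = 0 */
        return 1;
    return n * factorial(n - 1);      /* recursive case: n > 0 */
}

int main(void)
{
    printf("%lu\n", factorial(3));    /* prints 6 */
    return 0;
}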
Step count for summing the n elements of an array a:

Statements                   S/E   Freq.   Total
2  {                          0     -       0
3    s = 0.0;                 1     1       1
4    for i := 1 to n do       1     n+1     n+1
5      s = s + a[i];          1     n       n
6    return s;                1     1       1
7  }                          0     -       0
                                    Total   2n+3
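A hedged C sketch of the routine the table analyzes (the name sum and the float element type are assumptions, not from the slides):

#include <stdio.h>

/* Sum the n elements of a[]; per-line step counts match the table. */
float sum(float a[], int n)
{
    float s = 0.0f;                  /* 1 step                    */
    for (int i = 0; i < n; i++)      /* loop test runs n+1 times  */
        s = s + a[i];                /* body runs n times         */
    return s;                        /* 1 step                    */
}                                    /* total: 2n + 3 steps       */

int main(void)
{
    float a[] = {1.0f, 2.0f, 3.0f};
    printf("%f\n", sum(a, 3));       /* prints 6.000000 */
    return 0;
}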
Step count for summing the elements of an n x m matrix a:

Statements                     S/E   Freq.    Total
2  {                            0     -        0
3    for i := 1 to n do         1     n+1      n+1
4      for j := 1 to m do       1     n(m+1)   nm+n
5        s = s + a[i][j];       1     nm       nm
6    return s;                  1     1        1
7  }                            0     -        0
                                      Total    2nm+2n+2
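A brief C99 sketch of the two-dimensional version counted above (the name sum2d and the initialization of s are my additions so the sketch compiles):

/* Sum the n x m elements of a[][]; the inner body runs n*m times, which
 * dominates the 2nm+2n+2 step count, so the routine is O(nm). */
float sum2d(int n, int m, float a[n][m])
{
    float s = 0.0f;
    for (int i = 0; i < n; i++)       /* outer test runs n+1 times    */
        for (int j = 0; j < m; j++)   /* inner test runs n(m+1) times */
            s = s + a[i][j];          /* body runs n*m times          */
    return s;
}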
#include <stdio.h>

int main(void)
{
    int a = 4;
    int b = 6;
    int c;
    c = a + b;
    printf("%d", c);    /* prints 10 */
    return 0;
}
#include <stdio.h>

int main(void)
{
    int i, j, n = 8;
    for (i = 1; i <= n; i++)          /* outer loop: n iterations      */
        for (j = 1; j <= n; j++)      /* inner loop: n iterations each */
            printf("DAA");            /* executed n*n times: O(n^2)    */
    return 0;
}
• Example: f(n) = 10n^2 + 4n + 2 is O(n^2) because 10n^2 + 4n + 2 <= 11n^2 for all n >= 5.
• Example: f(n) = 6*2^n + n^2 is O(2^n) because 6*2^n + n^2 <= 7*2^n for all n >= 4.
• Algorithms can be: O(1) constant;
• O(log n) logarithmic;
• O(n) linear;
• O(n log n);
• O(n^2) quadratic;
• O(n^3) cubic;
• O(2^n) exponential.
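A rough illustration (my own sketch, not from the slides) of the loop shapes that typically produce a few of these classes; n = 16 is an arbitrary choice:

#include <stdio.h>

int main(void)
{
    int n = 16;
    long steps;

    steps = 1;                        /* O(1): constant work, no loop */
    printf("O(1):     %ld\n", steps);

    steps = 0;                        /* O(log n): halving loop */
    for (int i = n; i > 1; i /= 2)
        steps++;
    printf("O(log n): %ld\n", steps);

    steps = 0;                        /* O(n): single loop over n */
    for (int i = 0; i < n; i++)
        steps++;
    printf("O(n):     %ld\n", steps);

    steps = 0;                        /* O(n^2): doubly nested loop */
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            steps++;
    printf("O(n^2):   %ld\n", steps);

    return 0;
}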
Some results
Sum of two functions: if f(n) = f1(n) + f2(n), and f1(n) is O(g1(n)) and f2(n) is O(g2(n)), then f(n) = O(max(|g1(n)|, |g2(n)|)).
For example, if f(n) = 3n^2 + 5n, with 3n^2 = O(n^2) and 5n = O(n), then f(n) = O(max(n^2, n)) = O(n^2).
• The Big O notation is useful when we only have an upper bound on the time complexity of an algorithm; we can often find such an upper bound simply by inspecting the algorithm.
• O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 <= f(n) <= c*g(n) for all n >= n0 }
• Example: f(n) = 3n+2 is O(n) because 3n+2 <= 4n for all n >= 2. Here c = 4, n0 = 2, and g(n) = n.
• f(n) = 3n + 3 = O(n), as 3n + 3 <= 4n for all n >= 3.
• f(n) = 100n + 6 = O(n), as 100n + 6 <= 101n for all n >= 6.
• f(n) = 10n^2 + 4n + 2 = O(n^2), as 10n^2 + 4n + 2 <= 11n^2 for all n >= 5.
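A small check (my own sketch, not from the slides) that the constants claimed above really work, using the first example f(n) = 3n+2 <= 4n:

#include <stdio.h>

/* Numerically spot-check f(n) <= c*g(n) for n0 <= n <= limit.
 * Here f(n) = 3n+2, g(n) = n, c = 4, n0 = 2 (a finite check only;
 * the inequality itself is proved algebraically in the text). */
int main(void)
{
    const int c = 4, n0 = 2, limit = 1000;
    for (int n = n0; n <= limit; n++) {
        int f = 3 * n + 2;
        int g = n;
        if (f > c * g) {
            printf("violated at n = %d\n", n);
            return 1;
        }
    }
    printf("3n+2 <= 4n holds for all %d <= n <= %d\n", n0, limit);
    return 0;
}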
• Suppose f(n) = 5n and g(n) = n. To show that f = O(g), we have to show the existence of a constant C as given earlier. Clearly 5 is such a constant, since f(n) = 5 * g(n). We could choose a larger C, such as 6, because the definition only requires that f(n) be less than or equal to C * g(n), but we usually try to find the smallest one. Therefore a constant C exists (we only need one), and f = O(g).
• In the previous timing analysis we ended up with T(n) = 4n + 5, and we concluded intuitively that T(n) = O(n) because the running time grows linearly as n grows. Now we can prove it mathematically. To show that f(n) = 4n + 5 is O(n), we need to produce a constant C such that f(n) <= C * n for all n. Trying C = 4 does not work, because 4n + 5 is never less than or equal to 4n. At n = 1 we need C to be at least 9; for larger n a smaller C would do (at n = 100, C = 5 suffices), but since the chosen C must work for all n, we use 9: 4n + 5 <= 4n + 5n = 9n for all n >= 1. Having produced a constant C that works for all n, we conclude that T(n) = 4n + 5 = O(n).
Say f(n) = n^2: we will prove that f(n) ≠ O(n). To do this, we must show that there cannot exist a constant C satisfying the big-Oh definition. We prove it by contradiction. Suppose there is a constant C that works; then, by the definition of big-Oh, n^2 <= C * n for all n. Now let n be any positive real number greater than C; then n * n > C * n, that is, n^2 > C * n. So there exists an n such that n^2 > C * n, which contradicts the supposition. The supposition is therefore false: no C can work for all n, so f(n) ≠ O(n) when f(n) = n^2.
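A small sketch (mine, not the course's) that makes the contradiction concrete: whatever C you pick, n = C + 1 already violates n^2 <= C*n:

#include <stdio.h>

/* For any candidate constant C, the choice n = C + 1 gives
 * n*n = (C+1)*n > C*n, so n^2 <= C*n cannot hold for all n. */
int main(void)
{
    long long candidates[] = {1, 10, 1000, 1000000};
    for (int k = 0; k < 4; k++) {
        long long C = candidates[k];
        long long n = C + 1;            /* witness that breaks the bound */
        printf("C = %lld: n = %lld gives n*n = %lld > C*n = %lld\n",
               C, n, n * n, C * n);
    }
    return 0;
}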
Suppose f(n) = n^2 + 3n - 1. We want to show that f(n) = O(n^2). f(n) = n^2 + 3n - 1 < n^2 + 3n (dropping the subtracted 1 only makes the value larger) <= n^2 + 3n^2 (since n <= n^2 for all integers n >= 1) = 4n^2. Therefore, with C = 4, we have shown that f(n) = O(n^2). Notice that all we are doing is finding a simple function that is an upper bound on the original function. Because of this, we could also say that f(n) = O(n^3), since n^3 is an upper bound on n^2; this would be a much weaker description, but it is still valid.
• Ω notation: just as Big O notation provides an asymptotic upper bound on a function, Ω notation provides an asymptotic lower bound.
• Ω notation can be useful when we have a lower bound on the time complexity of an algorithm.
• Since the best-case performance of an algorithm is generally not useful, the Omega notation is the least used of the three.
• For a given function g(n), we denote by Ω(g(n)) the following set of functions:
• Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 <= c*g(n) <= f(n) for all n >= n0 }.
• Example: f(n) = 3n + 2 is Ω(n) because 3n + 2 >= 3n for all n >= 1 (c = 3, n0 = 1).
• Definition (Theta): Consider a function f(n) which is non-negative for all integers n >= 0. We say that "f(n) is theta g(n)," which we write f(n) = Θ(g(n)), if and only if f(n) is O(g(n)) and f(n) is Ω(g(n)).
• Definition (Little Oh): Consider a function f(n) which is non-negative for all integers n >= 0. We say that "f(n) is little oh g(n)," which we write f(n) = o(g(n)), if and only if f(n) is O(g(n)) but f(n) is not Θ(g(n)).
• Little oh notation represents a kind of loose asymptotic bound, in the sense that if we are given f(n) = o(g(n)), then we know that g(n) is an asymptotic upper bound, since f(n) = O(g(n)); but g(n) is not an asymptotic lower bound, since f(n) = O(g(n)) together with f(n) ≠ Θ(g(n)) implies that f(n) ≠ Ω(g(n)).
• For example, consider the function f(n) = n + 1. Clearly f(n) is O(n^2). Clearly too, f(n) is not Ω(n^2): no matter what c we choose, c*n^2 > n + 1 for large enough n. Thus we may write f(n) = n + 1 = o(n^2).
Algorithm Time Complexity Space Complexity