1 Asymptotic Notation Pptx
Algorithms
Asymptotic Notation
Analyzing Algorithms
• Predict the amount of resources required:
• memory: how much space is needed?
• computational time: how fast does the algorithm run?
Typical Running Time Functions
• 1 (constant running time):
– Instructions are executed once or a few times
• logN (logarithmic)
– A big problem is solved by repeatedly cutting it down to a
constant fraction of its size at each step
• N (linear)
– A small amount of processing is done on each input element
• N logN
– A problem is solved by dividing it into smaller problems, solving
them independently, and combining the solutions
Typical Running Time Functions
• N² (quadratic)
– Typical for algorithms that process all pairs of data items (double
nested loops)
• N³ (cubic)
– Processing of triples of data (triple nested loops)
• Nᵏ (polynomial)
• 2ᴺ (exponential)
– Few exponential algorithms are appropriate for practical use
Growth of Functions
Complexity Graphs
[figure: growth curves for log(n), n log(n), n², n³, n¹⁰]
Complexity Graphs (log scale)
[figure: growth curves for 1.1ⁿ, n¹⁰, n²⁰, 2ⁿ, 3ⁿ, nⁿ]
Algorithm Complexity
• Worst Case Complexity:
– the function defined by the maximum number of steps
taken on any instance of size n
• Best Case Complexity:
– the function defined by the minimum number of steps
taken on any instance of size n
• Average Case Complexity:
– the function defined by the average number of steps
taken on any instance of size n
Best, Worst, and Average Case Complexity
[figure: average case and best case complexity curves vs. N (input size)]
Doing the Analysis
• It’s hard to estimate the running time exactly
– Best case depends on the input
– Average case is difficult to compute
– So we usually focus on worst case analysis
• Easier to compute
• Usually close to the actual running time
• Strategy: find a function (an equation) that, for large n, is an
upper bound to the actual function (actual number of steps,
memory usage, etc.)
[figure: upper bound, actual function, and lower bound curves]
Motivation for Asymptotic Analysis
• An exact computation of worst-case running time
can be difficult
– Function may have many terms:
• 4n² - 3n log n + 17.5n - 43n^(2/3) + 75
• An exact computation of worst-case running time
is unnecessary
– Remember that we are already approximating running
time by using RAM model
Classifying functions by their
Asymptotic Growth Rates (1/2)
• asymptotic growth rate, asymptotic order, or
order of functions
– Comparing and classifying functions while ignoring
• constant factors and
• small inputs.
• The sets big-oh O(g), big-theta Θ(g), big-omega
Ω(g)
Classifying functions by their
Asymptotic Growth Rates (2/2)
• O(g(n)), Big-Oh of g of n, the Asymptotic Upper
Bound;
• Θ(g(n)), Theta of g of n, the Asymptotic Tight
Bound; and
• Ω(g(n)), Omega of g of n, the Asymptotic Lower
Bound.
Big-O
f(n) = O(g(n)): there exist positive constants c and n₀ such that
0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀
• What does it mean?
– If f(n) = O(n²), then:
• f(n) can be larger than n² sometimes, but…
• We can choose some constant c and some value n₀ such
that for every value of n larger than n₀: f(n) ≤ cn²
• That is, for values larger than n₀, f(n) is never more than a
constant multiplier greater than n²
• Or, in other words, f(n) does not grow more than a constant
factor faster than n².
Visualization of O(g(n))
[figure: f(n) lies below cg(n) for all n ≥ n₀]
Examples
– 2n² = O(n³): 2n² ≤ cn³ ⟹ 2 ≤ cn ⟹ c = 1 and n₀ = 2
– 1000n² + 1000n = O(n²): 1000n² + 1000n ≤ cn² ⟹ c = 2000 and n₀ = 1
– n = O(n²): n ≤ cn² ⟹ cn ≥ 1 ⟹ c = 1 and n₀ = 1
Big-O
• 2n² = O(n²)
• 1,000,000n² + 150,000 = O(n²)
• 5n² + 7n + 20 = O(n²)
• 2n³ ≠ O(n²)
• n^2.1 ≠ O(n²)
More Big-O
• Prove that: 20n² + 2n + 5 = O(n²)
• Let c = 21 and n₀ = 4
• 21n² > 20n² + 2n + 5 for all n > 4
⟺ n² > 2n + 5 for all n > 4
TRUE
Tight bounds
• We generally want the tightest bound we can
find.
• While it is true that n2 + 7n is in O(n3), it is more
interesting to say that it is in O(n2)
Big Omega – Ω-Notation
• Ω() – A lower bound
f(n) = Ω(g(n)): there exist positive constants c and n₀ such that
0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀
– n² = Ω(n)
– Let c = 1, n₀ = 2
– For all n ≥ 2, n² > 1·n
Visualization of Ω(g(n))
[figure: f(n) lies above cg(n) for all n ≥ n₀]
Θ-notation
• f(n) = Θ(g(n)): there exist positive constants c₁, c₂, and n₀ such that
0 ≤ c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀
Visualization of Θ(g(n))
[figure: f(n) lies between c₁g(n) and c₂g(n) for all n ≥ n₀]
A Few More Examples
• n = O(n²), but n ≠ Θ(n²)
• 200n² = O(n²) = Θ(n²)
• n^2.5 ≠ O(n²), so n^2.5 ≠ Θ(n²)
Example 2
• Prove that: 20n³ + 7n + 1000 = Θ(n³)
• Let c = 21 and n₀ = 10
• 21n³ > 20n³ + 7n + 1000 for all n > 10
⟺ n³ > 7n + 1000 for all n > 10
TRUE, but we also need…
• Let c = 20 and n₀ = 1
• 20n³ ≤ 20n³ + 7n + 1000 for all n ≥ 1
TRUE
Example 3
• Show that 2ⁿ + n² = O(2ⁿ)
• Let c = 2 and n₀ = 5
2ⁿ + n² ≤ 2·2ⁿ
⟺ 2ⁿ + n² ≤ 2ⁿ⁺¹
⟺ n² ≤ 2ⁿ⁺¹ − 2ⁿ
⟺ n² ≤ 2ⁿ(2 − 1)
⟺ n² ≤ 2ⁿ, true for all n ≥ 5
Asymptotic Notations - Examples
• Θ notation
– n²/2 − n/2 = Θ(n²)
– (6n³ + 1) lg n / (n + 1) = Θ(n² lg n)
– n vs. n²: n ≠ Θ(n²)
• Ω notation
– n³ vs. n²: n³ = Ω(n²)
– n vs. log n: n = Ω(log n)
– n vs. n²: n ≠ Ω(n²)
• O notation
– 2n² vs. n³: 2n² = O(n³)
– n² vs. n²: n² = O(n²)
– n³ vs. n lg n: n³ ≠ O(n lg n)
Asymptotic Notations - Examples
• For each of the following pairs of functions, either f(n) is
O(g(n)), f(n) is Ω(g(n)), or f(n) = Θ(g(n)). Determine
which relationship is correct.
– f(n) = log n²; g(n) = log n + 5: f(n) = Θ(g(n))
– f(n) = n; g(n) = log n²: f(n) = Ω(g(n))
– f(n) = log log n; g(n) = log n: f(n) = O(g(n))
– f(n) = n; g(n) = log² n: f(n) = Ω(g(n))
– f(n) = n log n + n; g(n) = log n: f(n) = Ω(g(n))
– f(n) = 10; g(n) = log 10: f(n) = Θ(g(n))
– f(n) = 2ⁿ; g(n) = 10n²: f(n) = Ω(g(n))
– f(n) = 2ⁿ; g(n) = 3ⁿ: f(n) = O(g(n))
Simplifying Assumptions
• 1. If f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n))
• 2. If f(n) = O(k·g(n)) for any constant k > 0, then f(n) = O(g(n))
• 3. If f₁(n) = O(g₁(n)) and f₂(n) = O(g₂(n)),
then f₁(n) + f₂(n) = O(max(g₁(n), g₂(n)))
• 4. If f₁(n) = O(g₁(n)) and f₂(n) = O(g₂(n)),
then f₁(n) · f₂(n) = O(g₁(n) · g₂(n))
Example #1: carry n items
from one room to another room
• How many operations?
• n pick-ups, n forward moves, n drops, and n
reverse moves: 4n operations
• 4n operations = c·n = O(c·n) = O(n)
• Similarly, any program that reads n inputs from
the user takes time at least proportional to n,
i.e., O(n) at minimum.
Example #2: Locating patient record in
Doctor Office
Sequential Search
• Given an unsorted vector/list a[ ], find the
location of element X.
If-then-else Statement
if(condition)
i = 0;
else
for ( j = 0; j < n; j++)
a[j] = j;
• Complexity = ??
= O(1) + max ( O(1), O(N))
= O(1) + O(N)
= O(N)
Consecutive Statements
• Add the complexity of consecutive statements
Nested Loop Statements
• Analyze such statements inside out
Recursion
long factorial( int n )
{
    if( n <= 1 )
        return 1;
    else
        return n * factorial( n - 1 );
}
In terms of big-Oh:
t(1) = 1
t(n) = 1 + t(n-1) = 1 + 1 + t(n-2) = ... = k + t(n-k)
Choose k = n-1:
t(n) = n-1 + t(1) = n-1 + 1 = O(n)
Binary Search
• Given a sorted vector/list a[ ], find the location of element X
Example
• Code:
• a = b;
• Complexity: O(1)
Example
• Code:
• sum = 0;
• for (i=1; i <=n; i++)
• sum += n;
• Complexity: O(n)
Example
• Code:
• sum = 0;
• for (j=1; j<=n; j++)
• for (i=1; i<=j; i++)
• sum++;
• for (k=0; k<n; k++)
• A[k] = k;
• Complexity: O(n²)
Example
• Code:
• sum1 = 0;
• for (i=1; i<=n; i++)
• for (j=1; j<=n; j++)
• sum1++;
• Complexity: O(n²)
Example
• Code:
• sum2 = 0;
• for (i=1; i<=n; i++)
• for (j=1; j<=i; j++)
• sum2++;
• Complexity: O(n²)
Example
• Code:
• sum1 = 0;
• for (k=1; k<=n; k*=2)
• for (j=1; j<=n; j++)
• sum1++;
• Complexity: O(n log n)
Example
• Code:
• sum2 = 0;
• for (k=1; k<=n; k*=2)
• for (j=1; j<=k; j++)
• sum2++;
• Complexity: O(n)
Summary
• Time complexity is a measure of algorithm
efficiency
• The algorithm's efficiency plays the major role in
determining the running time.
Q: Is it possible to determine running time based on an
algorithm's time complexity alone?
• Minor tweaks in the code can cut down the
running time by a constant factor, too.
• Other factors like CPU speed, memory speed, and
device I/O speed matter as well.
• For certain problems, it is possible to allocate
additional space and improve time complexity.