Analysis of Algorithms and Asymptotics



CS 4231, Fall 2012 Mihalis Yannakakis

Analysis of Algorithms
• Correctness:
The algorithm terminates with the correct answer

• Performance:
– Mainly Running time (Time complexity)
– Use of other resources (space, …)

• Experimental vs. analytical evaluation of algorithms
• Other issues: simplicity, extensibility, …

Time Complexity

• Running time depends on the input


• Parameterize by the size n of the input, and
express complexity as function T(n)
– Worst Case: maximum time over all inputs of size n
– Average Case: expected time, assuming a probability distribution over inputs of size n
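A quick sketch (not from the slides) of the worst-case/average-case distinction, using linear search and assuming the target is uniformly likely to be at any of the n positions:

```python
# Hypothetical illustration: comparison counts for linear search.
def linear_search_steps(lst, target):
    """Number of comparisons until target is found (or the list ends)."""
    for i, x in enumerate(lst):
        if x == target:
            return i + 1
    return len(lst)

n = 1000
data = list(range(n))

worst = linear_search_steps(data, n - 1)                       # target in last slot
average = sum(linear_search_steps(data, t) for t in data) / n  # uniform distribution

print(worst)    # n = 1000 comparisons
print(average)  # (n + 1) / 2 = 500.5 comparisons
```

Both measures are parameterized by the input size n; the worst case is n comparisons, the average (under this distribution) about n/2.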

Analysis
Cost of each operation depends on machine
Simplification 1: machine-independent analysis:
assume all operations unit cost →
can add the costs of the different steps

Asymptotic Analysis
Simplification 2: Look at growth of T(n) as
n goes to infinity; focus on dominant term
- Example: 3n² + 7n + 10
Dominant term: 3n²

• Simplification 3: Look at the rate (order) of growth: suppress the constant coefficient
- Example: Quadratic complexity Θ(n²)

Benefits of asymptotic analysis

• Machine independence – intrinsic complexity of algorithms
• Abstraction from details, concentrate on
dominant factors
• A linear-time algorithm becomes faster
than a quadratic algorithm eventually (for
large enough n)

But … caution:

• Eventually may be too late, if the constant of the linear-time algorithm that we ignored is huge, e.g. 10⁹·n ≥ n² for n ≤ 10⁹

• Some operations may be much more costly than others, and we may want to count them separately (for example, comparisons in sorting of complex objects)
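The hidden-constant caution can be checked numerically; a sketch using the 10⁹ constant from the example:

```python
# Illustration: a "linear" algorithm with a huge constant loses to a
# quadratic one until n passes 10**9.
def linear_cost(n):
    return 10**9 * n   # linear-time algorithm with constant 10**9

def quadratic_cost(n):
    return n * n       # quadratic algorithm with constant 1

print(linear_cost(10**6) > quadratic_cost(10**6))    # True: linear still slower
print(linear_cost(10**10) < quadratic_cost(10**10))  # True: quadratic overtaken
```

The crossover sits exactly at n = 10⁹, so "eventually" here means a billion-element input.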

Asymptotic Notations:
Theta, Big-Oh, Omega
Theta: ( g ( n))  { f ( n) |  constants c1 , c2  0 and n0 s.t.  n  n0 :
c1 g ( n)  f ( n)  c2 g ( n) }

Convention: We usually write f(n) = Θ(g(n))


Caution: = here denotes membership, not equality

[Figure: the curve f(n) lies between c₁·g(n) and c₂·g(n) for all n ≥ n₀]
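A numeric check (an illustration, not a proof) of the Theta definition for f(n) = 3n² + 7n + 10 against g(n) = n², with one valid choice of witnesses c₁ = 3, c₂ = 4, n₀ = 9:

```python
# Verify the sandwich c1*g(n) <= f(n) <= c2*g(n) for a range of n >= n0.
def f(n): return 3 * n**2 + 7 * n + 10
def g(n): return n**2

c1, c2, n0 = 3, 4, 9
sandwich = all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10**4))
print(sandwich)  # True: f(n) stays between 3*n^2 and 4*n^2 from n0 on
```

The lower bound holds for every n ≥ 1 (the extra terms 7n + 10 are positive); the upper bound needs n² ≥ 7n + 10, which first holds at n = 9.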

Asymptotic Notations:
Theta, Big-Oh, Omega
Big-Oh (Order): O(g(n)) = { f(n) | ∃ constant c > 0 and n₀ s.t. ∀ n ≥ n₀:
0 ≤ f(n) ≤ c·g(n) }

Convention: We usually write f(n) = O(g(n))

Example: 5n = O(n²), but not vice-versa

[Figure: f(n) bounded above by c·g(n) for all n ≥ n₀]
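The asymmetry of the example can be spot-checked; a sketch assuming Python:

```python
# 5n = O(n^2): c = 5, n0 = 1 works, since 5n <= 5*n^2 for all n >= 1.
assert all(5 * n <= 5 * n**2 for n in range(1, 10**4))

# n^2 is not O(n): for any candidate constant c, the bound c*n fails
# as soon as n exceeds c.
for c in (1, 10, 10**6):
    n = c + 1
    assert n**2 > c * n

print("5n = O(n^2) holds; n^2 = O(n) fails")
```

No single constant c can cap n²/n = n, which is why the relation only goes one way.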

Asymptotic Notations:
Theta, Big-Oh, Omega

Omega: ( g (n))  { f (n) |  constant c  0 and n0 s.t.  n  n0 :


c g ( n)  f ( n) }

Convention: We usually write f(n) = Ω(g(n))

Example: 5n² = Ω(n), but not vice-versa

[Figure: f(n) bounded below by c·g(n) for all n ≥ n₀]

Asymptotic Notations:
little-oh, little-omega
little-oh: o(g(n)) = { f(n) | ∀ constant c > 0 ∃ n₀ s.t. ∀ n ≥ n₀:
0 ≤ f(n) < c·g(n) }

little-omega: ω(g(n)) = { f(n) | ∀ constant c > 0 ∃ n₀ s.t. ∀ n ≥ n₀:
0 ≤ c·g(n) < f(n) }

f(n) = o(g(n)) means that for large n, function f is smaller than any constant fraction of g.
f(n) = ω(g(n)) means that for large n, function f is larger than any constant multiple of g, i.e., g = o(f).

Example: 5n = o(n²), 5n² = ω(n)

Asymptotic Notations Summary

Notation              Ratio f(n)/g(n) for large n

f(n) = ω(g(n))        f(n)/g(n) → ∞

f(n) = Ω(g(n))        c ≤ f(n)/g(n)

f(n) = Θ(g(n))        c₁ ≤ f(n)/g(n) ≤ c₂

f(n) = O(g(n))        f(n)/g(n) ≤ c

f(n) = o(g(n))        f(n)/g(n) → 0

Example: Polynomials

• Polynomial: a_d·n^d + a_{d-1}·n^{d-1} + … + a₁·n + a₀, where a_d > 0, is Θ(n^d)
Ex: 5n³ + 4n² + 3n + 8 = Θ(n³)
Proof: f(n)/n^d = a_d + a_{d-1}/n + … + a₀/n^d → a_d, and 0 < a_d < ∞

• (0 ≤) c < d ⇒ n^c = o(n^d)
Ex: n^3.2 = o(n^3.3)
Proof: n^c/n^d = 1/n^(d-c) → 0
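The second claim can be illustrated numerically with the example exponents 3.2 and 3.3:

```python
# The ratio n**3.2 / n**3.3 equals n**(-0.1) = 1 / n**0.1,
# which shrinks toward 0 as n grows (slowly, but surely).
ratios = [n ** 3.2 / n ** 3.3 for n in (10**2, 10**6, 10**12)]
print(ratios)  # strictly decreasing toward 0
```

The decay is slow because d − c = 0.1 is small, but any positive gap in the exponents forces the ratio to 0.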

Example: logarithms
• log10 n = log2 n )
• Proof: log10n = log2n / log210 = log2n / 3.32

• Same for any change of logarithm from one constant


base a to another base b: logan = (logbn)

• Notation: logn for log2n ; ln n for logen (natural log)
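A quick check of the change-of-base fact, assuming Python's math module:

```python
import math

# log10(n) / log2(n) is the same constant 1 / log2(10) (about 0.301)
# for every n > 1, so log10 n = Theta(log2 n) with c1 = c2 = that constant.
k = 1 / math.log2(10)
samples = [math.log10(n) / math.log2(n) for n in (10, 10**3, 10**9)]
print(samples)  # every entry equals k, up to float rounding
```

Because the ratio is a fixed constant rather than merely bounded, logarithms to any two constant bases differ only by a constant factor, which asymptotic notation suppresses.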

Logs vs. powers/roots
• log n = o(nᶜ) for all c > 0

• For example: log n = o(n^0.4);  log n = o(²⁰√n)

• Proof: Use L'Hôpital's rule

lim_{n→∞} ln n / nᶜ = lim_{n→∞} (1/n) / (c·n^(c-1)) = lim_{n→∞} 1/(c·nᶜ) = 0

Some common functions

n  n log n  n 2  n 3    2 n  3 n  n !

polynomial exponential
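The growth chain can be spot-checked at a single moderate n; a sketch assuming Python:

```python
import math

# Evaluating the chain at n = 30 already separates the functions;
# n! is computed exactly via math.factorial.
n = 30
chain = [n, n * math.log2(n), n**2, n**3, 2**n, 3**n, math.factorial(n)]
print(chain == sorted(chain))  # True: the chain is ordered at n = 30
```

One sample point is of course no proof of the asymptotic ordering, but it shows how quickly the exponential and factorial terms pull away: 30! already has 33 digits.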

Properties
f (n)  o( g (n))  f (n)  O( g (n))
f (n)   ( g (n))  f (n)  ( g (n))
f (n)  ( g (n))  f (n)  O( g (n)), f (n)  ( g (n))
f (n)  ( g (n))  f (n)  O( g (n)), f (n)  ( g (n))

Transitivity:
f  O( g ) and g  O(h)  f  O(h)
same for o,  , , 

Sum: f+g = (max(f,g))
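The sum rule can be verified directly for nonnegative functions, since max(f, g) ≤ f + g ≤ 2·max(f, g); a sketch with two sample functions:

```python
# Numeric check of f + g = Theta(max(f, g)) for nonnegative f, g.
def f(n): return n**2
def g(n): return 10 * n

ok = all(max(f(n), g(n)) <= f(n) + g(n) <= 2 * max(f(n), g(n))
         for n in range(0, 10**4))
print(ok)  # True: c1 = 1, c2 = 2 witness the Theta bound
```

This is why lower-order terms can be dropped: n² + 10n = Θ(max(n², 10n)) = Θ(n²).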

Asymptotic notation in equations


f ( n )  3n 2  O ( n ) means
f ( n )  3n 2  h ( n ) for some function h ( n ) that is O ( n )

Can write equations like


3n 3  O ( n 2 )  O ( n )  O (1)   ( n 3 )

Caution: O(1)+O(1)+…+O(1) (n times) is not O(1)


O(n) + n) = ? It is not n)
…..
