Asymptotic Notations 1

The document provides an overview of asymptotic notations used in algorithm analysis, including Big O, Omega, and Theta notations, which represent upper, lower, and tight bounds on algorithm running times, respectively. It discusses the worst-case, average-case, and best-case analyses of algorithms such as sequential search and finding the max/min of an array. Additionally, it differentiates between a priori and a posteriori analyses, emphasizing the importance of asymptotic analysis for estimating algorithm performance on large inputs.


Asymptotic Notations:

Introductory Examples
Dr. Sania Bhatti
Sequential Search Algorithm
• Suppose we have an array of n elements, A[1:n], and a key element, KEY.
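The slides describe the algorithm without reproducing its code; here is a minimal Python sketch of sequential search (names are illustrative, and Python's 0-based indexing is used rather than the slides' A[1:n]):

```python
# Sequential search: compare KEY against each element of A in turn;
# return the position where it is found, or -1 if it is absent.
def sequential_search(A, key):
    for i, x in enumerate(A):
        if x == key:        # one key-comparison per iteration
            return i
    return -1               # all n comparisons were made: worst case

print(sequential_search([7, 3, 9, 3], 9))   # 2 (found at index 2)
print(sequential_search([7, 3, 9, 3], 5))   # -1 (not found)
```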
Analysis

• Worst-case analysis: This determines the maximum amount of time the algorithm would ever take. This analysis is usually easier than the average-case analysis.
• Average-case analysis: This method determines the overall average performance. It considers all possible cases, assigns a probability to each case, and computes the weighted average (called the expected value) of the random variable.
• Best-case analysis: This determines the minimum amount of time the algorithm would ever take.
Worst-Case Analysis of Sequential Search
• The worst-case number of key-comparisons in this algorithm is obviously n. This happens if the key is found in the last position of the array, or if it is not found anywhere.
• Since each iteration of the for-loop takes at most some constant amount of time, C, the total worst-case time of the algorithm is T(n) ≤ Cn + D.
• The constant D represents the maximum amount of time for all statements that are executed only once, independent of the variable n. This total time is characterized as "order of" n, denoted as O(n).
Average-Case Analysis of Sequential Search
• As a first estimate, one may think that since the worst-case number is n, and the best-case is 1 (found right away), the average must be about n/2.
• First, as a quick review of the "expected value", suppose a random variable X has the possible values {1, 2, 3} with probabilities 0.1, 0.1, and 0.8, respectively.
Average-Case Analysis of Sequential Search
• Then the expected value of X is 1×0.1 + 2×0.1 + 3×0.8 = 2.7.
• We may also refer to this as the weighted average. Note that a straight average (when there is no probability involved) would be simply (1+2+3)/3 = 2. Let
• p = the probability that the key is found somewhere in the array,
• p_i = the probability that the key is found in position i of the array, 1 ≤ i ≤ n,
• p_i = p/n, assuming all positions are equally likely.
• Finally, the probability that the key is not found is q = 1 − p.
Average-Case Analysis of Sequential Search
• So, the expected number of key-comparisons in this algorithm is:

E = (p/n)(1 + 2 + … + n) + n(1 − p) = p(n + 1)/2 + n(1 − p)

• In the special case when p = 1, the expected number of comparisons is (n + 1)/2.
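The special case p = 1 (the key is always present, equally likely in each position) can be checked numerically; a minimal sketch:

```python
# Expected key-comparisons for sequential search when the key is
# always present (p = 1) and equally likely to be in each of the
# n positions: E = sum of i * (1/n) for i = 1..n = (n + 1) / 2.
n = 10
expected = sum(range(1, n + 1)) / n   # = (1 + 2 + ... + n) / n
print(expected)                       # 5.5, i.e. (n + 1) / 2
assert expected == (n + 1) / 2
```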
Finding Max and Min of an Array
• The following pseudocode is a simple program loop that finds the maximum and minimum elements in an array of n elements, A[1:n]. (Max and Min are the returned parameters.)

• In iteration i of the for-loop, A[i] is first compared against Max. If A[i] is greater, then Max is updated. Otherwise, a second comparison is made against Min, and if A[i] is smaller, then Min is updated.
Worst-Case and Best-Case Analysis
• In the worst-case, every iteration of the loop makes two comparisons. (This happens if the first element of the array has the largest value.) So the worst-case number of comparisons is 2(n − 1). In the best-case, every iteration makes only one comparison, so the best-case number of comparisons is (n − 1). This happens if the input is in sorted order.
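The loop described above can be sketched in Python (an illustrative translation of the slides' pseudocode, not the original), with a counter to confirm the 2(n − 1) worst-case and (n − 1) best-case comparison counts:

```python
# Find Max and Min of A: in each iteration compare A[i] with Max
# first; only if that comparison fails, compare A[i] with Min.
# A counter tracks the number of element comparisons made.
def max_min(A):
    comparisons = 0
    mx = mn = A[0]
    for x in A[1:]:
        comparisons += 1
        if x > mx:
            mx = x              # only one comparison this iteration
        else:
            comparisons += 1    # second comparison, against Min
            if x < mn:
                mn = x
    return mx, mn, comparisons

print(max_min([9, 1, 2, 3]))    # (9, 1, 6): first element largest, 2(n-1) = 6
print(max_min([1, 2, 3, 4]))    # (4, 1, 3): sorted input, n-1 = 3
```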
Asymptotic Complexity
• Suppose there are two algorithms for a problem of size n with the running times, respectively,
• T1(n) = 10n,
• T2(n) = 2n²
Asymptotic Complexity
• The fact that T1 has a slower growth rate than T2 is due to the fact that T1 is a linear function of n and T2 is a quadratic function of n. The coefficients (also called constant factors) are not as critical in this comparison.
• So, the asymptotic complexity definitions incorporate two issues:
• 1. Focus on large problem size (large n), and ignore small values of n.
• 2. Ignore the constant coefficients (constant factors).
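Both issues can be seen numerically with the running times T1(n) = 10n and T2(n) = 2n² from the example above; a small sketch:

```python
# T1(n) = 10n (linear) vs T2(n) = 2n^2 (quadratic): despite the
# larger constant factor, the linear algorithm is cheaper for all n > 5.
def t1(n):
    return 10 * n

def t2(n):
    return 2 * n * n

for n in (1, 5, 10, 100):
    print(n, t1(n), t2(n))
# n=1: 10 vs 2 (quadratic is cheaper for small n);
# n=5: 50 vs 50 (crossover); n=100: 1000 vs 20000.
assert all(t1(n) < t2(n) for n in range(6, 1000))
```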
ASYMPTOTIC ANALYSIS

 Expressing the complexity in terms of its relationship to known functions is called asymptotic analysis.
 The asymptotic behavior of a function f(n) refers to the growth of f(n) as n gets large.
 We typically ignore small values of n, since we are usually interested in estimating how slow the program will be on large inputs.
 A good rule of thumb is that the slower the asymptotic growth rate, the better the algorithm.
 However, this is not always true.
ASYMPTOTIC NOTATION

 ASYMPTOTIC NOTATION: the mathematical way of representing time complexity.
 The notations we use to describe the asymptotic running time of an algorithm are defined in terms of functions whose domains are the set of natural numbers.
 Asymptotic growth: the rate at which the function grows. The "growth rate" is the complexity of the function, or the amount of resource it takes to compute.
 Growth rate covers both time and memory.
There are five asymptotic notations used to represent the time complexity of algorithms; the first three are the most commonly used.

1. Big oh (O) notation
2. Big omega (Ω) notation
3. Theta (Θ) notation
4. Little oh (o) notation
5. Little omega (ω) notation
1. Big oh (O) notation
1. Big oh (O) notation (Asymptotic Upper Bound):
This notation represents an upper bound on the algorithm's run time.
Big oh (O) notation is useful to calculate the maximum amount of execution time.
Using Big-oh notation, we calculate the worst-case time complexity.
1. Big oh (O) notation

 For a given function g(n), we denote by O(g(n)) the set of functions

O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }

 We use O-notation to give an asymptotic upper bound of a function, to within a constant factor.
 f(n) ∈ O(g(n)) means that there exists some constant c such that f(n) is always ≤ c·g(n) for large enough n.
Example
Example: f(n) = 2n + 3 and g(n) = n.
Formula: f(n) ≤ c·g(n) for all n ≥ n0, with c > 0 and n0 ≥ 1.
Now 2n + 3 ≤ c·n. Let c = 4:
2n + 3 ≤ 4n
Put n = 1: 5 ≤ 4, false.
Put n = 2: 7 ≤ 8, true. So for all values of n ≥ 2 and c = 4,
f(n) ≤ c·g(n), i.e. 2n + 3 ≤ 4n.
The condition is satisfied, so f(n) = O(g(n)). Big-oh bounds the maximum amount of time the algorithm takes to execute, which is why it is called the worst-case complexity.
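The witness constants found above (c = 4, n0 = 2) can be verified mechanically; a small sketch:

```python
# Verify the Big-O witness for f(n) = 2n + 3 and g(n) = n:
# 2n + 3 <= 4n holds for every n >= 2 (but fails at n = 1).
def f(n):
    return 2 * n + 3

c, n0 = 4, 2
print(f(1) <= c * 1)   # False: below n0 the inequality may fail
assert all(f(n) <= c * n for n in range(n0, 10_000))
```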
1. Big oh (O) notation: Common Rules
The following two rules help simplify finding asymptotic upper bounds.
2. Ω (Omega) notation
Ω (Omega) notation (Asymptotic Lower Bound):
It represents a lower bound on the algorithm's run time.
Using Big Omega notation we can calculate the minimum amount of time; we can say that it gives the best-case time complexity.

f(n) ≥ c·g(n) for all n ≥ n0, with c > 0 and n0 ≥ 1,

where c is a constant and g(n) is a function.
Lower bound
Best case
Ω (Omega) notation

 For a given function g(n), we denote by Ω(g(n)) the set of functions

Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }

 We use Ω-notation to give an asymptotic lower bound on a function, to within a constant factor.
 f(n) ∈ Ω(g(n)) means that there exists some constant c such that f(n) is always ≥ c·g(n) for large enough n.
Examples
Example: f(n) = 2n + 3.
Formula: f(n) ≥ c·g(n) for all n ≥ n0, with c > 0 and n0 ≥ 1.
With g(n) = n and c = 1: 2n + 3 ≥ 1·n. Put n = 1: 5 ≥ 1, true.
The condition holds for all values of n ≥ 1, so f(n) = Ω(g(n)).
3. Θ (Theta) notation
Theta (Θ) notation (Tight Bound):

 It represents a tight bound on an algorithm's running time.
 Using theta notation we can calculate the average amount of time.
 So it is called the average-case time complexity of an algorithm.

c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0,

where c1 and c2 are constants.
Tight bound
Average case
 Theta notation
 For a given function g(n), we denote by Θ(g(n)) the set of functions

Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }

 A function f(n) belongs to the set Θ(g(n)) if there exist positive constants c1 and c2 such that it can be "sandwiched" between c1·g(n) and c2·g(n) for sufficiently large n.
 f(n) ∈ Θ(g(n)) means that there exist constants c1 and c2 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for large enough n.
Example
Example: f(n) = 2n + 3.
Formula: c1·g(n) ≤ f(n) ≤ c2·g(n).
With g(n) = n, c1 = 1, and c2 = 4: 1·n ≤ 2n + 3 ≤ 4·n. Put n = 1:
we get 1 ≤ 5 ≤ 4, false.
n = 2: we get 2 ≤ 7 ≤ 8, true.
n = 3: we get 3 ≤ 9 ≤ 12, true.
For all values of n ≥ 2 the condition is satisfied, so f(n) = Θ(g(n)).
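The sandwich constants from this example (c1 = 1, c2 = 4, n0 = 2) can be checked the same way; a minimal sketch:

```python
# Verify the Theta witnesses for f(n) = 2n + 3 and g(n) = n:
# 1*n <= 2n + 3 <= 4*n for every n >= 2.
def f(n):
    return 2 * n + 3

c1, c2, n0 = 1, 4, 2
assert all(c1 * n <= f(n) <= c2 * n for n in range(n0, 10_000))
print("f(n) sandwiched between", c1, "* n and", c2, "* n for n >=", n0)
```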
Example: a nested loop analysis
The inner loop body executes n + (n − 1) + … + 1 = n(n + 1)/2 times.
Since n²/2 ≤ n(n + 1)/2 ≤ n², the total running time is Θ(n²).
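A sketch of such a nested loop (an illustrative example, since the slides' original loop is not reproduced), counting iterations to confirm the n(n + 1)/2 total and the Θ(n²) sandwich:

```python
# Nested loop whose inner loop runs n, n-1, ..., 1 times:
# the total iteration count is n(n + 1)/2, which is Theta(n^2).
def nested_count(n):
    count = 0
    for i in range(n):
        for j in range(i, n):   # n - i inner iterations
            count += 1
    return count

n = 100
print(nested_count(n))                          # 5050 == n(n + 1)/2
assert nested_count(n) == n * (n + 1) // 2
assert n * n // 2 <= nested_count(n) <= n * n   # n^2/2 <= total <= n^2
```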
4. Little oh (o) notation

• Little o notation is used to describe an upper bound that cannot be tight; in other words, a loose upper bound of f(n).
• Slower growth rate: f(n) grows slower than g(n).
• We formally define o(g(n)) (little-oh of g of n) as the set of functions f(n) such that for every positive constant c > 0, there exists a value n0 > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n0.
Examples

Using mathematical relation, we can say that f(n) = o(g(n))


means,
if

Example on little o asymptotic notation:

1.If f(n) = n2 and g(n) = n3 then check whether


f(n) = o(g(n)) or not.

36
Examples

Sol:

lim (n→∞) n²/n³ = lim (n→∞) 1/n = 0.

The result is 0, and it satisfies the limit condition mentioned above, so we can say that f(n) = o(g(n)).
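The limit can also be observed numerically; the ratio n²/n³ = 1/n shrinks toward 0 as n grows:

```python
# f(n) = n^2, g(n) = n^3: the ratio f(n)/g(n) = 1/n tends to 0,
# so f(n) = o(g(n)).
for n in (10, 1_000, 1_000_000):
    print(n, n ** 2 / n ** 3)   # 0.1, 0.001, 1e-06
```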
5. Little omega (ω) notation
• Another asymptotic notation is little omega notation; it is denoted by ω.
• Little omega (ω) notation is used to describe a loose lower bound of f(n).
• Faster growth rate: f(n) grows faster than g(n).
• The relation f(n) = ω(g(n)) means that lim (n→∞) f(n)/g(n) = ∞.
Example of asymptotic notation

Problem: Find the upper-bound, lower-bound, and tight-bound ranges for the function f(n) = 2n + 5.
Solution: We are given f(n) = 2n + 5; take g(n) = n.
Lower bound: 2n; upper bound: 3n; tight bound: 2n ≤ f(n) ≤ 3n.
For Big-oh notation (O): according to the definition,
f(n) ≤ c·g(n). For Big-oh we use the upper bound, so
f(n) = 2n + 5, g(n) = n, and c = 3. According to the definition:
2n + 5 ≤ 3n
Put n = 1: 7 ≤ 3, false. Put n = 2: 9 ≤ 6, false. Put n = 3: 11 ≤ 9, false. Put n = 4: 13 ≤ 12, false. Put n = 5: 15 ≤ 15, true.
For all values of n ≥ 5 the condition is satisfied: c = 3, n0 = 5.
2. Big-omega notation: f(n) ≥ c·g(n). We know this notation is a lower-bound notation, so take c = 2.
Let f(n) = 2n + 5 and g(n) = n, so c·g(n) = 2n.
Now 2n + 5 ≥ 2n. Put n = 1: we get 7 ≥ 2, true.
For all values of n ≥ 1 with c = 2 the condition is satisfied, so f(n) = Ω(g(n)).
3. Theta notation: according to the definition, c1·g(n) ≤ f(n) ≤ c2·g(n). Taking c1 = 2 and c2 = 3, we have 2n ≤ 2n + 5 ≤ 3n for all n ≥ 5, so f(n) = Θ(g(n)).
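All three bounds for f(n) = 2n + 5 can be verified together, using the constants derived above; a minimal sketch:

```python
# f(n) = 2n + 5 with g(n) = n:
#   O(n)     with c = 3 and n0 = 5,
#   Omega(n) with c = 2 and n0 = 1,
#   Theta(n) with c1 = 2, c2 = 3, and n0 = 5.
def f(n):
    return 2 * n + 5

assert all(f(n) <= 3 * n for n in range(5, 10_000))            # upper bound
assert all(f(n) >= 2 * n for n in range(1, 10_000))            # lower bound
assert all(2 * n <= f(n) <= 3 * n for n in range(5, 10_000))   # sandwich
print("2n + 5 = Theta(n)")
```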
Common Asymptotic notations

Apriori Analysis

• Apriori (a priori) analysis means the analysis is performed prior to running the algorithm on a specific system. This analysis is a stage where a function is defined using some theoretical model. Hence, we determine the time and space complexity of an algorithm just by looking at the algorithm, rather than by running it on a particular system with a particular memory, processor, and compiler.
Aposteriori Analysis
• Aposteriori (a posteriori) analysis of an algorithm means we perform the analysis only after running it on a system. It directly depends on the system and changes from system to system.
• In industry, we cannot perform aposteriori analysis, as the software is generally made for anonymous users who run it on systems different from those present in the industry.
• This is the reason that, in a priori analysis, we use asymptotic notations to determine time and space complexity: measured times change from computer to computer, but asymptotically they are the same.
