Data Structure 2
• Algorithm analysis refers to the process of
determining how much computing time and
storage an algorithm will require.
• For a given problem, there are usually many
possible algorithms to choose from.
Note:
Running time is usually the most important measure,
since computational time is the most precious
resource in most problem domains.
• There are two approaches to measuring the efficiency
of algorithms:
1. Empirical
Based on the total running time of the program.
Uses actual system clock time.
Example:
t1
for(int i=0; i<=10; i++)
    cout<<i;
t2
Running time taken by the above algorithm (TotalTime) = t2 - t1;
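A minimal sketch of this empirical approach in standard C++, assuming we
time with the <chrono> library (the slide does not specify how t1 and t2
are obtained):

#include <chrono>
#include <iostream>
using namespace std;

int main() {
    // t1: read the clock before the work
    auto t1 = chrono::steady_clock::now();

    for (int i = 0; i <= 10; i++)
        cout << i;

    // t2: read the clock after the work
    auto t2 = chrono::steady_clock::now();

    // TotalTime = t2 - t1, reported here in milliseconds
    auto ms = chrono::duration_cast<chrono::milliseconds>(t2 - t1);
    cout << "\nTotalTime: " << ms.count() << " ms" << endl;
    return 0;
}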
• It is difficult to determine the efficiency of algorithms
using this approach, because clock time can vary
based on many factors.
For example:
a) Processor speed of the computer
   On a 1.78GHz processor: 10s
   On a 2.12GHz processor: <10s
b) Current processor load
   • Only the work: 10s
   • With printing: 15s
   • With printing & browsing the internet: >15s
c) Specific data for a particular run of the
program
   • Input size
   • Input properties
t1
for(int i=0; i<=n; i++)
    cout<<i;
t2
T = t2 - t1;
For n=100, T >= 0.5s
For n=1000, T > 0.5s
d) Operating System
   • Multitasking vs. single tasking
   • Internal structure
2. Theoretical
• We use the theoretical approach because it does not
depend on the particular machine or its current load.
It measures the efficiency of an algorithm in terms of:
the number of time units,
the number of operations performed, or
the amount of storage space required.
• Two important ways to characterize the
effectiveness of an algorithm are its Space
Complexity and Time Complexity.
Analysis Rules:
1. Assume an arbitrary time unit.
2. Execution of one of the following operations
takes time 1 unit:
Assignment Operation
Example: i=0;
Single Input/Output Operation
Example: cin>>a;
cout<<"hello";
Single Boolean Operations
Example: i>=10
Single Arithmetic Operations
Example: a+b;
Function Return
Example: return sum;
3. Running time of a selection statement (if,
switch) is the time for the condition
evaluation plus the maximum of the
running times for the individual clauses in
the selection.
Example: int x;
int sum=0;
if(a>b)
{
sum= a+b;
cout<<sum;
}
else
{
cout<<b;
}
T(n) = 1 + 1 + max(3,1) = 5
(1 for sum=0, 1 for evaluating a>b, and max(3,1) for the two
branches: the if-branch costs 3 and the else-branch costs 1.)
4. Loop statements:
• Running time = (running time of the statements inside
the loop × number of iterations) + time for setup (1)
+ time for checking the condition (number of iterations + 1)
+ time for the update (number of iterations).
• The total running time of statements inside a group
of nested loops is the running time of the
statements * the product of the sizes of all the
loops.
• For nested loops, analyze inside out.
• Always assume that the loop executes the
maximum number of iterations possible. (Why?)
Because we are interested in the worst case
complexity.
5. Function call:
• 1 for setup + the time for any parameter
calculations + the time required for the execution of
the function body.
Examples:
1)
int k=0, n;
cout<<"Enter an integer";
cin>>n;
for(int i=0; i<n; i++)
    k++;
T(n) = 3 + 1 + (n+1) + n + n = 3n + 5
2)
int i=0;
while(i<n)
{
cout<<i;
i++;
}
int j=1;
while(j<=10)
{
cout<<j;
j++;
}
T(n) = 1 + (n+1) + n + n + 1 + 11 + 2(10)
     = 3n + 34
3)
int k=0;
for(int i=1 ; i<=n; i++)
for( int j=1; j<=n; j++)
k++;
T(n) = 1 + 1 + (n+1) + n + n(1 + (n+1) + n + n)
     = 2n + 3 + n(3n + 2)
     = 2n + 3 + 3n² + 2n
     = 3n² + 4n + 3
4). int sum=0;
for(i=1; i<=n; i++)
    sum=sum+i;
T(n) = 1 + 1 + (n+1) + n + (1+1)n
     = 4n + 3 = O(n)
6). void func(){
    int x=0; int i=0; int j=1; int n;
    cout<<"Enter a number";
    cin>>n;
    while(i<n){
        i=i+1;
    }
    while(j<n){
        j=j+1;
    }
}
T(n) = 1+1+1+1+1 + (n+1) + 2n + n + 2(n-1)
     = 6 + 4n + 2n - 2
     = 6n + 4 = O(n)
7). int sum(int n){
int s=0;
for(int i=1;i<=n;i++)
s=s+(i*i*i*i);
return s;
}
T(n) = 1 + (1 + (n+1) + n + 5n) + 1
     = 7n + 4 = O(n)
8). int sum=0;
for(i=0;i<n;i++)
for(j=0;j<n;j++)
sum++;
T(n) = 1 + 1 + (n+1) + n + n(1 + (n+1) + n + n)
     = 3 + 2n + n(3n + 2)
     = 3 + 2n + 3n² + 2n
     = 3n² + 4n + 3 = O(n²)
Formal Approach to Analysis
• In the above examples we have seen that counting
loop statements line by line is tedious; summations
make the same analysis more systematic.
Simple Loops: Formally
• A for loop translates into a summation over its
iterations:
for (int i = 1; i <= N; i++) {
    sum = sum + i;
}
Σ(i=1 to N) 1 = N
Nested Loops: Formally
• Nested for loops translate into multiple
summations, one for each loop:
for (int i = 1; i <= N; i++) {
    for (int j = 1; j <= M; j++) {
        sum = sum + i + j;
    }
}
Σ(i=1 to N) Σ(j=1 to M) 2 = Σ(i=1 to N) 2M = 2MN
Consecutive Statements: Formally
• Add the running times of the separate blocks
of your code.
for (int i = 1; i <= N; i++) {
    sum = sum + i;
}
Σ(i=1 to N) 1 = N
Conditionals: Formally
• Running time = time to evaluate the condition + the
maximum of the running times of the two branches:
if (test == 1) {
    for (int i = 1; i <= N; i++) {
        sum = sum + i;
    }
}
else
    for (int i = 1; i <= N; i++) {
        for (int j = 1; j <= N; j++) {
            sum = sum + i + j;
        }
    }
max( Σ(i=1 to N) 1, Σ(i=1 to N) Σ(j=1 to N) 2 ) = max(N, 2N²) = 2N²
Recursive: Formally
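• The running time of a recursive function is usually expressed as a
recurrence relation. A minimal sketch (an illustrative example of my own,
not taken from the slide):

// Recursive factorial: each call does a constant amount of work
// plus one recursive call on a smaller input.
int factorial(int n) {
    if (n <= 1)                      // 1 unit: the test
        return 1;                    // 1 unit: the return
    return n * factorial(n - 1);     // constant work + T(n-1)
}
// Recurrence: T(n) = T(n-1) + c with T(1) = c0.
// Expanding gives T(n) = c(n-1) + c0, which is O(n).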
Categories of Algorithm Analysis
• Algorithms may be examined under
different situations to correctly determine
their efficiency for accurate comparison.
Best Case Analysis:
• Assumes the input data are arranged in the
most advantageous order for the algorithm.
• Takes the smallest possible set of inputs.
• Causes execution of the fewest number of
statements.
• Computes the lower bound of T(n), where
T(n) is the complexity function.
Examples:
For sorting algorithm
If the list is already sorted (data are arranged
in the required order).
For searching algorithms
If the desired item is located at the first accessed
position.
Worst Case Analysis:
• Assumes the input data are arranged in the most
disadvantageous order for the algorithm.
• Takes the worst possible set of inputs.
• Causes execution of the largest number of statements.
• Computes the upper bound of T(n), where T(n) is the
complexity function.
Examples:
For sorting algorithms
If the list is in opposite order.
For searching algorithms
If the desired item is located at the last position or is
missing.
• Worst case analysis is the most common analysis
because:
It provides the upper bound for all input (even for
bad ones).
Average case analysis is often difficult to determine
and define.
If we relied on the best case, we would be assuming
that the data always arrive in the most favorable
arrangement, which rarely happens.
Best case analysis cannot be used to estimate
complexity.
We are interested in the worst case time since it
provides a bound for all input; this is called the
"Big-Oh" estimate.
Average Case Analysis:
• Determines the average of the running time over all
permutations of the input data.
• Takes an average set of inputs.
• It assumes the input is random.
• It causes an average number of statement executions.
• Computes the optimal bound of T(n) where T(n) is the complexity
function.
• Sometimes average cases are as bad as worst cases and as good
as best cases.
Examples:
For sorting algorithms
While sorting, considering any arrangement (order of input
data).
For searching algorithms
While searching, if the desired item is located at any location
or is missing.
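For instance (an illustrative calculation, not from the slides): in a linear
search over n items, if the desired item is equally likely to be at any of
the n positions, the average number of comparisons is
(1 + 2 + ... + n)/n = (n+1)/2,
which is still O(n).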
• The study of algorithms includes:
– How to Design algorithms (Describing
algorithms)
– How to Analyze algorithms (In terms of time
and memory space)
– How to validate algorithms (for any input)
– How to express algorithms (using a programming
language)
– How to test a program (debugging and
maintaining)
• But, in this course more focus will be given
to Design and Analysis of algorithms.
Order of Magnitude
• Refers to the rate at which the storage or time
grows as a function of problem size.
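For instance (an illustration, not from the slides): if T(n) = n², doubling
the problem size quadruples the running time, since T(2n) = (2n)² = 4n²;
if T(n) = n, doubling the problem size only doubles the running time.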
Asymptotic Notations
• Asymptotic Analysis is concerned with how the
running time of an algorithm increases with the
size of the input in the limit, as the size of the
input increases without bound!
• Asymptotic Analysis makes use of the O (Big-Oh),
Ω (Big-Omega), Θ (Theta), o (little-oh), and ω (little-
omega) notations in performance analysis and in
characterizing the complexity of an algorithm.
• Note: The complexity of an algorithm is a
numerical function of the size of the problem
(instance or input size).
Types of Asymptotic Notations
1. Big-Oh (O) Notation (Upper bound)
• Definition: We say f(n) = O(g(n)), if there are positive
constants c and k such that f(n) ≤ c·g(n) for all n ≥ k.
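Example (a worked instance with my own constants, not from the slides):
Find g(n) such that f(n) = O(g(n)) for f(n) = 3n + 5
g(n) = n, c = 8, k = 1.
3n + 5 ≤ 3n + 5n = 8n for all n ≥ 1, so f(n) = 3n + 5 = O(n).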
2. Big-Omega (Ω) Notation (Lower bound)
• Definition: We say f(n) = Ω(g(n)), if there are positive
constants c and k such that f(n) ≥ c·g(n) for all n ≥ k.
Example:
Find g(n) such that f(n) = Ω(g(n)) for f(n) = 3n + 5
g(n) = √n, c = 1, k = 1.
f(n) = 3n + 5 = Ω(√n)
3. Theta (Θ) Notation (Optimal bound)
• Definition: We say f(n) = Θ(g(n)) if and only if
f(n) = O(g(n)) and f(n) = Ω(g(n)); g(n) is both an upper
and a lower bound on f(n).
Example: f(n) = 3n + 5 = Θ(n).
4. Little-oh (small-oh) Notation
• Definition: We say f(n) = o(g(n)), if for every positive
constant c there is a constant n0 > 0 such that, to the
right of n0, the value of f(n) lies below c·g(n).
• As n increases, g(n) grows strictly faster than f(n).
• Denotes an upper bound that is not asymptotically tight,
whereas Big-Oh denotes an upper bound that may or may
not be asymptotically tight.
Example:
Find g(n) such that f(n) = o(g(n)) for f(n) = n²
g(n) = n³ works, since n² grows strictly slower than n³:
n² < c·n³ for all n > 1/c.
Rules to estimate Big Oh of a given
function
• Pick the highest order.
• Ignore the coefficient.
Example:
1. T(n) = 3n + 5 → O(n)
2. T(n) = 3n² + 4n + 2 → O(n²)
T(n)                   Complexity category function F(n)   Big-O
c, c is constant       1                                    c = O(1)
7n! + 2n + n² + 1      n!                                   T(n) = O(n!)

Examples (using simplified counting, where only the loop body is charged):
1. A single loop whose body costs 2 units per iteration:
T(n) = 2*n = 2n = O(n).
2. for(int i=1; i<=n; i++)
for(int j=1; j<=n; j++)
k++;
T(n) = 1*n*n = n² = O(n²).
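A closely related case (my own example, not from the slides): when the
inner loop's bound depends on the outer loop variable, the simplified count
becomes a sum rather than a plain product, though the order may be unchanged.

int k = 0;
for (int i = 1; i <= n; i++)
    for (int j = 1; j <= i; j++)   // the inner loop runs i times
        k++;
// Body executions: 1 + 2 + ... + n = n(n+1)/2, so T(n) = O(n²).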