Big O


Complexity – Big O-Notation


Efficiency
• The choice of data structure or algorithm can make the difference between a program running in a few seconds and one running for many days.
• A solution is said to be efficient if it solves the problem within its resource constraints:
  • Space
  • Time
• The cost of a solution is the amount of resources that the solution consumes.
Complexity – Big O-Notation
Algorithm Efficiency (1)
There are often many approaches (algorithms) to solve a problem. How do we choose between them?

At the heart of computer program design are two (sometimes conflicting) goals:

1. To design an algorithm that is easy to understand, code, and debug.

2. To design an algorithm that makes efficient use of the computer’s resources.

Complexity – Big O-Notation 3


Algorithm Efficiency (2)
Goal (1) is the concern of Software Engineering.

Goal (2) is the concern of data structures and algorithm analysis.

When goal (2) is important, how do we measure an algorithm’s cost?

Complexity – Big O-Notation 4


How to Measure Efficiency?
1. Empirical comparison (run programs)
2. Asymptotic Algorithm Analysis
Critical resources: running time and memory space.

Factors affecting running time:

- For most algorithms, running time depends on the “size” of the input.

- Running time is expressed as T(n) for some function T on input size n.

Complexity – Big O-Notation 5


Growth Rate Graph

Complexity – Big O-Notation 6


Best, Worst, Average Cases
Not all inputs of a given size take the same time to run.
Sequential search for K in an array of n integers:
• Begin at the first element in the array and look at each element in turn until K is found.

Best case: K is the first element examined (1 comparison).
Worst case: K is the last element, or is not present (n comparisons).
Average case: about n/2 comparisons.

Complexity – Big O-Notation 7


Which Analysis to Use?
While average time appears to be the fairest measure, it may be difficult
to determine.

When is the worst case time important?

Complexity – Big O-Notation 8


Algorithm run times
Once we have chosen a data structure to store both the objects and the relationships, we must implement the queries or operations as algorithms.
• The Abstract Data Type will be implemented as a class
• The data structure will be defined by the member variables
• The member functions will implement the algorithms

The question is: how do we determine the efficiency of the algorithms?

Complexity – Big O-Notation


Algorithm properties (1)
An algorithm is endowed with the following properties:

1. Finiteness: An algorithm must terminate after a finite number of steps.

2. Definiteness: The steps of the algorithm must be precisely defined or unambiguously specified.

3. Generality: An algorithm must be generic enough to solve all problems of a particular class.

4. Effectiveness: The operations of the algorithm must be basic enough to be carried out, in principle, with pencil and paper. They should not be so complex as to require writing another algorithm for the operation.

5. Input-Output: The algorithm must have certain initial and precise inputs, and outputs that may be generated
both at its intermediate and final steps.
Complexity – Big O-Notation
Algorithm properties (2)

Different Approaches to Design an Algorithm:

An algorithm does not enforce a language or mode for its expression but only demands adherence to its properties.

• Practical Algorithm Design Issues:

• 1. To save time (Time Complexity): A program that runs faster is a better program.

• 2. To save space (Space Complexity): A program that uses less space than a competing program is considered desirable.

Complexity – Big O-Notation


Algorithm properties (3)

• 1. To save time (Time Complexity): The time complexity of an algorithm or a program is a function of the running
time of the algorithm or a program. In other words, it is the amount of computer time it needs to run to completion.

The time complexity of an algorithm can be computed either by an empirical or a theoretical approach. The empirical (or a posteriori) testing approach calls for implementing the complete algorithms and executing them on a computer for various instances of the problem. The time taken by the execution of the programs for the various instances is noted and compared, and the algorithm whose implementation yields the least time is considered the best among the candidate algorithmic solutions.

• 2. To save space (Space Complexity): The space complexity of an algorithm or program is a function of the space
needed by the algorithm or program to run to completion.

Complexity – Big O-Notation


Asymptotic Analysis: Big-oh
Definition: For T(n) a non-negatively valued function, T(n) is in the set O(f(n)) if there exist two positive constants c and n₀ such that T(n) ≤ c·f(n) for all n > n₀.

Use: The algorithm is in O(n²) in the [best, average, worst] case.

Meaning: For all data sets big enough (i.e., n > n₀), the algorithm always executes in fewer than c·f(n) steps in the [best, average, worst] case.
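As a quick illustration of the definition (a worked example added here, not from the original slide): if T(n) = 3n + 2, then T(n) is in O(n), because 3n + 2 ≤ 4n for all n ≥ 2; the constants c = 4 and n₀ = 2 satisfy the definition.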

Complexity – Big O-Notation 13


What is Big O (1)
• Big O notation is a mathematical notation that describes the limiting behavior
of a function when the argument tends towards a particular value or infinity.

• We use Big O to describe the performance of an algorithm.

• Determine if a given algorithm is scalable or not.

Complexity – Big O-Notation


What is Big O (2)
O(n)

• In this lecture we will see that certain operations can be more or less costly depending on which data structure we use.

Complexity – Big O-Notation


What is Big O (3)
• For example:
• Array :
1 2 3 4 5

array[index]

• Accessing an array element by its index is very fast, but arrays have a fixed length: if you constantly add or remove items, the array has to be resized, and this becomes costly as the size of the input grows very large.

Complexity – Big O-Notation


What is Big O (4)
• LINKED LIST

2 → 4

• If that is what we need to do, then we have to use another data structure called a Linked List.
• These data structures can grow or shrink very quickly, but accessing a linked list element by its index is slow.
• That is why you need to learn about Big O notation first, before we can talk about the various data structures.

Complexity – Big O-Notation


Big O-notation (1)
• Big companies like Google, Microsoft, and Amazon also ask about Big O in interviews: they want to know whether you really understand how scalable an algorithm is.

• Finally, knowing Big O will make you a better developer or software engineer.

Complexity – Big O-Notation


Big O-notation (2)
• The time complexity is represented using an indicator called the O-notation. The capital O is followed by a bracket that contains either the number 1 or the symbol n with some mathematical variation.

• Big-oh notation indicates an upper bound.

• O(n) = the algorithm scales on the order of n.

• O(n³) = the algorithm scales on the order of n³.

Complexity – Big O-Notation


Big O-notation (3)

Example: If T(n) = 3n², then T(n) is in O(n²).

Look for the tightest upper bound: while T(n) = 3n² is also in O(n³), we prefer O(n²).

Complexity – Big O-Notation 20


Big O-notation (4)
Example 1: Finding a value X in an array (average cost). On average, about half of the n elements are examined, so T(n) = c·n/2 for some constant c. Hence, T(n) is in O(n).

Example 2: Suppose T(n) = c₁n² + c₂n, where c₁ and c₂ are positive. Therefore, T(n) is in O(n²) by definition.

Example 3: T(n) = c. Then T(n) is in O(1).
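One way to fill in the missing step for Example 2 (a worked derivation added here): since c₂n ≤ c₂n² for all n ≥ 1, we have T(n) = c₁n² + c₂n ≤ (c₁ + c₂)n² for all n ≥ 1, so the constants c = c₁ + c₂ and n₀ = 1 satisfy the Big-oh definition and T(n) is in O(n²).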

Complexity – Big O-Notation 21


A Common Misunderstanding
“The best case for my algorithm is n=1 because that is the fastest.”
WRONG!

Big-oh refers to a growth rate as n grows to infinity.

Best case is defined for the input of size n that is cheapest among all
inputs of size n.

Complexity – Big O-Notation 22


Cases of Big O-notation O(1) (1)
public class TestMain {
public void printArray(int[] Ar) {
System.out.println(Ar[0]);
}
}

• This method takes an array of integers and prints the first item to the console.
• It does not matter how big the array is: we can have an array with 1 item or 1 million items. All we are doing here is printing the first item.

Complexity – Big O-Notation


Cases of Big O-notation O(1) (2)
• This method has a single instruction and takes a constant amount of time to run.
• We do not care about the exact execution time in milliseconds, because this can differ from one machine to another.
• This method runs in constant time, with time complexity O(1).
public class TestMain {
public void printArray(int[] Ar) {
// O(1)
System.out.println(Ar[0]);
}
}
Complexity – Big O-Notation
Cases of Big O-notation O(1) (3)
• This method has two operations and still takes a constant amount of time to run.
• This method runs in constant time, with time complexity O(1): we write O(1), not O(2), to represent constant time.

public class TestMain {
    public void printArray(int[] Ar) {
        // O(1): constant time, even though there are two operations (not O(2))
        System.out.println(Ar[0]);
        System.out.println(Ar[0]);
    }
}

Complexity – Big O-Notation


Cases of Big O-notation O(n) (1)
• Let’s consider a more complex algorithm using a loop.
• If we have 1 item, we have one print operation.
• If we have a million items, we have a million print operations.

public class TestMain {
    // O(n)
    public void printArray(int[] Ar) {
        for (int i = 0; i < Ar.length; i++)
            System.out.println(Ar[i]);
    }
}
Complexity – Big O-Notation
Cases of Big O-notation O(n) (2)
• As the size of the array grows, the running time grows linearly.

public class TestMain {
    // O(n)
    public void printArray(int[] Ar) {
        for (int i = 0; i < Ar.length; i++)
            System.out.println(Ar[i]);
    }
}

Complexity – Big O-Notation


Cases of Big O-notation O(n) (3)
• Let’s consider an algorithm with one statement before and one after the loop.
• The running time is O(1 + n + 1), i.e., O(2 + n).
• In Big O notation, the constant 2 is insignificant compared to n, so the cost of the algorithm still increases linearly in proportion to the input size n. It can be written as O(n).

public class TestMain {
    public void printArray(int[] Ar) {
        // O(1 + n + 1) = O(2 + n) -> O(n)
        System.out.println();                   // O(1)
        for (int i = 0; i < Ar.length; i++)     // O(n)
            System.out.println(Ar[i]);
        System.out.println();                   // O(1)
    }
}
Complexity – Big O-Notation
Cases of Big O-notation O(n) (4)
• If we have two consecutive loops, the running time is O(n + n), i.e., O(2n).
• In Big O notation, n and 2n are treated the same, because all we need is an approximation of how the algorithm grows relative to the input size. Both represent linear growth, so we write O(n).

public class TestMain {
    public void printArray(int[] Ar) {
        // O(n + n) = O(2n) -> O(n)
        for (int i = 0; i < Ar.length; i++)     // O(n)
            System.out.println(Ar[i]);

        for (int i = 0; i < Ar.length; i++)     // O(n)
            System.out.println(Ar[i]);
    }
}
Complexity – Big O-Notation
Cases of Big O-notation O(n) (5)
• If we have two loops over two different inputs, the running time is O(n + m).
• Here we keep both terms and write O(n + m): the growth is still linear, but in the combined size of the two inputs, so m cannot simply be dropped unless we know it is bounded by n.

public class TestMain {
    public void printArray(int[] Ar, String[] names) {
        // O(n + m): linear in the total input size
        for (int i = 0; i < Ar.length; i++)     // O(n)
            System.out.println(Ar[i]);

        for (int i = 0; i < names.length; i++)  // O(m)
            System.out.println(names[i]);
    }
}

Complexity – Big O-Notation


Cases of Big O-notation O(n²) (1)
• If we have nested loops, the running time is O(n²).
• In this example we are printing all pairs of elements, so the running time grows quadratically.

public class TestMain {
    public void printArray(int[] Ar) {
        // O(n * n) = O(n²)
        for (int i = 0; i < Ar.length; i++)         // O(n)
            for (int j = 0; j < Ar.length; j++)     // O(n)
                System.out.println(Ar[i] + ", " + Ar[j]);
    }
}

Complexity – Big O-Notation


Cases of Big O-notation O(n²) (2)

• An algorithm running in O(n²) is slower than an algorithm running in O(n).
• Obviously, how much slower depends on the size of the input.

Complexity – Big O-Notation


Cases of Big O-notation O(n²) (3)
• If we have one loop followed by two nested loops, the running time is O(n + n²).
• For large n, n² dominates n, and all we need is an approximation, so the running time is O(n²).

public class TestMain {
    public void printArray(int[] Ar) {
        // O(n + n²) -> O(n²)
        for (int i = 0; i < Ar.length; i++)         // O(n)
            System.out.println(Ar[i]);

        for (int i = 0; i < Ar.length; i++)         // O(n)
            for (int j = 0; j < Ar.length; j++)     // O(n)
                System.out.println(Ar[i] + ", " + Ar[j]);
    }
}
Complexity – Big O-Notation
Cases of Big O-notation O(n³)
• If we have three nested loops, the running time is O(n³).
• This algorithm scales worse (is slower for large inputs) than an algorithm with complexity O(n²).

public class TestMain {
    public void printArray(int[] Ar) {
        // O(n * n * n) = O(n³)
        for (int i = 0; i < Ar.length; i++)             // O(n)
            for (int j = 0; j < Ar.length; j++)         // O(n)
                for (int k = 0; k < Ar.length; k++)     // O(n)
                    System.out.println(Ar[i] + ", " + Ar[j] + ", " + Ar[k]);
    }
}

Complexity – Big O-Notation


Cases of Big O-notation O(log n) – Logarithmic (1)
• A linear curve grows at a constant rate, while a logarithmic curve flattens out as the input grows.

• An algorithm with logarithmic complexity is more efficient and more scalable than an algorithm whose running time grows linearly or quadratically.

Complexity – Big O-Notation


Cases of Big O-notation O(log n) – Logarithmic (2)
• As an example, consider a sorted array in which we want to find the number 10.
• Apply a linear search: use a loop to visit each position until the number 10 is found.

Linear Search

1 2 3 4 5 6 7 8 9 10

• The running time of this algorithm increases linearly (the complexity is O(n), where n is the size of the array).
• How many elements are examined in the worst case? (See the sketch below.)
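A minimal Java sketch of the linear search described above (the class and method names are illustrative, not taken from the slides):

public class LinearSearchDemo {
    // Returns the index of key in a, or -1 if it is not present.
    // In the worst case (key is last or absent) all n elements are examined: O(n).
    public static int linearSearch(int[] a, int key) {
        for (int i = 0; i < a.length; i++) {
            if (a[i] == key) return i;
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] sorted = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10};
        System.out.println(linearSearch(sorted, 10)); // examines all 10 elements
    }
}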

Complexity – Big O-Notation


Cases of Big O-notation O(log n) – Logarithmic (3)
• Another search method is Binary Search. This algorithm runs in logarithmic time, which is faster than linear search.

Binary Search

The search range is repeatedly halved: 1–10, then 6–10, then 9–10, then 10.

With 64 items, at most 6 comparisons are needed; with 1 million items, about 19 comparisons.
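A minimal Java sketch of binary search on a sorted array (the class and method names are illustrative, not from the slides); each step halves the search range, so at most about log₂ n comparisons are needed:

public class BinarySearchDemo {
    // Returns the index of key in the sorted array a, or -1 if it is not present.
    public static int binarySearch(int[] a, int key) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;       // middle of the current range
            if (a[mid] == key) return mid;
            if (a[mid] < key) lo = mid + 1;     // keep only the right half
            else hi = mid - 1;                  // keep only the left half
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] sorted = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10};
        System.out.println(binarySearch(sorted, 10)); // prints 9 after a few halvings
    }
}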


Cases of Big O-notation O(log n) – Logarithmic (4)
• We do not need to explore all the items.
• Its running time is of order O(log n).
• An algorithm with logarithmic time is more scalable than an algorithm with linear or quadratic time.

Complexity – Big O-Notation


Cases of Big O-notation O(2ⁿ) - Exponential
• The exponential curve is the opposite of the logarithmic curve.
• An algorithm with exponential running time grows much faster than an algorithm with logarithmic, linear, or quadratic time.
• Obviously, an algorithm with an exponential curve is not scalable.
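As an illustration (this specific example is not on the slide): the classic naive recursive Fibonacci runs in roughly O(2ⁿ) time, because each call spawns two further calls.

public class ExponentialDemo {
    // Naive recursion: T(n) = T(n-1) + T(n-2) + O(1), which grows exponentially.
    public static long fib(int n) {
        if (n <= 1) return n;
        return fib(n - 1) + fib(n - 2);
    }

    public static void main(String[] args) {
        System.out.println(fib(40)); // already takes a noticeable amount of time
    }
}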

Complexity – Big O-Notation


Big O-notation
Complexity Growth Rate:
Constant O(1)
Logarithmic O(log n)
Linear O(n), O(n log n)
Quadratic O(n²)
Exponential O(2ⁿ)
Factorial O(n!)

NB.
O(1) is the most efficient complexity for an algorithm. O(1) algorithms are a programmer’s dream, but they usually take up plenty of memory.

O(n): the running time grows in direct proportion to the size of the data.

O(2ⁿ), O(n!): these are called exponential and factorial complexities. Ideally you should never write algorithms that perform at these complexities, since they take practically forever. If your measurements show these complexities, then you have written something inefficient and the code must be refactored.
Complexity – Big O-Notation
Time Complexity Examples (1)
Example 1: a = b;
This assignment takes constant time, so it is O(1).

Example 2:
sum = 0;
for (i=1; i<=n; i++)
sum += n;
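A suggested analysis for Example 2: the loop body executes n times and each iteration does a constant amount of work, so T(n) = c·n for some constant c, and the fragment is in O(n).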

Complexity – Big O-Notation 41


Time Complexity Examples (2)
Example 3:
sum = 0;
for (j=1; j<=n; j++)
for (i=1; i<=j; i++)
sum++;
for (k=0; k<n; k++)
A[k] = k;
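A suggested analysis for Example 3: the nested loops execute sum++ a total of 1 + 2 + … + n = n(n+1)/2 times, and the final loop runs n times, so T(n) = c₁·n(n+1)/2 + c₂·n, which is in O(n²).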

Complexity – Big O-Notation 42


Time Complexity Examples (3)
Example 4:
sum1 = 0;
for (i=1; i<=n; i++)
for (j=1; j<=n; j++)
sum1++;
sum2 = 0;
for (i=1; i<=n; i++)
for (j=1; j<=i; j++)
sum2++;
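A suggested analysis for Example 4: the first pair of nested loops increments sum1 exactly n² times, while the second pair increments sum2 only 1 + 2 + … + n = n(n+1)/2 times; both fragments are nonetheless in O(n²).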

Complexity – Big O-Notation 43


Time Complexity Examples (4)
Example 5:
sum1 = 0;
for (k=1; k<=n; k*=2)
for (j=1; j<=n; j++)
sum1++;
sum2 = 0;
for (k=1; k<=n; k*=2)
for (j=1; j<=k; j++)
sum2++;
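A suggested analysis for Example 5: each outer loop runs about log₂ n + 1 times, because k doubles on every iteration. For sum1 the inner loop always runs n times, giving roughly n·log₂ n increments, i.e., O(n log n). For sum2 the inner loop runs k times, so the total is 1 + 2 + 4 + … + n ≈ 2n increments, i.e., O(n).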

Complexity – Big O-Notation 44




Space Complexity (1)
Space complexity can also be analyzed with asymptotic complexity
analysis.

Time: Algorithm
Space: Data Structure

Complexity – Big O-Notation 48


Space Complexity (2)
• Saving Time
• Saving Memory

• Optimizing an algorithm to use less space makes it more scalable.
• When we have limited space, we make applications scale with the available memory (e.g., a mobile app).
• We describe how much space an algorithm needs (using Big O) in order to make algorithms scalable.
• We often have to make a trade-off between saving time and saving space.
Complexity – Big O-Notation
Memory usage versus run times (1)
As well as determining run times, we are also interested in memory usage.
In general, there is an interesting relationship between memory and time efficiency.

For a data structure/algorithm:
• Improving the run time usually requires more memory
• Reducing the required memory usually requires more run time

Complexity – Big O-Notation


Memory usage versus run times (2)
Warning: programmers often take this to suggest that, given any solution to a problem, any faster solution must require more memory.

This guideline is not true in general: there may be different data structures and/or algorithms which are both faster and require less memory.
• Finding them requires thought and research.

Complexity – Big O-Notation


Cases of Space Complexity (1)
• This method allocates only a constant amount of additional memory.
• The variable i is independent of the size of the parameter (names, of type String[]): whether we have 10 or one million items, the extra memory allocated is O(1).

public class TestMain {
    public void printArray(String[] names) {
        // O(1) space
        for (int i = 0; i < names.length; i++)
            System.out.println("Hi " + names[i]);
    }
}

Complexity – Big O-Notation


Cases of Space Complexity (2)
• We create a copy array whose length equals the length of the input array (names).
• If the input array has 1000 items, the copy array has 1000 items.
• The space complexity of this method is O(n): the more items in the input array, the more memory we need.

public class TestMain {
    public void printArray(String[] names) {
        // O(n) space: the copy array grows with the input
        String[] copy = new String[names.length];
        for (int i = 0; i < names.length; i++)
            copy[i] = names[i];
    }
}
Complexity – Big O-Notation
