Lesson 1p

The document compares different algorithms for sorting data and analyzes their efficiency using Big-O notation. It provides examples of common time complexities like O(1), O(N), O(N²), O(N log N), and O(2^N) and explains what types of algorithms achieve each complexity. Examples discussed include searching a phone book with linear vs. binary search, and sorting algorithms like selection, bubble, and insertion sort. The document emphasizes that Big-O notation allows comparison of algorithms independent of the programming language or hardware used.

Comparison of Algorithms

• How do we compare the efficiency of different algorithms?
• Comparing execution time: too many assumptions; varies greatly between different computers
• Comparing the number of instructions: varies greatly due to different languages, compilers, programming styles...

Big-O Notation

• The best way is to compare algorithms by the amount of work done in a critical loop, as a function of the number of input elements (N)
• Big-O: a notation expressing execution time (complexity) as the term in a function that increases most rapidly relative to N
• Consider the order of magnitude of the algorithm

Common Orders of Magnitude

• O(1): Constant or bounded time; not affected by N at all
• O(log₂N): Logarithmic time; each step of the algorithm cuts the amount of work left in half
• O(N): Linear time; each element of the input is processed
• O(N log₂N): N log₂N time; apply a logarithmic algorithm N times, or vice versa

Common Orders of Magnitude (cont.)

• O(N²): Quadratic time; typically apply a linear algorithm N times, or process every element with every other element
• O(N³): Cubic time; naive multiplication of two N×N matrices, or process every element in a three-dimensional matrix
• O(2^N): Exponential time; computation increases dramatically with input size
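A short script can make these growth rates concrete (a sketch; the sample sizes are illustrative, and the exponential class is omitted because its values quickly become unreadably large):

```python
import math

# Tabulate how each complexity class grows as N increases.
for n in [8, 64, 512]:
    print(f"N={n}: "
          f"O(1)=1, "
          f"O(log2 N)={math.log2(n):.0f}, "
          f"O(N)={n}, "
          f"O(N log2 N)={n * math.log2(n):.0f}, "
          f"O(N^2)={n ** 2}, "
          f"O(N^3)={n ** 3}")
```

Even between N = 8 and N = 512, the logarithmic column barely moves while the quadratic and cubic columns explode.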
What About Other Factors?

• Consider f(N) = 2N⁴ + 100N² + 10N + 50
• We can ignore 100N² + 10N + 50 because 2N⁴ grows so quickly
• Similarly, the 2 in 2N⁴ does not greatly influence the growth
• The final order of magnitude is O(N⁴)
• The other factors may be useful when comparing two very similar algorithms

Elephants and Goldfish

• Think about buying elephants and goldfish and comparing different pet suppliers
• The price of the goldfish is trivial compared to the cost of the elephants
• Similarly, the growth from 100N² + 10N + 50 is trivial compared to 2N⁴
• The smaller factors are essentially noise
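The elephants-and-goldfish point can be checked numerically. This sketch compares the full polynomial f(N) against its leading term alone; as N grows, the ratio approaches 1, showing the smaller terms are noise:

```python
def f(n):
    """The full cost function f(N) = 2N^4 + 100N^2 + 10N + 50."""
    return 2 * n ** 4 + 100 * n ** 2 + 10 * n + 50

for n in [10, 100, 1000]:
    leading = 2 * n ** 4  # the "elephant" term
    print(f"N={n}: f(N)={f(n)}, 2N^4={leading}, ratio={f(n) / leading:.4f}")
```

At N = 10 the lower-order terms still add about 50% to the total, but by N = 1000 they contribute well under 0.01%.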

Example: Phone Book Search

• Goal: Given a name, find the matching phone number in the phone book
• Algorithm 1: Linear search through the phone book until the name is found
• Best case: O(1) (it’s the first name in the book)
• Worst case: O(N) (it’s the final name)
• Average case: The name is near the middle, requiring N/2 steps, which is O(N)

Example: Phone Book Search (cont.)

Algorithm 2: Since the phone book is sorted, we can use a more efficient search
1) Check the name in the middle of the book
2) If the target name is less than the middle name, search the first half of the book
3) If the target name is greater, search the last half
4) Continue until the name is found
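The two phone-book algorithms can be sketched in Python as follows (a minimal sketch over a sorted list of names; the names themselves are made up for illustration):

```python
def linear_search(names, target):
    """Algorithm 1: scan every entry until the name is found. O(N) worst case."""
    for i, name in enumerate(names):
        if name == target:
            return i
    return -1

def binary_search(names, target):
    """Algorithm 2: repeatedly halve the search space. O(log2 N) worst case.
    Requires that `names` is sorted."""
    low, high = 0, len(names) - 1
    while low <= high:
        mid = (low + high) // 2
        if names[mid] == target:
            return mid
        elif target < names[mid]:
            high = mid - 1   # search the first half of the book
        else:
            low = mid + 1    # search the last half
    return -1

book = ["Adams", "Baker", "Chen", "Diaz", "Evans", "Ford", "Gupta"]
print(linear_search(book, "Diaz"))  # 3
print(binary_search(book, "Diaz"))  # 3
```

Both return the same index; the difference is how many entries they examine along the way.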
Example: Phone Book Search (cont.)

Algorithm 2 characteristics:
• Each step reduces the search space by half
• Best case: O(1) (we find the name immediately)
• Worst case: O(log₂N) (we find the name after cutting the space in half several times)
• Average case: O(log₂N) (it takes a few steps to find the name)

Example: Phone Book Search (cont.)

Which algorithm is better?
• For very small N, algorithm 1 may be faster
• For target names at the very beginning of the phone book, algorithm 1 can be faster
• Algorithm 2 will be faster in every other case
• The success of algorithm 2 relies on the fact that the phone book is sorted
- Data structures matter!

Sorting Revisited

• Sorting is a very common and useful operation
• Efficient sorting algorithms can produce large savings for many applications
• The algorithms are evaluated on:
- The number of comparisons made
- The number of times data is moved
- The amount of additional memory used

Sorting Efficiency

• Worst case: The data is in reverse order
• Average case: Random data, which may already be somewhat sorted
• Best case: The array is already sorted
• Typically, average-case and worst-case performance are similar, if not identical
• For many algorithms, the best case is also the same as the other cases
Straight Selection Sort

1) Set “current” to the first index of the array
2) Find the smallest value from current to the end of the array
3) Swap the smallest value with the value at current
4) Increment current and repeat steps 2–4 until the end of the array is reached

Analyzing Selection Sort

• A very simple, easy-to-understand algorithm
• N iterations are performed
• Iteration I checks N − I items to find the next smallest value
• There are N(N − 1)/2 comparisons in total
• Therefore, selection sort is O(N²)
• Even in the best case, it is still O(N²)

Figure 12.1 Example of straight selection sort (sorted elements are shaded)
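The four steps above can be sketched directly (a minimal in-place sketch; the sample data is illustrative):

```python
def selection_sort(values):
    """Straight selection sort, following the steps above. O(N^2) in every case."""
    n = len(values)
    for current in range(n):                 # steps 1 and 4: advance "current"
        smallest = current
        for i in range(current + 1, n):      # step 2: find the smallest remaining value
            if values[i] < values[smallest]:
                smallest = i
        # step 3: swap the smallest value into position "current"
        values[current], values[smallest] = values[smallest], values[current]

data = [5, 2, 9, 1, 7]
selection_sort(data)
print(data)  # [1, 2, 5, 7, 9]
```

Note that the inner loop always scans the whole unsorted portion, which is why even an already-sorted array costs O(N²) comparisons.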

Bubble Sort

1) Set “current” to the first index of the array
2) For every index from the end of the list down to current + 1, swap adjacent pairs of elements that are out of order
3) Increment current and repeat steps 2–3
4) Stop when current is at the end of the array

Bubble Sort (cont.)

• The name comes from how smaller elements “bubble up” to the top of the array
• The inner loop compares values[index] < values[index − 1] and swaps the two values if the comparison is true
• The smallest value is brought to the front of the unsorted portion of the array during each iteration

Figure 12.3 Example of bubble sort (sorted elements are shaded)
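The variant described above, which bubbles the smallest value toward the front, can be sketched as follows (sample data is illustrative):

```python
def bubble_sort(values):
    """Bubble sort as described above: each pass bubbles the smallest
    remaining value to the front of the unsorted portion. O(N^2)."""
    n = len(values)
    for current in range(n - 1):
        # Inner loop: from the end of the list down to current + 1,
        # swapping adjacent pairs that are out of order.
        for index in range(n - 1, current, -1):
            if values[index] < values[index - 1]:
                values[index], values[index - 1] = values[index - 1], values[index]

data = [5, 2, 9, 1, 7]
bubble_sort(data)
print(data)  # [1, 2, 5, 7, 9]
```

Adding a flag that stops the outer loop when a pass makes no swaps gives the O(N) best case discussed below.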


Insertion Sort

• Acts like inserting elements into a sorted array, moving elements down if necessary
• Uses swapping (like bubble sort) to find the correct position of the next item
• It may perform several swaps per iteration

Analyzing Bubble Sort

• Takes N − 1 iterations, because the last iteration puts the final two values in order
• Each iteration I performs N − I comparisons
• Bubble sort is therefore O(N²)
• Is the best case better? An already-sorted array needs only one iteration, so the best case is O(N)

Figure 12.5 Example of the insertion sort algorithm
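The swap-based insertion sort described above can be sketched like this (sample data is illustrative):

```python
def insertion_sort(values):
    """Insertion sort using adjacent swaps, as the slides describe:
    each new item is swapped backward until it reaches its place."""
    for current in range(1, len(values)):
        index = current
        # Swap the new item backward while it is smaller than its neighbor.
        while index > 0 and values[index] < values[index - 1]:
            values[index], values[index - 1] = values[index - 1], values[index]
            index -= 1

data = [5, 2, 9, 1, 7]
insertion_sort(data)
print(data)  # [1, 2, 5, 7, 9]
```

On an already-sorted array the while loop never fires, so only one comparison is made per element, giving the O(N) best case.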

Analyzing Insertion Sort

• O(N²), like the previous sorts
• Best case: O(N), since only one comparison is needed per element and no data is moved
• O(N²) is not good enough when sorting large sets of data!

O(N log₂N) Sorts

• Sorting a whole array is O(N²) with those sorts
• Splitting the array in half, sorting each half, and then merging the two halves is (N/2)² + (N/2)² = N²/2, half the work
• This “divide-and-conquer” approach can then be applied recursively to each half, giving an O(N log₂N) sort
