Sorting Algorithms
• Efficient sorting is important for optimizing the use of other algorithms (such
as search and merge algorithms) that require their input data to be sorted
Why Sorting?
“When in doubt, sort” – one of the principles of algorithm design. Sorting is used as a subroutine
in many algorithms:
The Ω(n log n) lower bound for comparison-based sorting is used to prove lower bounds for other problems
Merge sort
Worst-case running time Θ(n log n), but requires additional memory
The idea of insertion sort is similar to the way we sort playing cards in a hand.
Using linear search, find the location in the sorted portion where the first element of the
unsorted portion should be inserted.
Move all the elements after the insertion location up one position to make space for the
new element.
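The two steps above can be sketched in Python (a minimal sketch; the function name and in-place style are choices of this example):

```python
def insertion_sort(a):
    """Sort list a in place using insertion sort."""
    for i in range(1, len(a)):
        key = a[i]              # first element of the unsorted portion
        j = i - 1
        # shift larger sorted elements up one position to make space
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key          # insert the element into the gap
    return a
```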
Way of working
1. Selecting: repeatedly select the smallest (or largest) remaining element.
• Mathematical applications: searching for the greatest value, or the smallest value.
SELECTION SORT
Advantages
Easy to write
Disadvantages
The primary disadvantage of selection sort is its poor efficiency when dealing with a
huge list of items.
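The selection step described above can be sketched as follows (a minimal illustration; the helper name is this example's choice):

```python
def selection_sort(a):
    """Sort list a in place by repeatedly selecting the minimum."""
    n = len(a)
    for i in range(n - 1):
        min_idx = i
        # scan the unsorted portion a[i:] for the smallest element
        for j in range(i + 1, n):
            if a[j] < a[min_idx]:
                min_idx = j
        a[i], a[min_idx] = a[min_idx], a[i]  # swap it into place
    return a
```

The double loop is why selection sort always performs on the order of n² comparisons, which is the inefficiency on large lists noted above.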
BUBBLE SORT
“A procedure for sorting a set of items that begins by sequencing the first and second items,
then the second and third, and so on, until the end of the set is reached, and repeats this process
until all items are correctly sequenced.”
Advantages
It is easy to implement.
Disadvantages
It does not deal well with a list containing a huge number of items.
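The pairwise comparison procedure quoted above can be sketched in Python (a minimal sketch; the early-exit flag is a common refinement, not part of the quoted definition):

```python
def bubble_sort(a):
    """Sort list a in place by repeatedly swapping adjacent out-of-order pairs."""
    n = len(a)
    for i in range(n - 1):
        swapped = False
        # one pass: compare items 1&2, 2&3, ... bubbling the largest to the end
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:         # no swaps means the list is already sorted
            break
    return a
```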
QUICK SORT: Partition / Divide
Since each element ultimately ends up in the correct position, the algorithm
sorts correctly. But how long does it take? On this basis, the analysis is divided into the
following three cases.
1. Best Case
2. Worst Case
3. Average Case
Best Case for Quick Sort
The best case for divide-and-conquer algorithms comes when we split the input as evenly as
possible. Thus, in the best case, each subproblem is of size n/2. The partition step on each
subproblem is linear in its size, so the total time taken in the best case is O(n log₂ n).
Worst Case for Quicksort
Suppose instead our pivot element splits the array as unequally as possible. Thus instead of n/2
elements in the smaller half, we get zero, meaning that the pivot element is the biggest or
smallest element in the array. Each partition step then removes only one element, so there are
about n levels of recursion and the total time is O(n²).
The Average Case for Quicksort
Suppose we pick the pivot element at random in an array of n keys. Half the time, the pivot
element will be from the centre half of the sorted order. Whenever the pivot element is from
positions n/4 to 3n/4, the larger remaining sub-array contains at most 3n/4 elements.
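The random-pivot scheme just described can be sketched in Python (a minimal sketch using list partitioning rather than an in-place partition; the function name is this example's choice):

```python
import random

def quicksort(a):
    """Return a sorted copy of a using quicksort with a randomly chosen pivot."""
    if len(a) <= 1:
        return a
    pivot = random.choice(a)   # random pivot gives expected O(n log n) behaviour
    smaller = [x for x in a if x < pivot]
    equal   = [x for x in a if x == pivot]
    larger  = [x for x in a if x > pivot]
    # recursively sort the two partitions and concatenate
    return quicksort(smaller) + equal + quicksort(larger)
```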
Merge Sort