# Sorting Algorithms: Bubble Sort, Selection Sort and Insertion Sort

Input: An array X[] of n integers

Output: A permutation of the input such that X[0] <= X[1] <= .... <= X[n-2] <= X[n-1]

### Bubble Sort Algorithm

Bubble sort is a simple and inefficient sorting algorithm. It is generally one of the first algorithms taught in programming to develop an intuition about how algorithms work. It is sometimes also referred to as "sinking sort"!

Bubble Sort Idea

If we look at the elements of a sorted array: the largest element is at the (n - 1)th index, the 2nd largest is at the (n - 2)th index, and so on. So one basic idea would be to traverse the array from 0 to n - 1 and place the largest element at the (n - 1)th index, then traverse the array from 0 to n - 2 and place the second largest element at the (n - 2)th index, and so on.

• After the end of the 1st iteration, the largest element in the array bubbles up to the (n - 1)th index. Similarly, after the 2nd iteration, the 2nd largest element bubbles up to the (n - 2)th index. And so on!
• In general, at the ith iteration, we place the ith maximum element at the (n - i)th index. This process goes on until the whole array becomes sorted.

Bubble Sort Steps

We run two nested loops:

• At each stage of the outer loop, we place one value at its correct position in the sorted output. So the outer loop runs n times, from i = 0 to n - 1.
• After the first i iterations of the outer loop, the i largest elements are placed at indices (n - i) to (n - 1). At the (i + 1)th iteration of the outer loop, we place the (i + 1)th largest element at the (n - i - 1)th index. So the inner loop needs to run (n - i - 1) times.

```
for (int i = 0; i < n; i = i + 1)
{
    for (int j = 0; j < n - i - 1; j = j + 1)
    {
        ........
    }
}
```
• Now the critical question is: what is the logic inside the inner loop? The idea would be to start from the first index, comparing adjacent pairs of elements from j = 0 to n - i - 2 and swapping them if they are in the wrong order.

```
for (int j = 0; j < n - i - 1; j = j + 1)
{
    if (X[j] > X[j + 1])
    {
        temp = X[j]
        X[j] = X[j + 1]
        X[j + 1] = temp
    }
}
```

Bubble Sort Pseudocode

```
void bubbleSort(int X[], int n)
{
    for (int i = 0; i < n; i = i + 1)
    {
        for (int j = 0; j < n - i - 1; j = j + 1)
        {
            if (X[j] > X[j + 1])
            {
                int temp = X[j]
                X[j] = X[j + 1]
                X[j + 1] = temp
            }
        }
    }
}
```

### Bubble Sort Analysis

We run two nested loops where comparison and swapping are the key operations. Regardless of the input, comparison operations execute every time. On the other hand, swapping depends on the order of the input: a swap happens only when the comparison X[j] > X[j + 1] is true. So comparison is the critical operation that decides the time complexity in the worst case.

For the ith iteration of the outer loop, the inner loop runs (n - i - 1) times. So the total count of loop iterations = sum of (n - i - 1) for i = 0 to n - 1 = (n - 1) + (n - 2) + . . . + 2 + 1 = n(n - 1)/2 = O(n^2)

The worst-case scenario of the bubble sort

When the input is sorted in decreasing order, the comparison X[j] > X[j + 1] is true at every iteration of the inner loop, and the swap operation executes every time. So a reverse sorted array is the worst-case input for bubble sort.

• Total count of comparison operations = Total count of loop iterations = O(n^2)
• Total count of swapping operations = Total count of comparison operations = O(n^2)
• The time complexity of bubble sort in the worst case = O(n^2) + O(n^2) = O(n^2)

The best-case scenario of the bubble sort

When the input is already sorted in increasing order, the comparison X[j] > X[j + 1] is false at every iteration of the inner loop, and the swap operation never executes. So a sorted array is the best-case input for bubble sort.

• Total count of comparison operations = Total count of loop iterations = O(n^2)
• Total count of swapping operations = 0
• The time complexity of bubble sort in the best case = O(n^2)

In both the worst and best cases, bubble sort runs in O(n^2) time complexity. We use constant extra space, so the space complexity of the bubble sort = O(1).

### Optimized Bubble Sort

Even if the array is already sorted, the above algorithm always runs in O(n^2) time. From another perspective: if the array becomes sorted in the middle of the nested loops, there is no mechanism in the code to stop, and the loops keep running unnecessarily. We can use this insight to optimize bubble sort further. The idea would be to stop the algorithm if the inner loop does not cause any swap! But the critical question is: how do we detect this? Let's think!

If no element gets swapped inside the inner loop, the array is sorted, and we can stop the bubble sort algorithm. We can detect this stage using a boolean flag elementSwapped set inside the inner loop.

• Before starting each inner loop, we set the flag elementSwapped to false.
• Whenever a swap happens inside the inner loop, we set elementSwapped to true.
• At the end of the inner loop, if elementSwapped is still false, no swap happened inside the inner loop, the array is already sorted, and we can stop the sorting process. Otherwise, if elementSwapped is true, we go to the next iteration of the outer loop.

Optimized Bubble Sort Pseudocode

```
void optimizedBubbleSort(int X[], int n)
{
    bool elementSwapped
    for (int i = 0; i < n; i = i + 1)
    {
        elementSwapped = false
        for (int j = 0; j < n - i - 1; j = j + 1)
        {
            if (X[j] > X[j + 1])
            {
                int temp = X[j]
                X[j] = X[j + 1]
                X[j + 1] = temp
                elementSwapped = true
            }
        }
        if (elementSwapped == false)
            break
    }
}
```

In the above implementation, if the array is already sorted, bubble sort makes no swaps, and the algorithm terminates after a single pass. So O(n) is the best-case running time of the optimized bubble sort. But there is no improvement in the worst-case time complexity, which is still O(n^2) for a reverse sorted array.

### Critical ideas to think in Bubble Sort

• What would be the average case time complexity of the bubble sort?
• What is the idea of "stability" in sorting algorithms? Is bubble sort a stable sorting algorithm?
• Can we modify the above pseudocode to sort elements in decreasing order?
• Can we implement the bubble sort recursively?
• How can we sort linked lists and strings using the bubble sort algorithm?

### Selection Sort Algorithm

Selection Sort Idea

The idea of selection sort is simple: we traverse the whole array to find the smallest element. Once we find it, we swap the smallest element with the first element of the array. Then we again look for the smallest element in the remaining array (excluding the first element) and swap it with the second element. This process goes on until the whole array gets sorted. In general, at the ith step, the algorithm maintains two subarrays:

• The sorted subarray X[0…i - 1].
• The unsorted subarray X[i…n - 1].

We build the partially sorted array incrementally by finding the min value from the unsorted part and adding it to the sorted part. At each step of the iteration, our sorted subarray size will grow by 1, and the unsorted subarray size will decrease by 1.

Selection Sort Steps

We need to run two nested loops: an outer loop to place all n elements in sorted order, and, for every ith iteration of the outer loop, an inner loop to find the smallest element in the remaining array. We run the outer loop from i = 0 to n - 2 and repeat the following steps until the array is sorted.

• We initialize a variable minIndex to store the minimum value index of the unsorted part.
• Now for every index i, we run an inner loop from j = i to n - 1 and find the index of the minimum value in the unsorted part.

```
int minIndex = i
for (int j = i; j < n; j = j + 1)
{
    if (X[j] < X[minIndex])
        minIndex = j
}
```
• Now we place the minimum value at the ith index by swapping the values at indices i and minIndex, i.e., swap(X[minIndex], X[i]).

Selection Sort Pseudocode

```
void selectionSort(int X[], int n)
{
    for (int i = 0; i < n - 1; i = i + 1)
    {
        int minIndex = i
        for (int j = i; j < n; j = j + 1)
        {
            if (X[j] < X[minIndex])
                minIndex = j
        }
        swap(X[minIndex], X[i])
    }
}
```

### Selection Sort Analysis

We run two nested loops where comparison, swapping, and updating minIndex are the key operations. Regardless of the input, comparison and swapping operations execute every time, but updates of minIndex depend on the order of the input. Think!

At the ith iteration of the outer loop, the inner loop runs (n - i) times. So the total count of loop iterations = sum of (n - i) for i = 0 to n - 2 = n + (n - 1) + . . . + 3 + 2 = n(n + 1)/2 - 1 = O(n^2)

The worst-case scenario of the selection sort

When the input is sorted in decreasing order, the comparison if (X[j] < X[minIndex]) is true most of the time, and the value of minIndex gets updated frequently. So a reverse sorted array is a worst-case input for selection sort.

• Total count of comparison operations = Total count of loop iterations = O(n^2)
• We perform one swap at each iteration of the outer loop. Total count of swapping operations = n * O(1) = O(n)
• Total count of minIndex updates = Total count of loop iterations = O(n^2)
• So, the time complexity of selection sort in the worst case = O(n^2) + O(n) + O(n^2) = O(n^2)

The best-case scenario of the selection sort

When the input is already sorted, the comparison if (X[j] < X[minIndex]) is false every time, and the value of minIndex never gets updated after its initialization. So a sorted array is a best-case input for selection sort.

• Total count of comparison operations = Total count of loop iterations = O(n^2)
• Total count of swap operations = O(n)
• Total count of minIndex updates inside the inner loop = 0
• So the time complexity of selection sort in the best case = O(n^2) + O(n) = O(n^2)

So in both the worst and best cases, selection sort runs in O(n^2) time complexity. We use constant extra space, so space complexity = O(1).

### Critical ideas to think in Selection Sort

• What would be the average case time complexity of the selection sort?
• Is selection sort a stable sorting algorithm?
• The above pseudocode sorts the array in increasing order. How would you modify it to sort the elements in decreasing order?
• How do we implement the selection sort recursively?
• How can we sort linked lists and strings using selection sort?

### Insertion Sort Algorithm

Insertion Sort Idea

Insertion sort works the way many people sort a hand of playing cards:

• We start with an empty left hand and all the cards face down on the table.
• Then we remove one card [key] at a time from the table [unsorted array] and insert it into the correct position in the left hand [sorted array].
• To find the correct position for the card, we compare it with each card already in the hand, from right to left.

The core idea is: if the first few elements of an array are sorted, we can easily insert a new element at its proper place in the sorted part. At any ith step, the algorithm maintains two subarrays:

• The sorted subarray X[0…i - 1].
• The unsorted subarray X[i…n-1].

We incrementally build the partially sorted array: at each iteration step, we pick the first value X[i] from the unsorted part and insert it into the sorted part X[0…i - 1]. At each step, the sorted part grows by 1 and the unsorted part shrinks by 1. Now the critical question is: how do we implement this idea? Let's think!

Insertion Sort Steps

We run a loop from i = 1 to n - 1 and repeat the following steps until the whole array is sorted. We start from i = 1 because a subarray with a single element (i = 0) is already sorted.

• We initialize a variable currValue with X[i]. Now we need to place currValue at the correct position of the partially sorted array X[0…i - 1].
• For this, we run an inner loop from j = i - 1 down to 0 to find the index at which to insert currValue in the partially sorted array X[0…i - 1]. During this procedure, we move elements greater than currValue one position to the right of their current position. Note: the inner loop runs while j >= 0 && X[j] > currValue. Think!

```
int currValue = X[i]
int j = i - 1
while (j >= 0 && X[j] > currValue)
{
    X[j + 1] = X[j]
    j = j - 1
}
```
• After the inner loop ends, j + 1 is the correct position for currValue in the partially sorted array. So we insert currValue at the (j + 1)th index, i.e., X[j + 1] = currValue. Think!
• Now we move to the next iteration of the outer loop and repeat a similar process to insert the remaining elements from the unsorted part into the partially sorted part.
• After the completion of the outer loop, the array X[] contains the sorted output.

Insertion Sort Pseudocode

```
void insertionSort(int X[], int n)
{
    for (int i = 1; i < n; i = i + 1)
    {
        int currValue = X[i]
        int j = i - 1
        while (j >= 0 && X[j] > currValue)
        {
            X[j + 1] = X[j]
            j = j - 1
        }
        X[j + 1] = currValue
    }
}
```

### Insertion Sort Analysis

The best-case scenario of the insertion sort

This is the situation of an already sorted array, where the outer loop runs n - 1 times and the inner loop condition is checked only once per iteration. Why? Because every time we enter the inner loop, the condition X[j] > currValue is false, so we exit immediately. In other words: the ith value is already at its correct position, and we perform only one comparison at each iteration of the outer loop.

• Total comparison operations = n - 1 = O(n)
• Total shifting operations = 0 = O(1)
• So, the best case time complexity of insertion sort = O(n) + O(1) = O(n)

The worst-case scenario of the insertion sort

This is the situation of a reverse sorted array, where the outer loop runs n - 1 times, and at the ith iteration of the outer loop, the inner loop runs i times. Why? Because the condition X[j] > currValue is true for every j from i - 1 down to 0. In other words: to insert the ith value at its correct position, we shift all values from index i - 1 down to 0 one position to the right.

• At ith iteration of the outer loop: count of comparison operations = i, count of shifting operations = i
• Total comparison operations = 1 + 2 + 3 + ….+ n - 2 + n - 1 = n(n - 1)/2 = O(n²)
• Total shifting operations = 1 + 2 + 3 + ….+ n - 2 + n - 1 = n(n - 1)/2 = O(n²)
• So the worst-case time complexity of insertion sort = O(n²) + O(n²) = O(n²)

### Critical ideas to think in Insertion Sort

• Insertion sort is one of the fastest sorting algorithms for small input sizes.
• Insertion sort is one of the fastest sorting algorithms for partially or almost sorted input data: the time complexity is O(kn) when each element is no more than k places away from its sorted position. Think!
• Insertion sort is a stable sorting algorithm.
• We can optimize insertion sort further using binary search.
• Is insertion sort a more efficient sorting algorithm than selection and bubble sort?
• The average case time complexity of insertion sort is closer to the worst-case time complexity, i.e., O(n²).
• The above pseudocode sorts the array in increasing order. How do we modify it to sort the elements in decreasing order?
• How can we implement insertion sort recursively?
• How can we sort linked lists and strings using insertion sort?

### Critical concepts to explore further in sorting

• Concept of Stability in sorting algorithms
• What is comparison-based sorting? Explore the design, code, and analysis of merge sort, quicksort, heap sort, shell sort, BST sort, etc.
• Comparison of various sorting algorithms
• Linear time sorting algorithms: Counting sort, Radix sort, Bucket sort
• Searching in a sorted array: Binary search, Exponential Search
• Problem-solving using sorting
• Real-life applications of sorting

### Sorting based coding problems to practice

• Sort an array of 0s, 1s, and 2s, Sort an array in a waveform
• Sort an array according to the order defined by another array
• Sort the even positioned elements in the ascending order and the odd positioned elements in the descending order.
• Count swaps required to sort an array using Insertion Sort
• Merge two sorted arrays, find the intersection of two sorted arrays
• Binary Insertion Sort, Recursive Insertion Sort

### Content references

• Introduction to Algorithms by Cormen, Leiserson, Rivest, and Stein (CLRS)
• The Algorithm Design Manual by Steven Skiena

Enjoy learning, Enjoy algorithms!