Why do we study the design, code, and analysis of sorting algorithms? This blog will analyze and compare various sorting algorithms based on parameters such as time complexity, space complexity, in-place behaviour, stability, online vs. offline operation, and problem-solving approach.
Comparison-based sorting algorithms
In comparison-based sorting algorithms, we compare elements to determine their order in the final sorted output. Any comparison-based sorting algorithm must perform Ω(nlogn) comparisons to sort an array of n elements in the worst case, so nlogn is a lower bound for this entire class of algorithms. (Think!)
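The lower bound follows from a decision-tree argument, sketched here:

```latex
% A comparison sort on n elements can be modeled as a binary decision tree
% in which each internal node is one comparison and each leaf is one of the
% n! possible output permutations. A binary tree of height h has at most
% 2^h leaves, so:
2^h \ge n! \quad\Rightarrow\quad h \ge \log_2(n!) = \Theta(n \log n)
% (the last step uses Stirling's approximation), hence some input forces
% at least \Omega(n \log n) comparisons.
```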
As we have seen, the worst-case time complexities of the above sorting algorithms can be categorized into two parts: O(n^2) and O(nlogn).
In addition to comparisons, these sorting algorithms also perform other operations, such as swaps and assignments. But the count of these operations is always at most proportional to the count of comparisons. That's why the comparison operation is the deciding factor in the time complexity.
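As a small sketch of this claim, we can instrument insertion sort (a hypothetical helper, not from the blog) to count both comparisons and element moves, and observe that moves never exceed comparisons:

```python
def insertion_sort_with_counts(arr):
    """Insertion sort that counts comparisons and element moves."""
    a = list(arr)
    comparisons = moves = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1          # one comparison per inner-loop step
            if a[j] > key:
                a[j + 1] = a[j]       # shift the larger element right
                moves += 1            # every move is preceded by a comparison
                j -= 1
            else:
                break
        a[j + 1] = key
    return a, comparisons, moves

sorted_a, c, m = insertion_sort_with_counts([5, 2, 4, 6, 1, 3])
# Since each shift happens only after a comparison, moves <= comparisons.
```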
Linear time sorting algorithms
There are sorting algorithms that run faster than O(nlogn), but they require special assumptions about the input sequence to determine the sorted order of elements. These sorting algorithms use operations other than comparisons to determine the sorted order and work in O(n) time. So the Ω(nlogn) lower bound does not apply to them.
Examples of sorting algorithms that run in linear time are counting sort, radix sort, bucket sort, etc. Counting sort and radix sort assume that the input consists of integers in a small range, while bucket sort assumes that the input is generated by a random process that distributes elements uniformly over an interval.
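Here is a minimal counting sort sketch, assuming non-negative integers in a small known range (the `max_value` parameter is an assumption for illustration):

```python
def counting_sort(arr, max_value):
    """Sort non-negative integers in O(n + k) time, where k = max_value + 1.

    No comparisons between elements: values are used directly as indices.
    """
    count = [0] * (max_value + 1)
    for x in arr:                  # count occurrences of each value
        count[x] += 1
    output = []
    for value, freq in enumerate(count):
        output.extend([value] * freq)   # emit each value freq times, in order
    return output

counting_sort([4, 2, 2, 8, 3, 3, 1], 8)   # → [1, 2, 2, 3, 3, 4, 8]
```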
Special conditions with linear time sorting algorithms
Here is the comparison of time and space complexities of some popular sorting algorithms:
A sorting algorithm is in-place if it does not use extra space proportional to the input size, though it may require a small amount of additional space for its operation (for example, the recursion stack of quicksort). Equivalently, a sorting algorithm sorts in place if only a constant number of input array elements are ever stored outside the array.
A sorting algorithm is stable if it does not change the relative order of elements with equal values.
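A quick sketch of what stability means in practice, using Python's built-in `sorted` (which is guaranteed stable) on hypothetical (name, score) records:

```python
records = [("alice", 2), ("bob", 1), ("carol", 2), ("dave", 1)]

# Sort by score only. Because sorted() is stable, records with equal
# scores keep their original relative order: bob before dave, alice
# before carol.
by_score = sorted(records, key=lambda r: r[1])
# → [("bob", 1), ("dave", 1), ("alice", 2), ("carol", 2)]
```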
An algorithm that can accept new elements while the sorting process is going on is called an online sorting algorithm. An online algorithm processes its input piece by piece, in serial order. In other words, online sorting algorithms can sort the data without having the entire input available from the beginning.
For example, insertion sort considers one input element per iteration and produces a partially sorted solution without considering future elements. Because insertion sort maintains a sorted prefix, we can place each input item in its proper position as we receive it. So insertion sort is an online sorting algorithm.
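A sketch of this online behaviour, treating the input as a stream (here we use `bisect.insort`, which performs the same "place the new item in its proper position" step as insertion sort's inner loop):

```python
import bisect

def online_sort(stream):
    """Consume elements one at a time, keeping a sorted list at every step."""
    sorted_so_far = []
    for x in stream:
        # Place x in its proper position using only the elements seen
        # so far; no future elements are needed.
        bisect.insort(sorted_so_far, x)
        yield list(sorted_so_far)   # a valid partial answer after each item

snapshots = list(online_sort([3, 1, 2]))
# → [[3], [1, 3], [1, 2, 3]]
```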
In contrast, an offline algorithm needs the complete input in memory from the beginning; that is, all items must be available before sorting can begin. For example, selection sort sorts an array by repeatedly finding the minimum element in the unsorted part and putting it at the beginning. Finding that minimum always requires access to the entire input, so selection sort is an offline algorithm.
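A minimal selection sort sketch; note how the inner loop must scan every remaining element to find the minimum, which is why the whole array has to be available up front:

```python
def selection_sort(a):
    """Sort a list in place by repeatedly selecting the minimum element."""
    n = len(a)
    for i in range(n):
        min_idx = i
        for j in range(i + 1, n):       # must see every remaining element
            if a[j] < a[min_idx]:
                min_idx = j
        a[i], a[min_idx] = a[min_idx], a[i]   # swap: O(1) extra space
    return a

selection_sort([64, 25, 12, 22, 11])   # → [11, 12, 22, 25, 64]
```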
Enjoy learning, Enjoy sorting!