Difficulty: Medium, Asked-in: Amazon, Microsoft, Goldman Sachs, Yahoo
Key takeaway: An excellent problem for visualizing the use of stack operations in problem-solving.
Given a stack, write a program to sort the stack in ascending order. We are not allowed to make any assumptions about how the stack is implemented. The only operations to be used are the standard stack operations: push(x), pop(), top(), and isEmpty().
Example 1
Input Stack: [4, 3, 6, 5, 1, 2] -> top
Output Stack: [1, 2, 3, 4, 5, 6] -> top
Example 2
Input Stack: [4, 3, 2, 1] -> top
Output Stack: [1, 2, 3, 4] -> top
Sorting an array in place is possible because we can easily perform comparison, shifting, and swapping operations using indexes. But a stack is a LIFO data structure, where the last inserted element is removed first and only the top element is accessible. The critical question is: can we sort a stack in place? The answer is no, because the only way to access or update an intermediate element of a stack is to pop out the elements above it, and those popped elements require extra space elsewhere. Think!
Now one idea is clear: we need extra space to sort a stack. But the critical question is: why should that extra space be another stack? The answer is simple: while transferring elements between two stacks using push and pop operations, we can easily maintain the order of the elements. Think!
Solution Idea
The solution idea is: we use another stack (tempStack) to maintain the elements in sorted order. We pop elements from the input stack and push them onto the temporary stack one by one, as long as each popped element is greater than or equal to the top element of the temporary stack.
When the popped element is less than the top element of the temporary stack, we need to find its correct position in the sorted order. For this, we pop all elements of the temporary stack that are greater than the current element and push them back onto the input stack. Now we push the current element onto the temporary stack; the elements moved back to the input stack get re-processed and inserted above it in later iterations.
Solution Steps

Here is a step-by-step explanation:

1. Create an empty stack tempStack.
2. Pop the top element of the input stack; call it topInputStack.
3. While tempStack is not empty and tempStack.top() > topInputStack, pop from tempStack and push that element onto the input stack.
4. Push topInputStack onto tempStack.
5. Repeat steps 2-4 until the input stack is empty. Now tempStack holds the elements in sorted order, so we return it.
Solution Pseudocode
stack sortStack(stack s)
{
    stack tempStack
    while (!s.empty())
    {
        int topInputStack = s.pop()
        while (!tempStack.empty() && tempStack.top() > topInputStack)
        {
            int topTempStack = tempStack.pop()
            s.push(topTempStack)
        }
        tempStack.push(topInputStack)
    }
    return tempStack
}
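The pseudocode above can be translated into a runnable Python sketch, modelling a stack as a Python list whose last element is the top (the function and variable names here are illustrative):

```python
def sort_stack(s):
    """Sort stack s (a Python list, top at the end) using one temporary stack.

    Returns a stack that reads ascending from bottom to top, i.e. the
    largest element is on top."""
    temp_stack = []
    while s:
        top_input = s.pop()
        # Move back every element larger than the current one, so that
        # top_input lands at its correct position in the sorted order.
        while temp_stack and temp_stack[-1] > top_input:
            s.append(temp_stack.pop())
        temp_stack.append(top_input)
    return temp_stack

stack = [4, 3, 6, 5, 1, 2]   # rightmost element (2) is the top
print(sort_stack(stack))     # [1, 2, 3, 4, 5, 6] -> top
```

Note that the moved-back elements are not re-inserted explicitly; the outer loop simply processes them again, which is exactly what the pseudocode does.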
Time and space complexity analysis
We pop elements from the tempStack repeatedly to insert a new element. So we need to identify the worst-case scenario, where the above algorithm performs the maximum number of stack operations. Such a scenario occurs when the input stack is sorted in decreasing order from top to bottom, i.e. the largest element is on top. Why? Because then every element popped from the input stack is smaller than all elements already in the tempStack, so the entire tempStack must be emptied before each insertion. Let's think!
Another analysis idea: we need to perform 1 push and pop operation in the 1st iteration, 2 push and pop operations in the 2nd iteration, and so on. In the last iteration, (n-1) sorted elements are present in the tempStack and 1 element in the input stack, so we need to perform n push and pop operations to place the remaining element into the tempStack in sorted order. Total operation count = 1 + 2 + ... + n = n(n + 1)/2. So time complexity = O(n^2).
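To see this quadratic growth concretely, here is an instrumented Python sketch (the counter is only for illustration), assuming the same list-as-stack model with the top at the end of the list:

```python
def sort_stack_counted(s):
    """Same temp-stack algorithm, but returns (sorted_stack, push_count)."""
    temp_stack, pushes = [], 0
    while s:
        top_input = s.pop()
        while temp_stack and temp_stack[-1] > top_input:
            s.append(temp_stack.pop())
            pushes += 1
        temp_stack.append(top_input)
        pushes += 1
    return temp_stack, pushes

# Worst case: largest element on top, so s = [1, 2, ..., n] with n at the top.
for n in [4, 8, 16]:
    _, pushes = sort_stack_counted(list(range(1, n + 1)))
    print(n, pushes)   # push count grows quadratically with n
```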
Space complexity = O(n) as we are using a tempStack to store the sorted elements.
Solution Idea: Recursive Approach
This idea is similar to recursive insertion sort: we pop the top element from the stack and recursively sort the remaining (n-1)-size stack using the same function. After sorting the (n-1)-size stack, we use a helper function, insertTop(s, topElement), to insert the popped element at its correct position in the sorted stack.
Implementation of the insertTop(s, topElement) function
Solution Pseudocode
void insertTop(stack s, int topElement)
{
    if (isEmpty(s) || topElement > s.top())
    {
        s.push(topElement)
        return
    }
    int temp = s.pop()
    insertTop(s, topElement)
    s.push(temp)
}
void sortStack(stack s)
{
    if (!isEmpty(s))
    {
        int topElement = s.pop()
        sortStack(s)
        insertTop(s, topElement)
    }
}
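Assuming the same list-as-stack model as before (top at the end of the list), the recursive pseudocode maps to this runnable Python sketch:

```python
def insert_top(s, top_element):
    """Insert top_element at its correct position in already-sorted stack s."""
    if not s or top_element > s[-1]:
        s.append(top_element)
        return
    temp = s.pop()                  # hold one element on the call stack
    insert_top(s, top_element)
    s.append(temp)                  # restore it above the inserted element

def sort_stack_recursive(s):
    """Sort stack s in place; the largest element ends up on top."""
    if s:
        top_element = s.pop()
        sort_stack_recursive(s)     # sort the remaining (n-1) elements
        insert_top(s, top_element)  # insert the held element in order

stack = [4, 3, 6, 5, 1, 2]
sort_stack_recursive(stack)
print(stack)   # [1, 2, 3, 4, 5, 6] -> top
```

Each recursion level holds exactly one popped element, which is where the O(n) extra space hides.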
Time and space complexity analysis
We are solving the sorting problem of size n by recursively solving the sorting problem of size n-1 and then inserting the popped top element into the sorted stack of size n-1. In the worst case, inserting the top element takes O(n) time, because we may need to pop out all the sorted elements before pushing it. Think!
Recurrence relation: T(n) = T(n-1) + O(n). Expanding this gives n + (n-1) + ... + 1 = n(n + 1)/2. This is similar to the recurrence relation of recursive insertion sort or the worst-case time complexity of quicksort. So time complexity = O(n^2).
This algorithm uses a recursion call stack of size O(n) to hold the popped elements during sorting. Space complexity = O(n).