Difficulty: Medium Asked-in: Amazon, LinkedIn, Facebook
Key takeaway: An excellent problem for learning problem solving using sorting and the two-pointers approach.
Write a program to count the number of possible triangles that can be formed with three elements from a given unsorted array of positive integers. The three elements, representing the lengths of the sides of a triangle, must satisfy the triangle inequality: the sum of any two sides must be greater than the third side. In other words, x + y > z, y + z > x, and z + x > y must all be true for the triplet (x, y, z) to form a valid triangle.
Assume that all array elements are positive integers.
Input: X[] = [4, 6, 3, 7], Output = 3
Explanation: There are three triangles possible: {3, 4, 6}, {4, 6, 7} and {3, 6, 7}.
Input: X[] = [10, 21, 22, 100, 101, 200, 300], Output = 6
Explanation: There can be 6 possible triangles: {10, 21, 22}, {21, 100, 101}, {22, 100, 101}, {10, 100, 101}, {100, 101, 200} and {101, 200, 300}.
One basic solution is to consider every combination of three elements in the given array and check if they satisfy the triangle inequality (the sum of any two sides must be greater than the third side). As we do this, we also keep a count of the number of combinations that satisfy this condition.
To implement this, we use three nested loops that track indices i < j < k. For each triplet (X[i], X[j], X[k]), we check whether the three sides form a valid triangle. If they do, we increment the count by 1. After all iterations are completed, we return the total count of triangles found.
int triangleCount(int X[], int n)
{
    int count = 0;
    for (int i = 0; i < n - 2; i = i + 1)
    {
        for (int j = i + 1; j < n - 1; j = j + 1)
        {
            for (int k = j + 1; k < n; k = k + 1)
            {
                if (X[i] + X[j] > X[k]
                    && X[i] + X[k] > X[j]
                    && X[k] + X[j] > X[i])
                {
                    count = count + 1;
                }
            }
        }
    }
    return count;
}
The outer for loop iterates n - 2 times, the middle for loop iterates (n - i - 2) times, and the inner for loop iterates (n - j - 1) times. Each loop runs O(n) times in the worst case, so the overall time complexity is O(n) * O(n) * O(n) = O(n^3).
From another perspective, we are exploring all possible combinations of three sides of a triangle. So the total number of possibilities = the number of ways to pick three elements among n elements = nC3 = n(n - 1)(n - 2)/6 = O(n^3). For example, when n = 7, there are 7 * 6 * 5/6 = 35 triplets to check.
We are using constant extra space, so space complexity is O(1).
The critical question now is: can we improve the time complexity further? Is it possible to preprocess the array and efficiently identify all possible triangles? One idea is to sort the given array. If we consider a triplet (x, y, z) such that x ≤ y ≤ z, we only need to check that x + y > z to determine whether the triangle formed by them is valid. The reason is simple: since z ≥ y, z ≥ x, and all sides are positive, adding a positive side to z always produces a sum greater than x or y. Therefore, the inequalities z + x > y and z + y > x are satisfied implicitly by the ordering x ≤ y ≤ z.
To determine the number of elements that satisfy the inequality X[k] < X[i] + X[j], we first sort the array X[]. We then consider each pair of elements (X[i], X[j]) such that j > i. Since the array is sorted, as we move the index k to the right, the value of X[k] can only increase or stay the same. This means there is a rightmost limit on k such that X[k] < X[i] + X[j]; any elements beyond this index will not satisfy the inequality.
To find this right-limit value of k, we can use binary search since the array X is sorted. Once we find it, we can conclude that all the elements in the range [j + 1, k - 1] (inclusive) satisfy the required inequality. So the count of elements satisfying the inequality is (k - 1) - (j + 1) + 1 = k - j - 1.
// find the farthest index k of the third side
// whose length is less than the sum of the other two sides.
int binarySearch(int A[], int l, int r, int twoSidesSum)
{
    while (l <= r)
    {
        // written as l + (r - l) / 2 to avoid integer overflow
        int mid = l + (r - l) / 2;
        if (A[mid] >= twoSidesSum)
            r = mid - 1;
        else
            l = mid + 1;
    }
    return l;
}
int triangleCount(int X[], int n)
{
    int count = 0;
    sort(X, X + n);
    for (int i = 0; i < n - 2; i++)
    {
        int k = i + 2;
        for (int j = i + 1; j < n - 1; j++)
        {
            // continue the search from the previous right limit k
            k = binarySearch(X, k, n - 1, X[i] + X[j]);
            count = count + k - j - 1;
        }
    }
    return count;
}
Another important point: when we find the right-limit index k for a specific pair (i, j), we don't need to restart the search from index j + 2 when we choose a higher value of j for the same i. Since the sum X[i] + X[j] only grows as j increases, the right limit can only move further right, so we can simply continue from the previous k. This shrinks the range of each binary search, which improves the performance of the algorithm in practice.
Suppose we are using some efficient O(nlogn) sorting algorithm. So overall time complexity = Time complexity of sorting + Total number of loop iterations * Time complexity of binary search = O(nlogn) + O(n^2) * O(logn) = O(nlogn) + O(n^2 logn) = O(n^2 logn). If we use iterative binary search and an in-place sorting algorithm like heap sort, space complexity = O(1).
In the above approach, we apply binary search O(n^2) times to find the right limit of the index k for every pair of indices (i, j). Can we improve this further? Here is a better idea!
Since the array is sorted, we can find the right limit of k by traversing linearly and stopping at the first value of k that does not satisfy the inequality X[i] + X[j] > X[k]. The count of elements X[k] satisfying this inequality for a particular pair (i, j) is again k - j - 1.
To find the number of possible triangles with sides of lengths given in the array X, we first sort the array. Then we run a nested loop: the outer loop fixes the index i of the smallest side and initializes k = i + 2, and the inner loop fixes the index j of the middle side. Inside the inner loop, we advance k linearly (continuing from where the previous iteration left off) while X[i] + X[j] > X[k], then add k - j - 1 to the count.
int triangleCount(int X[], int n)
{
    int count = 0;
    sort(X, X + n);
    for (int i = 0; i < n - 2; i = i + 1)
    {
        int k = i + 2;
        for (int j = i + 1; j < n - 1; j = j + 1)
        {
            // find the farthest index k of the third side
            // whose length is less than the sum of the other two sides.
            while (k < n && X[i] + X[j] > X[k])
                k = k + 1;
            count = count + (k - j - 1);
        }
    }
    return count;
}
The code contains three nested loops, which may suggest a time complexity of O(n^3). However, on closer examination, the complexity is actually better.
In the outer loop, k is initialized to i + 2 and is never reset inside the inner for loop; it only moves forward. So across one full iteration of the outer loop, the while loop runs at most n - i - 2 times in total, and the inner for loop runs n - i - 2 times. The total work for each iteration of the outer loop is therefore at most (n - i - 2) + (n - i - 2) = 2n - 2i - 4 = O(n).
Therefore, the outer loop runs O(n) times, and each iteration performs O(n) work. The time complexity of the nested loops is O(n) * O(n) = O(n^2), which is much more efficient than O(n^3).
If we use an efficient O(nlogn) sorting algorithm, the overall time complexity is O(nlogn) + O(n^2) = O(n^2). Since we use a constant extra variable, the space complexity depends on the space complexity of the sorting algorithm.
Suppose we have a sorted array and want to find all possible triangles. We consider each element X[i] as the largest side of a triangle and apply the two-pointer approach to find the other two sides of a valid triangle in the subarray from l = 0 to r = i - 1.
For each iteration of the outer loop, we run a two-pointer inner loop to pick the other two sides of the triangle. The pointers start at opposite ends of the subarray (l = 0 and r = i - 1) and move toward each other, and the loop runs while l < r.
int triangleCount(int X[], int n)
{
    sort(X, X + n);
    int count = 0;
    for (int i = n - 1; i >= 2; i = i - 1)
    {
        int l = 0;
        int r = i - 1;
        while (l < r)
        {
            if (X[l] + X[r] > X[i])
            {
                // every index in [l, r - 1] paired with r and i
                // also forms a valid triangle, since the array is sorted
                count = count + r - l;
                r = r - 1;
            }
            else
                l = l + 1;
        }
    }
    return count;
}
Suppose we are using some efficient O(nlogn) sorting algorithm. So overall time complexity = Time complexity of sorting + Time complexity of the nested loops = O(nlogn) + O(n^2) = O(n^2). We are using constant extra variables, so the space complexity depends on the space complexity of the sorting algorithm.
Enjoy learning, enjoy coding, enjoy algorithms!