4.9
The main thing to notice is that we need an O([math]\displaystyle{ n^{k-1} \lg n }[/math]) solution.
For various values of k:
k    Solution Time Complexity
1    O([math]\displaystyle{ n^0 \lg n }[/math])
2    O([math]\displaystyle{ n^1 \lg n }[/math])
3    O([math]\displaystyle{ n^2 \lg n }[/math])
4    O([math]\displaystyle{ n^3 \lg n }[/math])
For k = 2 onwards:
1. Sort the array in O([math]\displaystyle{ n \lg n }[/math]) time.
2. Use k-1 nested loops to pick k-1 of the elements, where the i-th loop starts one position past the index chosen by the (i-1)-th loop, and then binary search the remaining suffix for the value that completes the sum.
e.g., for k = 3:
for( i = 0; i < n; i++ )
    for( j = i+1; j < n; j++ )
        // now binary search for ( T - a[i] - a[j] ) among a[j+1] .. a[n-1]
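For concreteness, here is a minimal C++ sketch of the k = 3 case; the function name threeSumExists and the choice of std::binary_search are mine, not from the text above.

#include <algorithm>
#include <vector>

// Returns true if three elements of 'a' (at distinct positions) sum to target.
// Two nested loops plus a binary search over the sorted suffix: O(n^2 lg n) overall.
bool threeSumExists(std::vector<long long> a, long long target) {
    std::sort(a.begin(), a.end());                       // step 1: sort, O(n lg n)
    int n = (int)a.size();
    for (int i = 0; i < n; i++)
        for (int j = i + 1; j < n; j++) {
            long long need = target - a[i] - a[j];
            // step 2: search only a[j+1 .. n-1] so each position is used at most once
            if (std::binary_search(a.begin() + j + 1, a.end(), need))
                return true;
        }
    return false;
}

The same pattern extends to larger k by adding one more nested loop per extra element.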
An alternative, recursive solution
First, we note that an O([math]\displaystyle{ n \lg n }[/math]) solution for [math]\displaystyle{ k = 2 }[/math] exists, which is already within the required bounds. Now, for [math]\displaystyle{ k \ge 3 }[/math], we can do something like this:
sort S ascending;
CheckSumOfK(S, k, T); // defined below

// S must be a sorted array of integers
function CheckSumOfK(S, k, T)
    if k <= 2:
        Use any method in the solution of ex 4.8 and return the result.
        (This can be done in O(n) time because S is sorted.)
    Initialize array A of n - 1 elements;
    for i from 0 to n - 1:
        m = 0
        // Collect in A all numbers in S except S[i]. Note that A will still be sorted.
        for j from 0 to n - 1:
            if i != j:
                A[m] = S[j]; m = m + 1
        // If S[i] is included in the k integers that sum up to T, then there must
        // be exactly (k - 1) integers in the rest of S (i.e., A) that sum to (T - S[i]).
        // We can find that out by calling ourselves recursively.
        if CheckSumOfK(A, k - 1, T - S[i]):
            return True
    return False
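A minimal C++ rendering of this pseudocode might look like the sketch below. The names checkSumOfK and twoSumSorted are mine, the two-pointer helper stands in for "any method in the solution of ex 4.8", and the k = 1 case is handled with a plain binary search (the pseudocode leaves it implicit). The caller is expected to sort the input first, exactly as above.

#include <algorithm>
#include <vector>

// O(n) check on a sorted vector: do two elements (at distinct positions) sum to target?
// Stands in for the k <= 2 base case ("any method in the solution of ex 4.8").
bool twoSumSorted(const std::vector<long long>& s, long long target) {
    int lo = 0, hi = (int)s.size() - 1;
    while (lo < hi) {
        long long sum = s[lo] + s[hi];
        if (sum == target) return true;
        if (sum < target) lo++; else hi--;
    }
    return false;
}

// Returns true if some k elements of the sorted vector s (at distinct positions) sum to t.
bool checkSumOfK(const std::vector<long long>& s, int k, long long t) {
    if (k <= 0) return t == 0;                              // sum of zero elements
    if ((int)s.size() < k) return false;                    // not enough elements left
    if (k == 1) return std::binary_search(s.begin(), s.end(), t);
    if (k == 2) return twoSumSorted(s, t);
    for (int i = 0; i < (int)s.size(); i++) {
        // A = S with S[i] removed; A stays sorted.
        std::vector<long long> a;
        a.reserve(s.size() - 1);
        for (int j = 0; j < (int)s.size(); j++)
            if (j != i) a.push_back(s[j]);
        // If s[i] is one of the k chosen numbers, the remaining k - 1 must sum to t - s[i].
        if (checkSumOfK(a, k - 1, t - s[i])) return true;
    }
    return false;
}

A top-level call is just std::sort(v.begin(), v.end()); followed by checkSumOfK(v, k, T);.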
Complexity
For each item in S, there is [math]\displaystyle{ O(n) }[/math] work done in assembling the array A, except at the [math]\displaystyle{ (k - 1)^{th} }[/math] recursive call, which completes in [math]\displaystyle{ O(n \lg n) }[/math] time. So, for each number in S, we have [math]\displaystyle{ O(kn) + O(n \lg n) }[/math] work, and since [math]\displaystyle{ k \le n }[/math], each iteration of the outer loop takes [math]\displaystyle{ O(n^2) + O(n \lg n) = O(n^2) }[/math] work. Now, since the outer loop goes on for at most [math]\displaystyle{ n }[/math] iterations, we have a total runtime of [math]\displaystyle{ O(n^3) }[/math].
A trick can be used to lower this runtime to [math]\displaystyle{ O(n^2 \lg n) }[/math] by having the routines in this solution take an index to ignore when iterating in their inner loops. With this, we save the [math]\displaystyle{ O(n) }[/math] construction of A for every item, and each iteration of the outer loop becomes [math]\displaystyle{ O(n \lg n) }[/math] (how? [math]\displaystyle{ O(k) = O(n) }[/math] constant-time recursive calls plus [math]\displaystyle{ O(n \lg n) }[/math] time spent in the [math]\displaystyle{ (k - 1)^{th} }[/math] calls), giving a total runtime of [math]\displaystyle{ O(n^2 \lg n) }[/math] for [math]\displaystyle{ n }[/math] elements in S.
Note that this cost includes the initial [math]\displaystyle{ O(n \lg n) }[/math] cost of sorting S. Also, this algorithm stays within the [math]\displaystyle{ O(n^{k-1} \lg n) }[/math] bound for all [math]\displaystyle{ k \ge 2 }[/math]; the exercise just states an upper bound.
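The paragraph above says the routines should take an index to ignore; across several levels of recursion that naturally becomes a set of positions to skip, so the sketch below (the skip array and all names are my own choices, not from the text) carries a shared mark array instead of copying S minus one element at each level. It assumes the initial call has k >= 2, mirroring the pseudocode's base case.

#include <vector>

// k <= 2 base case on the sorted array: two pointers that step over marked positions.
bool twoSumSortedSkipping(const std::vector<long long>& s,
                          const std::vector<char>& skip, long long target) {
    int lo = 0, hi = (int)s.size() - 1;
    while (lo < hi) {
        if (skip[lo]) { lo++; continue; }
        if (skip[hi]) { hi--; continue; }
        long long sum = s[lo] + s[hi];
        if (sum == target) return true;
        if (sum < target) lo++; else hi--;
    }
    return false;
}

// Same recursion as checkSumOfK above, but without the per-call copy of the array:
// positions chosen so far are marked in 'skip' and ignored by deeper calls.
bool checkSumOfKSkipping(const std::vector<long long>& s, int k, long long t,
                         std::vector<char>& skip) {
    if (k <= 2) return twoSumSortedSkipping(s, skip, t);    // assumes initial k >= 2
    for (int i = 0; i < (int)s.size(); i++) {
        if (skip[i]) continue;
        skip[i] = 1;                                         // ignore s[i] in deeper calls
        bool found = checkSumOfKSkipping(s, k - 1, t - s[i], skip);
        skip[i] = 0;                                         // unmark on the way back
        if (found) return true;
    }
    return false;
}

A top-level call sorts the input, creates std::vector<char> skip(s.size(), 0), and then calls checkSumOfKSkipping(s, k, T, skip).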
Back to Chapter 4