Here the main thing to notice is that we need an O(<math>n^{k-1}\log n</math>) solution. <br />
For various values of k: <br />
k      Solution time complexity <br />
1      O(<math>n^0 \log n</math>) <br />
2      O(<math>n^1 \log n</math>) <br />
3      O(<math>n^2 \log n</math>) <br />
4      O(<math>n^3 \log n</math>) <br />

For k = 2 onwards: <br />
1. Sort the array (O(<math>n \log n</math>)). <br />
2. Use k-1 nested loops to iterate over the array, where the i-th loop starts one element past the current index of the (i-1)-th loop, and then use binary search for the one remaining value. <br />
Each of the k-1 loops contributes a factor of n and each binary search costs O(<math>\log n</math>), giving O(<math>n^{k-1}\log n</math>) in total. <br />
For example, for k = 3 (a runnable sketch of this case follows below): <br />
for( i = 0; i < n; i++ ) <br />
for( j = i+1; j < n; j++ ) <br />
// now use binary search to find ( T - a[i] - a[j] ) from the (j+1)-th element to the end of the array <br />

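As an illustration, here is a minimal C++ sketch of the k = 3 case (the names hasTripleSum, a, and T are assumptions for this example, not from the original text): sort once, fix the first two elements with nested loops, and binary-search for the third.

<pre>
#include <algorithm>
#include <vector>

// Does any triple of distinct positions in 'a' sum to T?
// O(n log n) for the sort, then O(n^2 log n) for the loops plus binary searches.
bool hasTripleSum(std::vector<long long> a, long long T) {
    std::sort(a.begin(), a.end());                     // step 1: sort the array
    int n = (int)a.size();
    for (int i = 0; i < n; i++)                        // step 2: fix the first element
        for (int j = i + 1; j < n; j++) {              //         fix the second element
            long long need = T - a[i] - a[j];
            // binary search for the remaining value strictly after position j
            if (std::binary_search(a.begin() + j + 1, a.end(), need))
                return true;
        }
    return false;
}
</pre>
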
<h3>Another recursive solution</h3>
First, we note that [[ TADM2E 4.8 | an O(<math>n \lg n</math>) solution for <math>k = 2</math> exists ]], which is already within the required bounds. Now, for <math>k \ge 3</math>, we can do something like this:
 
<pre>
sort S ascending;
CheckSumOfK(S, k, T); // defined below

// S must be a sorted array of n integers
function CheckSumOfK(S, k, T)
    if k <= 2:
        Use any method from the solution of ex 4.8 and return the result.
        This can be done in O(n) time because S is sorted.

    Initialize array A of n - 1 elements;
    for i from 0 to n - 1:
        m = 0   // use a separate index so the parameter k is not overwritten
        // Collect in A all numbers in S except S[i]. Note that A will still be sorted.
        for j from 0 to n - 1:
            if i != j:
                A[m] = S[j]
                m = m + 1
        // If S[i] is included in the k integers that sum up to T, then there must
        // be exactly (k - 1) integers in the rest of S (i.e., A) that sum to (T - S[i]).
        // We can find that out by calling ourselves recursively.
        if CheckSumOfK(A, k - 1, T - S[i]):
            return True
    return False
</pre>

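For concreteness, here is a compact C++ rendering of the pseudocode above. It is only a sketch: it assumes the k = 2 base case uses the sorted two-pointer scan (one of the approaches to exercise 4.8), and the function and variable names are invented for the example.

<pre>
#include <algorithm>
#include <vector>

// Base case k = 2: do two elements of the sorted vector s sum to t?
// Two-pointer scan, O(n).
static bool twoSum(const std::vector<long long>& s, long long t) {
    int lo = 0, hi = (int)s.size() - 1;
    while (lo < hi) {
        long long sum = s[lo] + s[hi];
        if (sum == t) return true;
        if (sum < t) lo++; else hi--;
    }
    return false;
}

// Do k elements of the sorted vector s sum to t?
static bool checkSumOfK(const std::vector<long long>& s, int k, long long t) {
    if (k == 1) return std::binary_search(s.begin(), s.end(), t); // trivial case
    if (k == 2) return twoSum(s, t);
    for (std::size_t i = 0; i < s.size(); i++) {
        // A = s with s[i] removed; removing one element keeps it sorted.
        std::vector<long long> a;
        a.reserve(s.size() - 1);
        for (std::size_t j = 0; j < s.size(); j++)
            if (j != i) a.push_back(s[j]);
        // We need k - 1 of the remaining values to sum to t - s[i].
        if (checkSumOfK(a, k - 1, t - s[i]))
            return true;
    }
    return false;
}

// Usage: sort first, e.g.
//   std::vector<long long> s = { ... };
//   std::sort(s.begin(), s.end());
//   bool found = checkSumOfK(s, k, T);
</pre>
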
<h3>Complexity</h3>
For each item in S, <math>O(n)</math> work is done in assembling the array A, except at the <math>(k-1)</math>-th recursive call, which completes in <math>O(n \lg n)</math> time. So, for each number in S, we have <math>O(kn) + O(n \lg n)</math> work, and since <math>k \le n</math>, each iteration of the outer loop takes <math>O(n^2) + O(n \lg n) = O(n^2)</math> work. Now, since the outer loop runs for at most <math>n</math> iterations, we have a total runtime of <math>O(n^3)</math>.

A trick can be used to lower this runtime to <math>O(n^2 \lg n)</math> by having the routines in [[ TADM2E 4.8 | this solution ]] take an index to ignore when iterating in their inner loops (a sketch of this idea follows below). With this, we save the <math>O(n)</math> construction of A for every item, and each iteration of the outer loop then becomes <math>O(n \lg n)</math> (how? <math>O(k) = O(n)</math> constant-time recursive calls, plus <math>O(n \lg n)</math> spent in the <math>(k-1)</math>-th calls), giving a total runtime of <math>O(n^2 \lg n)</math> for <math>n</math> elements in S.

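As a sketch of that trick (assumed names, not code from the linked solution), here is the k = 2 two-pointer check generalized to skip a set of ignored positions, since the recursion above it may have fixed several indices:

<pre>
#include <vector>

// Sketch of the "index to ignore" variant of the k = 2 base case:
// does s[i] + s[j] == t for some i < j where neither position is marked in 'ignore'?
// s is sorted; 'ignore' marks the positions already fixed higher up in the recursion.
static bool twoSumSkipping(const std::vector<long long>& s,
                           const std::vector<bool>& ignore, long long t) {
    int lo = 0, hi = (int)s.size() - 1;
    while (lo < hi) {
        if (ignore[lo]) { lo++; continue; }   // step over excluded positions
        if (ignore[hi]) { hi--; continue; }
        long long sum = s[lo] + s[hi];
        if (sum == t) return true;
        if (sum < t) lo++; else hi--;
    }
    return false;
}
</pre>

The recursive routine would then mark position i in the ignore array before recursing and unmark it afterwards, instead of copying S into A.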
 
Note that this cost includes the initial <math>O(n \lg n)</math> cost of sorting S. Also, this algorithm is always going to be <math>O(n^{k-1} \lg n)</math> for all <math>k \ge 2</math>; the exercise just states an upper bound.
