From The Algorithm Design Manual Solution Wiki
=Algorithm Analysis=
 
  
===Program Analysis===
 
  
:[[2.1]]. What value is returned by the following function? Express your answer as a function of <math>n</math>. Give the worst-case running time using the Big Oh notation.
  mystery(''n'')
 
      r:=0
 
      ''for'' i:=1 ''to'' n-1 ''do''
 
          ''for'' j:=i+1 ''to'' n ''do''
 
              ''for'' k:=1 ''to'' j ''do''
 
                  r:=r+1
 
      ''return''(r)
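A direct Python transcription (function names ours) makes it easy to sanity-check a candidate closed form. The formula below is our own conjecture, derived by noting the innermost loop contributes <math>j</math> once for each of the <math>j-1</math> values of <math>i</math> below <math>j</math>; it is not given in the exercise.

```python
def mystery(n):
    # Literal transcription of the pseudocode above.
    r = 0
    for i in range(1, n):               # i = 1 .. n-1
        for j in range(i + 1, n + 1):   # j = i+1 .. n
            for k in range(1, j + 1):   # k = 1 .. j
                r += 1
    return r

def mystery_closed(n):
    # Conjecture: sum_{j=2}^{n} j(j-1) = n(n+1)(n-1)/3.
    return n * (n + 1) * (n - 1) // 3
```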
 
 
 
[[2.1|Solution]]
 
 
 
 
 
:2.2. What value is returned by the following function? Express your answer as a function of <math>n</math>. Give the worst-case running time using Big Oh notation.
 
    pesky(n)
 
        r:=0
 
        ''for'' i:=1 ''to'' n ''do''
 
            ''for'' j:=1 ''to'' i ''do''
 
                ''for'' k:=j ''to'' i+j ''do''
 
                    r:=r+1
 
        ''return''(r)
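As with 2.1, a transcription (our own sketch, not part of the exercise) lets one test a guessed closed form. Since the <math>k</math> loop always runs <math>i+1</math> times, we conjecture <math>r = \sum_{i=1}^n i(i+1) = n(n+1)(n+2)/3</math>.

```python
def pesky(n):
    # Literal transcription of the pseudocode above.
    r = 0
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            for k in range(j, i + j + 1):  # always i+1 iterations
                r += 1
    return r
```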
 
 
 
 
 
:[[2.3]]. What value is returned by the following function? Express your answer as a function of <math>n</math>. Give the worst-case running time using Big Oh notation.
 
    prestiferous(n)
 
        r:=0
 
        ''for'' i:=1 ''to'' n ''do''
 
            ''for'' j:=1 ''to'' i ''do''
 
                ''for'' k:=j ''to'' i+j ''do''
 
                    ''for'' l:=1 ''to'' i+j-k ''do''
 
                        r:=r+1
 
        ''return''(r)
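A brute-force transcription (our sketch) is handy for checking whatever closed form you derive against small cases:

```python
def prestiferous(n):
    # Literal transcription; the l loop runs i+j-k times,
    # which is 0 when k reaches i+j.
    r = 0
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            for k in range(j, i + j + 1):
                for l in range(1, i + j - k + 1):
                    r += 1
    return r
```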
 
 
 
[[2.3|Solution]]
 
 
 
 
 
:2.4. What value is returned by the following function? Express your answer as a function of <math>n</math>. Give the worst-case running time using Big Oh notation.
 
  conundrum(<math>n</math>)

      <math>r:=0</math>

      ''for'' <math>i:=1</math> ''to'' <math>n</math> ''do''

          ''for'' <math>j:=i+1</math> ''to'' <math>n</math> ''do''

              ''for'' <math>k:=i+j-1</math> ''to'' <math>n</math> ''do''

                  <math>r:=r+1</math>

      ''return''(r)
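A transcription (our sketch) is useful here because the innermost loop is often empty; when <math>i+j-1 > n</math> it contributes nothing, which is easy to get wrong on paper.

```python
def conundrum(n):
    r = 0
    for i in range(1, n + 1):
        for j in range(i + 1, n + 1):
            # Empty whenever i+j-1 > n.
            for k in range(i + j - 1, n + 1):
                r += 1
    return r
```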
 
 
 
 
 
:[[2.5]]. Consider the following algorithm: (the print operation prints a single asterisk; the operation <math>x = 2x</math> doubles the value of the variable <math>x</math>).
 
    ''for'' <math> k = 1</math> to <math>n</math>
 
        <math>x = k</math>
 
        ''while'' (<math>x < n</math>):
 
          ''print'' '*'
 
          <math>x = 2x</math>
 
:Let <math>f(n)</math> be the complexity of this algorithm (or equivalently the number of times * is printed). Provide correct bounds for <math>O(f(n))</math> and <math>\Omega(f(n))</math>, ideally converging on <math>\Theta(f(n))</math>.
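To experiment with bounds for <math>f(n)</math>, one can count the prints instead of producing them (a sketch of ours, not part of the exercise):

```python
def count_stars(n):
    # Counts how many '*' the loop above would print.
    stars = 0
    for k in range(1, n + 1):
        x = k
        while x < n:
            stars += 1
            x = 2 * x
    return stars
```

Tabulating `count_stars(n)` against candidate growth rates for several <math>n</math> helps narrow the gap between the upper and lower bounds.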
 
 
 
[[2.5|Solution]]
 
 
 
 
 
:2.6. Suppose the following algorithm is used to evaluate the polynomial
 
::::::<math> p(x)=a_n x^n +a_{n-1} x^{n-1}+ \ldots + a_1 x +a_0</math>
 
    <math>p:=a_0;</math>

    <math>xpower:=1;</math>

    ''for'' <math>i:=1</math> ''to'' <math>n</math> ''do''

        <math>xpower:=x*xpower;</math>

        <math>p:=p+a_i * xpower</math>
 
#How many multiplications are done in the worst-case? How many additions?
 
#How many multiplications are done on the average?
 
#Can you improve this algorithm?
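For part 1, a transcription that counts operations makes the tally concrete (our sketch; the tuple return is for inspection only). Each iteration does one multiplication for `xpower` and one for the term, so the count should come out to <math>2n</math>.

```python
def eval_poly(coeffs, x):
    # coeffs[i] is a_i; transcribes the algorithm above.
    n = len(coeffs) - 1
    p = coeffs[0]
    xpower = 1
    mults = 0
    for i in range(1, n + 1):
        xpower = x * xpower          # 1 multiplication
        p = p + coeffs[i] * xpower   # 1 multiplication, 1 addition
        mults += 2
    return p, mults
```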
 
 
 
 
 
:2.7. Prove that the following algorithm for computing the maximum value in an array <math>A[1..n]</math> is correct.
 
  max(A)
 
      <math>m:=A[1]</math>
 
      ''for'' <math>i:=2</math> ''to'' n ''do''
 
            ''if'' <math>A[i] > m</math> ''then'' <math>m:=A[i]</math>
 
      ''return'' (m)
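A correctness proof typically maintains a loop invariant. The Python version below (our sketch, translated to 0-indexed arrays) states that invariant as an assertion, which is exactly the fact an induction argument would carry through the loop:

```python
def array_max(A):
    m = A[0]
    for i in range(1, len(A)):
        # Invariant: m is the maximum of A[0..i-1].
        assert m == max(A[:i])
        if A[i] > m:
            m = A[i]
    return m
```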
 
 
 
[[2.7|Solution]]
 
 
 
===Big Oh===
 
 
 
 
 
:2.8. True or False?
 
#Is <math>2^{n+1} = O (2^n)</math>?
 
#Is <math>2^{2n} = O(2^n)</math>?
 
 
 
 
 
:[[2.9]]. For each of the following pairs of functions, either <math>f(n)</math> is in <math>O(g(n))</math>, <math>f(n)</math> is in <math>\Omega(g(n))</math>, or <math>f(n)=\Theta(g(n))</math>. Determine which relationship is correct and briefly explain why.
 
#<math>f(n)=\log n^2</math>; <math>g(n)=\log n</math> + <math>5</math>
 
#<math>f(n)=\sqrt n</math>; <math>g(n)=\log n^2</math>
 
#<math>f(n)=\log^2 n</math>; <math>g(n)=\log n</math>
 
#<math>f(n)=n</math>; <math>g(n)=\log^2 n</math>
 
#<math>f(n)=n \log n + n</math>; <math>g(n)=\log n</math>
 
#<math>f(n)=10</math>; <math>g(n)=\log 10</math>
 
#<math>f(n)=2^n</math>; <math>g(n)=10 n^2</math>
 
#<math>f(n)=2^n</math>; <math>g(n)=3^n</math>
 
 
 
[[2.9|Solution]]
 
 
 
 
 
:2.10. For each of the following pairs of functions <math>f(n)</math> and <math>g(n)</math>, determine whether <math>f(n) = O(g(n))</math>, <math>g(n) = O(f(n))</math>, or both.
 
#<math>f(n) = (n^2 - n)/2</math>,  <math>g(n) =6n</math>
 
#<math>f(n) = n +2 \sqrt n</math>, <math>g(n) = n^2</math>
 
#<math>f(n) = n \log n</math>, <math>g(n) = n \sqrt n /2</math>
 
#<math>f(n) = n + \log n</math>, <math>g(n) = \sqrt n</math>
 
#<math>f(n) = 2(\log n)^2</math>, <math>g(n) = \log n + 1</math>
 
#<math>f(n) = 4n\log n + n</math>, <math>g(n) = (n^2 - n)/2</math>
 
 
 
 
 
:[[2.11]]. For each of the following functions, which of the following asymptotic bounds hold for <math>f(n) = O(g(n)),\Theta(g(n)),\Omega(g(n))</math>?
 
 
 
[[2.11|Solution]]
 
 
 
 
 
:2.12. Prove that <math>n^3 - 3n^2-n+1 = \Theta(n^3)</math>.
 
 
 
 
 
:2.13. Prove that <math>n^2 = O(2^n)</math>.
 
 
 
[[2.13|Solution]]
 
 
 
 
 
:2.14. Prove or disprove: <math>\Theta(n^2) = \Theta(n^2+1)</math>.
 
 
 
 
 
:[[2.15]]. Suppose you have algorithms with the five running times listed below. (Assume these are the exact running times.) How much slower does each of these algorithms get when you (a) double the input size, or (b) increase the input size by one?
 
::(a) <math>n^2</math>  (b) <math>n^3</math>  (c) <math>100n^2</math>  (d) <math>n \log n</math>  (e) <math>2^n</math>
 
 
 
[[2.15|Solution]]
 
 
 
 
 
:2.16. Suppose you have algorithms with the six running times listed below. (Assume these are the exact number of operations performed as a function of input size <math>n</math>.) Suppose you have a computer that can perform <math>10^{10}</math> operations per second. For each algorithm, what is the largest input size <math>n</math> that you can complete within an hour?
 
::(a) <math>n^2</math>  (b) <math>n^3</math>  (c) <math>100n^2</math>  (d) <math>n \log n</math>  (e) <math>2^n</math>  (f) <math>2^{2^n}</math>
 
 
 
 
 
:[[2.17]]. For each of the following pairs of functions <math>f(n)</math> and <math>g(n)</math>, give an appropriate positive constant <math>c</math> such that <math>f(n) \leq c \cdot g(n)</math> for all <math>n > 1</math>.
 
#<math>f(n)=n^2+n+1</math>, <math>g(n)=2n^3</math>
 
#<math>f(n)=n \sqrt n + n^2</math>, <math>g(n)=n^2</math>
 
#<math>f(n)=n^2-n+1</math>, <math>g(n)=n^2/2</math>
 
 
 
[[2.17|Solution]]
 
 
 
 
 
:2.18. Prove that if <math>f_1(n)=O(g_1(n))</math> and <math>f_2(n)=O(g_2(n))</math>, then <math>f_1(n)+f_2(n) = O(g_1(n)+g_2(n))</math>.
 
 
 
 
 
:[[2.19]]. Prove that if <math>f_1(n)=\Omega(g_1(n))</math> and <math>f_2(n)=\Omega(g_2(n))</math>, then <math>f_1(n)+f_2(n)=\Omega(g_1(n)+g_2(n))</math>.
 
 
 
[[2.19|Solution]]
 
 
 
 
 
:2.20. Prove that if <math>f_1(n)=O(g_1(n))</math> and <math>f_2(n)=O(g_2(n))</math>, then <math>f_1(n) \cdot f_2(n) = O(g_1(n) \cdot g_2(n))</math>.
 
 
 
 
 
:[[2.21]]. Prove for all <math>k \geq 1</math> and all sets of constants <math>\{a_k, a_{k-1}, \ldots, a_1,a_0\} \in \mathbb{R}</math>, <math> a_k n^k + a_{k-1}n^{k-1}+ \cdots +a_1 n + a_0 = O(n^k)</math>.
 
 
 
[[2.21|Solution]]
 
 
 
 
 
:2.22. Show that for any real constants <math>a</math> and <math>b</math>, where <math>b > 0</math>,
 
<center><math>(n + a)^b = \Omega (n^b)</math></center>
 
 
 
 
 
:[[2.23]]. List the functions below from the lowest to the highest order. If any two or more are of the same order, indicate which.
 
<center>
 
<math>\begin{array}{llll}
 
n & 2^n & n \lg n & \ln n \\
 
n-n^3+7n^5 & \lg n & \sqrt n & e^n \\
 
n^2+\lg n & n^2 & 2^{n-1} &  \lg \lg n \\
 
n^3 & (\lg n)^2 & n! & n^{1+\varepsilon} \text{ where } 0< \varepsilon <1
 
\\
 
\end{array}</math>
 
</center>
 
 
 
[[2.23|Solution]]
 
 
 
 
 
:2.24
 
 
 
 
 
:[[2.25]]
 
 
 
[[2.25|Solution]]
 
 
 
 
 
:2.26. List the functions below from the lowest to the highest order. If any two or more are of the same order, indicate which.
 
<center>
 
<math>\begin{array}{lll}
 
\sqrt{n} & n & 2^n \\
 
n \log n &  n - n^3 + 7n^5 &  n^2 + \log n \\
 
n^2 &  n^3 &  \log n \\
 
n^{\frac{1}{3}} + \log n & (\log n)^2 &  n! \\
 
\ln n & \frac{n}{\log n} &  \log \log n \\
 
({1}/{3})^n &  ({3}/{2})^n &  6 \\
 
\end{array}</math>
 
</center>
 
 
 
 
 
:[[2.27]]. Find two functions <math>f(n)</math> and <math>g(n)</math> that satisfy the following relationship. If no such <math>f</math> and <math>g</math> exist, write ''None.''
 
#<math>f(n)=o(g(n))</math> and <math>f(n) \neq \Theta(g(n))</math>
 
#<math>f(n)=\Theta(g(n))</math> and <math>f(n)=o(g(n))</math>
 
#<math>f(n)=\Theta(g(n))</math> and <math>f(n) \neq O(g(n))</math>
 
#<math>f(n)=\Omega(g(n))</math> and <math>f(n) \neq O(g(n))</math>
 
 
 
[[2.27|Solution]]
 
 
 
 
 
:2.28. True or False?
 
#<math>2n^2+1=O(n^2)</math>
 
#<math>\sqrt n= O(\log n)</math>
 
#<math>\log n = O(\sqrt n)</math>
 
#<math>n^2(1 + \sqrt n) = O(n^2 \log n)</math>
 
#<math>3n^2 + \sqrt n = O(n^2)</math>
 
#<math>\sqrt n \log n= O(n) </math>
 
#<math>\log n=O(n^{-1/2})</math>
 
 
 
 
 
:[[2.29]]. For each of the following pairs of functions <math>f(n)</math> and <math>g(n)</math>, state whether <math>f(n)=O(g(n))</math>, <math>f(n)=\Omega(g(n))</math>, <math>f(n)=\Theta(g(n))</math>, or none of the above.
 
#<math>f(n)=n^2+3n+4</math>, <math>g(n)=6n+7</math>
 
#<math>f(n)=n \sqrt n</math>, <math>g(n)=n^2-n</math>
 
#<math>f(n)=2^n - n^2</math>, <math>g(n)=n^4+n^2</math>
 
 
 
[[2.29|Solution]]
 
 
 
 
 
:2.30. For each of these questions, briefly explain your answer.
 
<br>
 
::(a) If I prove that an algorithm takes <math>O(n^2)</math> worst-case time, is it possible that it takes <math>O(n)</math> on some inputs?
 
<br>
 
::(b) If I prove that an algorithm takes <math>O(n^2)</math> worst-case time, is it possible that it takes <math>O(n)</math> on all inputs?
 
<br>
 
::(c) If I prove that an algorithm takes <math>\Theta(n^2)</math> worst-case time, is it possible that it takes <math>O(n)</math> on some inputs?
 
<br>
 
::(d) If I prove that an algorithm takes <math>\Theta(n^2)</math> worst-case time, is it possible that it takes <math>O(n)</math> on all inputs?
 
<br>
 
::(e) Is the function <math>f(n) = \Theta(n^2)</math>, where <math>f(n) = 100 n^2</math> for even <math>n</math> and <math>f(n) = 20 n^2 - n \log_2 n</math> for odd <math>n</math>?
 
 
 
 
 
:[[2.31]]. For each of the following, answer ''yes'', ''no'', or ''can't tell''. Explain your reasoning.
 
<br>
 
::(a) Is <math>3^n = O(2^n)</math>?
 
<br>
 
::(b) Is <math>\log 3^n = O( \log 2^n )</math>?
 
<br>
 
::(c) Is <math>3^n = \Omega(2^n)</math>?
 
<br>
 
::(d) Is <math>\log 3^n = \Omega( \log 2^n )</math>?
 
 
 
[[2.31|Solution]]
 
 
 
 
 
:2.32. For each of the following expressions <math>f(n)</math> find a simple <math>g(n)</math> such that <math>f(n)=\Theta(g(n))</math>.
 
#<math>f(n)=\sum_{i=1}^n {1\over i}</math>.
 
#<math>f(n)=\sum_{i=1}^n \lceil {1\over i}\rceil</math>.
 
#<math>f(n)=\sum_{i=1}^n \log i</math>.
 
#<math>f(n)=\log (n!)</math>.
 
 
 
 
 
:[[2.33]]. Place the following functions into increasing asymptotic order.
 
<math>f_1(n) = n^2\log_2n</math>,
 
<math>f_2(n) = n(\log_2n)^2</math>,
 
<math>f_3(n) = \sum_{i=0}^n 2^i</math>,
 
<math>f_4(n) = \log_2(\sum_{i=0}^n 2^i)</math>.
 
 
 
[[2.33|Solution]]
 
 
 
 
 
:2.34. Which of the following are true?
 
#<math>\sum_{i=1}^n 3^i = \Theta(3^{n-1})</math>.
 
#<math>\sum_{i=1}^n 3^i = \Theta(3^n)</math>.
 
#<math>\sum_{i=1}^n 3^i = \Theta(3^{n+1})</math>.
 
 
 
 
 
:[[2.35]]. For each of the following functions <math>f</math> find a simple function <math>g</math> such that <math>f(n)=\Theta(g(n))</math>.
 
#<math>f_1(n)= (1000)2^n + 4^n</math>.
 
#<math>f_2(n)= n + n\log n + \sqrt n</math>.
 
#<math>f_3(n)= \log (n^{20}) + (\log n)^{10}</math>.
 
#<math>f_4(n)= (0.99)^n + n^{100}.</math>
 
 
 
[[2.35|Solution]]
 
 
 
 
 
:2.36. For each pair of expressions <math>(A,B)</math> below, indicate whether <math>A</math> is <math>O</math>, <math>o</math>, <math>\Omega</math>, <math>\omega</math>, or <math>\Theta</math> of <math>B</math>. Note that zero, one or more of these relations may hold for a given pair; list all correct ones.
 
<br><math>
\begin{array}{lcc}
        & A                    & B \\
(a)    & n^{100}              & 2^n \\
(b)    & (\lg n)^{12}        & \sqrt{n} \\
(c)    & \sqrt{n}              & n^{\cos (\pi n/8)} \\
(d)    & 10^n                  & 100^n \\
(e)    & n^{\lg n}            & (\lg n)^n \\
(f)    & \lg{(n!)}            & n \lg n
\end{array}
</math>
 
 
 
===Summations===
 
 
 
 
 
:[[2.37]]. Find an expression for the sum of the <math>i</math>th row of the following triangle, and prove its correctness. Each entry is the sum of the three entries directly above it. All nonexistent entries are treated as 0.
 
<center>
 
<math>\begin{array}{ccccccccc}
 
&&&&1&&&& \\
 
&&&1&1&1&&&\\
 
&&1&2&3&2&1&&\\
 
&1&3&6&7&6&3&1&\\
 
1&4&10&16&19&16&10&4&1\\
 
\end{array}</math>
 
</center>
 
 
 
[[2.37|Solution]]
 
 
 
 
 
:2.38. Assume that Christmas has <math>n</math> days. Exactly how many presents did my ''true love'' send me? (Do some research if you do not understand this question.)
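Since day <math>i</math> brings <math>1+2+\cdots+i = i(i+1)/2</math> presents, the total over <math>n</math> days is <math>\sum_{i=1}^n i(i+1)/2</math>. A quick numeric check of that setup (our sketch, with a conjectured closed form of <math>n(n+1)(n+2)/6</math>):

```python
def presents(n):
    # Day i brings 1 + 2 + ... + i = i(i+1)/2 gifts.
    return sum(i * (i + 1) // 2 for i in range(1, n + 1))

def presents_closed(n):
    # Conjectured closed form: n(n+1)(n+2)/6.
    return n * (n + 1) * (n + 2) // 6
```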
 
 
 
 
 
:[[2.39]]
 
 
 
[[2.39|Solution]]
 
 
 
 
 
:2.40. Consider the following code fragment.
 
<tt>
 
  for i=1 to n do
 
      for j=i to 2*i do
 
        output ''foobar''
 
</tt>
 
:Let <math>T(n)</math> denote the number of times `foobar' is printed as a function of <math>n</math>.
 
#Express <math>T(n)</math> as a summation (actually two nested summations).
 
#Simplify the summation.  Show your work.
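A counting transcription (our sketch) gives values to check the simplified summation against. The inner loop runs <math>i+1</math> times, suggesting <math>T(n) = \sum_{i=1}^n (i+1) = n(n+3)/2</math>.

```python
def T(n):
    # Counts how many times ''foobar'' would be printed.
    count = 0
    for i in range(1, n + 1):
        for j in range(i, 2 * i + 1):  # i+1 values of j
            count += 1
    return count
```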
 
 
 
 
 
:[[2.41]]. Consider the following code fragment.

 
<tt>
 
  for i=1 to n/2 do
 
      for j=i to n-i do
 
        for k=1 to j do
 
            output ''foobar''
 
</tt>
 
:Assume <math>n</math> is even. Let <math>T(n)</math> denote the number of times `foobar' is printed as a function of <math>n</math>.
 
#Express <math>T(n)</math> as three nested summations.
 
#Simplify the summation.  Show your work.
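A counting transcription (our sketch) provides ground truth for small even <math>n</math> while you simplify the triple summation:

```python
def T(n):
    # Counts how many times ''foobar'' would be printed; n even.
    assert n % 2 == 0
    count = 0
    for i in range(1, n // 2 + 1):
        for j in range(i, n - i + 1):
            for k in range(1, j + 1):
                count += 1
    return count
```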
 
 
 
[[2.41|Solution]]
 
 
 
 
 
:2.42. When you first learned to multiply numbers, you were told that <math>x \times y</math> means adding <math>x</math> a total of <math>y</math> times, so <math>5 \times 4 = 5+5+5+5 = 20</math>. What is the time complexity of multiplying two <math>n</math>-digit numbers in base <math>b</math> (people work in base 10, of course, while computers work in base 2) using the repeated addition method, as a function of <math>n</math> and <math>b</math>? Assume that single-digit by single-digit addition or multiplication takes <math>O(1)</math> time. (Hint: how big can <math>y</math> be as a function of <math>n</math> and <math>b</math>?)
 
 
 
 
 
:[[2.43]]. In grade school, you learned to multiply long numbers on a digit-by-digit basis, so that <math>127 \times 211 = 127 \times 1 + 127 \times 10 + 127 \times 200 = 26,797</math>. Analyze the time complexity of multiplying two <math>n</math>-digit numbers with this method as a function of <math>n</math> (assume constant base size). Assume that single-digit by single-digit addition or multiplication takes <math>O(1)</math> time.
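One way to see where the operation count comes from is a sketch of the grade-school method (function and parameter names are ours; digits are stored least-significant first). The two nested loops over digit positions are what drive the analysis.

```python
def grade_school_multiply(x_digits, y_digits, base=10):
    # Grade-school digit-by-digit multiplication.
    # One single-digit multiply plus O(1) additions per (i, j)
    # pair, so len(x_digits) * len(y_digits) inner steps total.
    result = [0] * (len(x_digits) + len(y_digits))
    for i, yd in enumerate(y_digits):
        carry = 0
        for j, xd in enumerate(x_digits):
            total = result[i + j] + xd * yd + carry
            result[i + j] = total % base
            carry = total // base
        result[i + len(x_digits)] += carry
    return result
```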
 
 
 
[[2.43|Solution]]
 
 
 
===Logarithms===
 
 
 
 
 
:2.44
 
 
 
:2.45
 
 
 
:2.46
 
 
 
:2.47
 
 
 
===Interview Problems===
 
 
 
 
 
:2.48
 
 
 
:2.49
 
 
 
:2.50
 
 
 
:2.51
 
 
 
:2.52
 
 
 
:2.53
 
 
 
:2.54
 
 
 
:2.55
 
 
 
 
 
Back to [[Chapter List]]
 

Latest revision as of 21:09, 10 September 2020

