Notes - DAA HT23, Divide-and-conquer


Flashcards

What are the three stages of a divide-and-conquer algorithm?


  1. Divide
  2. Conquer
  3. Combine

What is $T(n)$ for a general divide-and-conquer algorithm if:

  • When $n \le l$, the problem takes constant time
  • When $n > l$, we divide into $a$ subproblems, each of size $1/b$ of the original, and it takes $D(n)$ time to divide the problem and $C(n)$ time to combine the solutions?

\[T(n) = \begin{cases} \Theta(1) &\text{if } n \le l, \\\\ aT(n/b) + D(n) + C(n) &\text{otherwise} \end{cases}\]

$T(n)$ for a general divide-and-conquer algorithm looks like

\[T(n) = \begin{cases} \Theta(1) &\text{if } n \le l, \\\\ aT(n/b) + D(n) + C(n) &\text{otherwise} \end{cases}\]

if

  • When $n \le l$, the problem takes constant time
  • When $n > l$, we divide into $a$ subproblems, each of size $1/b$ of the original, and it takes $D(n)$ time to divide the problem and $C(n)$ time to combine the solutions

What does $T(n)$ look like for $\text{Merge-Sort}$?


\[T(n) = \begin{cases} \Theta(1) &\text{if } n \le l \\\\ 2T(n/2) + \Theta(n) &\text{if } n > l \end{cases}\]
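
A minimal merge-sort sketch in Python, to make the divide, conquer and combine steps (and the $\Theta(n)$ merge) concrete; the function name and list representation are illustrative, not the lecture's pseudocode:

def merge_sort(xs):
    # Base case: lists of length 0 or 1 are already sorted.
    if len(xs) <= 1:
        return xs

    # Divide: split the input into two halves.
    mid = len(xs) // 2
    # Conquer: sort each half recursively.
    left = merge_sort(xs[:mid])
    right = merge_sort(xs[mid:])

    # Combine: merge the two sorted halves in Theta(n) time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged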

Can you state the Master theorem in full?


Let $a \ge 1$ and $b > 1$ be constants, let $f(n)$ be a function, and let $T(n)$ be defined by the recurrence

\[T(n) = aT(n/b) + f(n)\]

where $n/b$ can be $\lfloor n/b \rfloor$ or $\lceil n/b \rceil$. Then $T(n)$ has the following asymptotic bounds:

  1. If $f(n) = O(n^{\log _ b a - \epsilon})$ for some constant $\epsilon > 0$, then $T(n) = \Theta(n^{\log _ b a})$.
  2. If $f(n) = \Theta(n^{\log _ b a})$, then $T(n) = \Theta(n^{\log _ b a} \log n)$.
  3. If $f(n) = \Omega(n^{\log _ b a + \epsilon})$ for some $\epsilon > 0$, and if $af(n /b) \le cf(n)$ for some constant $c < 1$ and all sufficiently large $n$, then $T(n) = \Theta(f(n))$.
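
For example, applying case 2 to the merge-sort recurrence $T(n) = 2T(n/2) + \Theta(n)$: here $a = b = 2$, so $n^{\log_b a} = n$ and $f(n) = \Theta(n) = \Theta(n^{\log_b a})$, giving $T(n) = \Theta(n \log n)$.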

What does the Master theorem say about

\[T(n) = aT(n/b) + f(n)\]

for some constants $a \ge 1$, $b > 1$ when $f(n) = O(n^{\log _ b a - \epsilon})$ for some $\epsilon > 0$?


\[T(n) = \Theta(n^{\log_b a})\]

What does the Master theorem say about

\[T(n) = aT(n/b) + f(n)\]

for some constants $a \ge 1$, $b > 1$ when $f(n) = \Theta(n^{\log _ b a})$?


\[T(n) = \Theta(n^{\log_b a} \log n)\]

What does the Master theorem say about

\[T(n) = aT(n/b) + f(n)\]

for some constants $a \ge 1$, $b > 1$ when $f(n) = \Omega(n^{\log _ b a + \epsilon})$ for some $\epsilon > 0$ and $af(n/b) \le cf(n)$ for some constant $c<1$ and sufficiently large $n$?


\[T(n) = \Theta(f(n))\]

How many cases are there for

\[T(n) = aT(n/b) + f(n)\]

in the Master theorem?


\[3\]

What time complexity $T(n)$ characterises the ‘ordinary’ algorithm for multiplying two $n \times n$ matrices (i.e. computing each entry as the dot product of a row and a column)?


\[T(n) = \Theta(n^3)\]
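
A minimal Python sketch of this ordinary algorithm (the function name and the list-of-lists representation of square matrices are illustrative assumptions):

def matmul(X, Y):
    # Entry (i, j) of the product is the dot product of row i of X
    # with column j of Y; three nested loops give Theta(n^3) time.
    n = len(X)
    Z = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                Z[i][j] += X[i][k] * Y[k][j]
    return Z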

What recurrence relation on time complexity $T(n)$ characterises a naïve divide-and-conquer algorithm for matrix multiplication?


\[T(n) = 8T(n/2) + O(n^2)\]

A naïve divide-and-conquer algorithm for matrix multiplication has time complexity given by

\[T(n) = 8T(n/2) + O(n^2)\]

What does the Master theorem say the time complexity of this is?


\[\Theta(n^3)\]
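
This is case 1 of the Master theorem: $\log_2 8 = 3$ and $f(n) = O(n^2) = O(n^{3 - \epsilon})$ for, say, $\epsilon = 1$, so $T(n) = \Theta(n^{\log_2 8}) = \Theta(n^3)$.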

What recurrence relation characterises Strassen’s matrix multiplication algorithm?


\[T(n) = 7T(n/2) + O(n^2)\]

Strassen’s algorithm for matrix multiplication has time complexity given by

\[T(n) = 7T(n/2) + O(n^2)\]

What does the Master theorem say the time complexity of this is?


\[\Theta(n^{\log_2 7})\]
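
This also follows from case 1 of the Master theorem: $\log_2 7 \approx 2.81$ and $f(n) = O(n^2) = O(n^{\log_2 7 - \epsilon})$ for small enough $\epsilon > 0$, so $T(n) = \Theta(n^{\log_2 7}) \approx \Theta(n^{2.81})$.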

Consider the block matrix product

\[\begin{pmatrix}A & B \\\\ C & D\end{pmatrix}\begin{pmatrix}E & F \\\\ G & H\end{pmatrix}\]

Why is Strassen’s matrix multiplication algorithm faster than computing all 8 products $AE$, $BG$, $AF$, $BH$, etc.?


It computes the four blocks of the product using only 7 multiplications of $(n/2) \times (n/2)$ matrices, plus a constant number of block additions and subtractions, giving the recurrence $T(n) = 7T(n/2) + O(n^2)$ instead of $T(n) = 8T(n/2) + O(n^2)$; one standard choice of the seven products is sketched below.
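
One standard choice of the seven products, using the block names above (shown as a sketch; the lecture may use different notation):

\[\begin{aligned} M_1 &= (A + D)(E + H) \\\\ M_2 &= (C + D)E \\\\ M_3 &= A(F - H) \\\\ M_4 &= D(G - E) \\\\ M_5 &= (A + B)H \\\\ M_6 &= (C - A)(E + F) \\\\ M_7 &= (B - D)(G + H) \end{aligned}\]

The four blocks of the product are then $M_1 + M_4 - M_5 + M_7$, $M_3 + M_5$, $M_2 + M_4$ and $M_1 - M_2 + M_3 + M_6$ (top-left, top-right, bottom-left and bottom-right respectively).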

Suppose you have two $2n$ bit numbers $a$ and $b$. Let

\[\begin{aligned} a &= 2^n a_0 + a_1 \\\\ b &= 2^n b_0 + b_1 \end{aligned}\]

Can you derive a way to calculate their product using only three multiplications of $n$-bit length?


\[a\cdot b = (2^{2n} + 2^n) a_0 b_0 - 2^n (a_0 - a_1) (b_0 - b_1) + (2^n + 1) a_1b_1\]
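
A sketch of the derivation: expanding the product and rewriting the cross terms,

\[\begin{aligned} a \cdot b &= 2^{2n} a_0 b_0 + 2^n (a_0 b_1 + a_1 b_0) + a_1 b_1 \\\\ a_0 b_1 + a_1 b_0 &= a_0 b_0 + a_1 b_1 - (a_0 - a_1)(b_0 - b_1) \end{aligned}\]

so only the three products $a_0 b_0$, $a_1 b_1$ and $(a_0 - a_1)(b_0 - b_1)$ of (roughly) $n$-bit numbers are needed.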

Can you give an algorithm for finding a maximum in a cyclically sorted array?


def max_of_sorted(xs):
    # Base case: a single element is the maximum.
    if len(xs) == 1:
        return xs[0]

    mid = len(xs) // 2

    # If the first element is smaller than the middle one, the whole
    # first half is increasing, so the maximum lies in the second half.
    if xs[0] < xs[mid]:
        return max_of_sorted(xs[mid:])

    # Otherwise the "drop" (and hence the maximum) lies in the first half.
    return max_of_sorted(xs[:mid])
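
A quick usage check (the correctness argument assumes distinct elements):

max_of_sorted([3, 4, 5, 1, 2])   # 5
max_of_sorted([5, 1, 2, 3, 4])   # 5

Because each call copies a slice, this version runs in $O(n)$ time overall; passing index bounds instead of slices would make it $O(\log n)$.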

Can you state the weaker version of the Master theorem for

\[T(n) = \begin{cases} O(1) &\text{if } n =1 \\\\ aT(\lceil n/b\rceil) + O(n^d) &\text{if } n > 1 \end{cases}\]

?


\[T(n) = \begin{cases} O(n^d) &\text{if } d > \log_b a \\\\ O(n^d \log_b n) &\text{if } d = \log_b a \\\\ O(n^{\log_b a}) &\text{if } d < \log_b a \end{cases}\]
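
For instance, binary search satisfies $T(n) = T(\lceil n/2 \rceil) + O(1)$, i.e. $a = 1$, $b = 2$, $d = 0$; since $d = \log_b a$, the middle case gives $T(n) = O(n^0 \log_b n) = O(\log n)$.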

By drawing a table with headings “level”, “tree” and “work”, can you give a closed-form (non-recursive) formula for

\[T(n) = \begin{cases} O(1) &\text{if } n =1 \\\\ aT(\lceil n/b\rceil) + O(n^d) &\text{if } n > 1 \end{cases}\]

assuming that $n$ is a power of $b$?


\[T(n) = \sum^{\log_b n}_{i=0} O(n^d) \left(\frac {a}{b^d}\right)^i\]
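
The reasoning behind the sum: at level $i$ of the recursion tree there are $a^i$ subproblems, each of size $n/b^i$, so the work at that level is

\[a^i \cdot O\left(\left(\frac{n}{b^i}\right)^d\right) = O(n^d)\left(\frac{a}{b^d}\right)^i\]

and summing over the levels $i = 0, 1, \dots, \log_b n$ gives the formula above.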

Suppose $r \ne 1$. Can you give the Big-O of the following geometric series

\[a + ar + ar^2 + \cdots + ar^{n-1}\]

breaking down by the cases where $r < 1$ and $r > 1$?


\[\begin{cases} O(a) &\text{if } r < 1 \\\\ O(ar^{n-1}) &\text{if } r > 1 \end{cases}\]
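
To see this: the series sums to $a\frac{r^n - 1}{r - 1}$, so for constant $r < 1$ it is at most $\frac{a}{1-r} = O(a)$ (the first term dominates), while for constant $r > 1$ it is at most $\frac{r}{r-1} \cdot ar^{n-1} = O(ar^{n-1})$ (the last term dominates).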

Given that

\[a + ar + ar^2 + \cdots + ar^{n-1} = \begin{cases} O(a) &\text{if } r < 1 \\\\ O(ar^{n-1}) &\text{if } r > 1 \end{cases}\]

Can you give the Big-O of

\[\sum^{\log_b n}_{i=0} O(n^d) \left(\frac{a}{b^d}\right)^i\]

in the cases where $\frac{a}{b^d} < 1$, $\frac{a}{b^d} = 1$ and $\frac{a}{b^d} > 1$?


  • If $\frac{a}{b^d} < 1$, then $O(n^d)$
  • If $\frac{a}{b^d} = 1$, then $O(n^d \log _ b n)$
  • If $\frac{a}{b^d} > 1$, then $O(n^{\log _ b a})$


