Notes - DAA HT23, Divide-and-conquer
Flashcards
What are the three stages of a divide-and-conquer algorithm?
- Divide
- Conquer
- Combine
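As a minimal Python sketch of the pattern (not from the course notes, just an illustration), merge sort makes all three stages explicit: the split is the divide step, the recursive calls are the conquer step, and the merge is the combine step.

```python
def merge_sort(xs):
    # Conquer directly: lists of length <= 1 are already sorted.
    if len(xs) <= 1:
        return xs
    # Divide: split the input into two halves.
    mid = len(xs) // 2
    # Conquer: recursively sort each half.
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    # Combine: merge the two sorted halves in linear time.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    return out + left[i:] + right[j:]
```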
What is $T(n)$ for a general divide-and-conquer algorithm if:
- When $n \le l$, the problem takes constant time
- When $n > l$, we divide into $a$ subproblems, each of size $n/b$, and it takes $D(n)$ time to divide the problem and $C(n)$ time to combine the solutions
?
$T(n)$ for a general divide-and-conquer algorithm looks like
\[T(n) = \begin{cases}
\Theta(1) &\text{if } n \le l, \\\\
aT(n/b) + D(n) + C(n) &\text{otherwise}
\end{cases}\]
if
- When $n \le l$, the problem takes constant time
- When $n > l$, we divide into $a$ subproblems, each of size $n/b$, and it takes $D(n)$ time to divide the problem and $C(n)$ time to combine the solutions
What does $T(n)$ look like for $\text{Merge-Sort}$?
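From the general form above with $a = b = 2$, $D(n) = \Theta(1)$ and $C(n) = \Theta(n)$ (the merge), the standard recurrence is
\[T(n) = \begin{cases}
\Theta(1) &\text{if } n \le 1, \\\\
2T(n/2) + \Theta(n) &\text{otherwise}
\end{cases}\]
which case 2 of the Master theorem below resolves to $T(n) = \Theta(n \log n)$.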
Can you state the Master theorem in full?
Let $a \ge 1$ and $b > 1$ be constants, let $f(n)$ be a function, and let $T(n)$ be defined by the recurrence
\[T(n) = aT(n/b) + f(n)\] where $n/b$ can be $\lfloor n/b \rfloor$ or $\lceil n/b \rceil$. Then $T(n)$ has the following asymptotic bounds:
- If $f(n) = O(n^{\log _ b a - \epsilon})$ for some $\epsilon > 0$, then $T(n) = \Theta(n^{\log _ b a})$.
- If $f(n) = \Theta(n^{\log _ b a})$, then $T(n) = \Theta(n^{\log _ b a} \log n)$.
- If $f(n) = \Omega(n^{\log _ b a + \epsilon})$ for some $\epsilon > 0$, and if $af(n /b) \le cf(n)$ for some constant $c < 1$ and all sufficiently large $n$, then $T(n) = \Theta(f(n))$.
What does the Master theorem say about
\[T(n) = aT(n/b) + f(n)\]
for constants $a \ge 1$, $b > 1$ when $f(n) = O(n^{\log _ b a - \epsilon})$ for some $\epsilon > 0$?
What does the Master theorem say about
\[T(n) = aT(n/b) + f(n)\]
for constants $a \ge 1$, $b > 1$ when $f(n) = \Theta(n^{\log _ b a})$?
What does the Master theorem say about
\[T(n) = aT(n/b) + f(n)\]
for constants $a \ge 1$, $b > 1$ when $f(n) = \Omega(n^{\log _ b a + \epsilon})$ for some $\epsilon > 0$ and $af(n/b) \le cf(n)$ for some constant $c<1$ and sufficiently large $n$?
How many cases does the Master theorem distinguish for
\[T(n) = aT(n/b) + f(n)\]
?
What time complexity $T(n)$ characterises the ‘ordinary’ matrix multiplication algorithm (i.e. taking dot products of rows with columns)?
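For reference, a sketch of the ordinary algorithm (the name `matmul` and the list-of-lists representation are just illustrative assumptions): three nested loops over $n$ indices perform $n^3$ scalar multiplications, so $T(n) = \Theta(n^3)$.

```python
def matmul(A, B):
    # Ordinary matrix multiplication: entry C[i][j] is the dot product
    # of row i of A with column j of B. The triply nested loop performs
    # n^3 scalar multiplications, hence Theta(n^3) time.
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]
    return C
```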
What recurrence relation on time complexity $T(n)$ characterises a naïve divide-and-conquer algorithm for matrix multiplication?
A naïve divide-and-conquer algorithm has time complexity given by
\[T(n) = 8T(n/2) + O(n^2)\]
What does the Master theorem say the time complexity of this is?
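Here $a = 8$, $b = 2$, so $n^{\log _ b a} = n^3$. Since $O(n^2) = O(n^{3 - \epsilon})$ with $\epsilon = 1$, case 1 applies and $T(n) = \Theta(n^3)$, no better than the ordinary algorithm.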
What recurrence relation characterises Strassen’s matrix multiplication algorithm?
Strassen’s algorithm for matrix multiplication has time complexity given by
\[T(n) = 7T(n/2) + O(n^2)\]
What does the Master theorem say the time complexity of this is?
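Here $a = 7$, $b = 2$, so $n^{\log _ b a} = n^{\log _ 2 7} \approx n^{2.81}$, which dominates $O(n^2)$; case 1 gives $T(n) = \Theta(n^{\log _ 2 7})$.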
\[\begin{pmatrix}A & B \\\\ C & D\end{pmatrix}\begin{pmatrix}E & F \\\\ G & H\end{pmatrix}\]
Why is Strassen’s matrix multiplication algorithm faster than computing all 8 products $AE$, $BG$, $AF$, $BH$, etc.?
It computes the product with only 7 multiplications of $(n/2) \times (n/2)$ matrices instead of 8; the extra matrix additions and subtractions it needs cost only $O(n^2)$, which the recursive multiplications dominate.
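For reference, one standard choice of the seven products (Strassen’s original formulas, in the block notation of the matrices above) is
\[\begin{aligned}
M_1 &= (A + D)(E + H) \\\\
M_2 &= (C + D)E \\\\
M_3 &= A(F - H) \\\\
M_4 &= D(G - E) \\\\
M_5 &= (A + B)H \\\\
M_6 &= (C - A)(E + F) \\\\
M_7 &= (B - D)(G + H)
\end{aligned}\]
from which the four blocks of the product are $M_1 + M_4 - M_5 + M_7$, $M_3 + M_5$, $M_2 + M_4$ and $M_1 - M_2 + M_3 + M_6$.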
Suppose you have two $2n$-bit numbers $a$ and $b$. Let
\[\begin{aligned}
a &= 2^n a_0 + a_1 \\\\
b &= 2^n b_0 + b_1
\end{aligned}\]
Can you derive a way to calculate their product using only three multiplications of $n$-bit numbers?
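A standard derivation (Karatsuba’s trick): expanding the product gives
\[ab = 2^{2n} a_0 b_0 + 2^n (a_0 b_1 + a_1 b_0) + a_1 b_1\]
and the middle term can be recovered from the outer two products via
\[a_0 b_1 + a_1 b_0 = (a_0 + a_1)(b_0 + b_1) - a_0 b_0 - a_1 b_1\]
so the three multiplications $a_0 b_0$, $a_1 b_1$ and $(a_0 + a_1)(b_0 + b_1)$ suffice (strictly, the sums may be $(n+1)$-bit numbers, which does not affect the asymptotics).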
Can you give an algorithm for finding a maximum in a cyclically sorted array?
```python
def max_of_sorted(xs):
    # Base case: a single element is the maximum.
    if len(xs) == 1:
        return xs[0]
    mid = len(xs) // 2
    # If xs[0] < xs[mid], the first half is sorted, so the maximum
    # (just before the wrap-around point) lies in the second half.
    if xs[0] < xs[mid]:
        return max_of_sorted(xs[mid:])
    return max_of_sorted(xs[:mid])
```
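A quick sanity check of the sketch above, plus a note on cost: because each call slices its half of the list, the work per call is linear and the whole recursion is $O(n)$; passing low/high indices instead of slicing would bring it down to the $O(\log n)$ one expects from binary search.

```python
assert max_of_sorted([3, 4, 5, 1, 2]) == 5   # rotated sorted array
assert max_of_sorted([1, 2, 3, 4, 5]) == 5   # rotation by zero
assert max_of_sorted([5, 1, 2, 3, 4]) == 5   # maximum at the front
```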
Can you state the weaker version of the Master theorem for
\[T(n) = \begin{cases}
O(1) &\text{if } n =1 \\\\
aT(\lceil n/b\rceil) + O(n^d) &\text{if } n > 1
\end{cases}\]
?
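For the record, the statement (consistent with the case analysis at the end of these notes) is
\[T(n) = \begin{cases}
O(n^d) &\text{if } d > \log _ b a \\\\
O(n^d \log n) &\text{if } d = \log _ b a \\\\
O(n^{\log _ b a}) &\text{if } d < \log _ b a
\end{cases}\]
where the three cases correspond to $a < b^d$, $a = b^d$ and $a > b^d$ respectively.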
By drawing a table with headings “level”, “tree” and “work”, can you give a closed-form (non-recursive) formula for
\[T(n) = \begin{cases}
O(1) &\text{if } n =1 \\\\
aT(\lceil n/b\rceil) + O(n^d) &\text{if } n > 1
\end{cases}\]
assuming that $n$ is a power of $b$?
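A sketch of the table (levels numbered from the root):

| level | tree | work |
| --- | --- | --- |
| $0$ | $1$ problem of size $n$ | $O(n^d)$ |
| $1$ | $a$ problems of size $n/b$ | $a \cdot O((n/b)^d) = O(n^d)\left(\frac{a}{b^d}\right)$ |
| $i$ | $a^i$ problems of size $n/b^i$ | $O(n^d)\left(\frac{a}{b^d}\right)^i$ |
| $\log_b n$ | $a^{\log_b n}$ problems of size $1$ | $O(n^d)\left(\frac{a}{b^d}\right)^{\log_b n}$ |

Summing the per-level work gives
\[T(n) = \sum^{\log_b n}_{i=0} O(n^d) \left(\frac{a}{b^d}\right)^i\]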
Suppose $r \ne 1$. Can you give the Big-O of the following geometric series
\[a + ar + ar^2 + \cdots + ar^{n-1}\]
breaking down by the cases where $r < 1$ and $r > 1$?
Given that
\[a + ar + ar^2 + \cdots + ar^{n-1} = \begin{cases}
O(a) &\text{if } r < 1 \\\\
O(ar^{n-1}) &\text{if } r > 1
\end{cases}\]
can you give the Big-O of
\[\sum^{\log_b n}_{i=0} O(n^d) \left(\frac{a}{b^d}\right)^i\]
in the cases where $\frac{a}{b^d} < 1$, $\frac{a}{b^d} = 1$ and $\frac{a}{b^d} > 1$?
- If $\frac{a}{b^d} < 1$, then $O(n^d)$
- If $\frac{a}{b^d} = 1$, then $O(n^d \log _ b n)$
- If $\frac{a}{b^d} > 1$, then $O(n^{\log _ b a})$
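The first two cases follow directly from the geometric-series bounds above; for the third, the sum is dominated by its last term, and
\[O(n^d)\left(\frac{a}{b^d}\right)^{\log _ b n} = O\left(n^d \cdot \frac{n^{\log _ b a}}{n^d}\right) = O(n^{\log _ b a})\]
using the identity $a^{\log _ b n} = n^{\log _ b a}$.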