Notes - Numerical Analysis HT24, Eigenvalue problems
Flashcards
Suppose $f(x) = \sum^n _ {i = 0} c _ i x^i$ and $c _ n \ne 0$. How is the companion matrix for $f$ defined, first in the numerical analysis course and then in the rings and modules course?
Numerical analysis definition:
\[\begin{bmatrix} -\frac{c_{n-1} }{c_n} & -\frac{c_{n-2} }{c_n} & \cdots & -\frac{c_1}{c_n} & -\frac{c_0}{c_n} \\\\ 1 & 0 & \cdots & 0 & 0 \\\\ 0 & 1 & \cdots & 0 & 0 \\\\ \vdots & \vdots & \ddots & \vdots & \vdots \\\\ 0 & 0 & \cdots & 1 & 0 \end{bmatrix}\]
Rings and modules definition:
\[\begin{bmatrix} 0 & 0 & \cdots & 0 & -\frac{c_0}{c_n} \\\\ 1 & 0 & \cdots & 0 & -\frac{c_1}{c_n} \\\\ 0 & 1 & \cdots & 0 & -\frac{c_2}{c_n} \\\\ \vdots & \vdots & \ddots & \vdots & \vdots \\\\ 0 & 0 & \cdots & 1 & -\frac{c_{n-1} }{c_n} \end{bmatrix}\]
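A quick numerical sanity check of the first definition (a minimal sketch, assuming NumPy; the helper `build_companion` and the example cubic $f(x) = x^3 - 6x^2 + 11x - 6 = (x-1)(x-2)(x-3)$ are chosen here just for illustration):

```python
import numpy as np

def build_companion(c):
    """Companion matrix (numerical analysis form) for f(x) = sum_i c[i] x^i,
    where c = [c_0, c_1, ..., c_n] and c[-1] != 0."""
    n = len(c) - 1
    C = np.zeros((n, n))
    # Top row: -c_{n-1}/c_n, -c_{n-2}/c_n, ..., -c_0/c_n
    C[0, :] = [-c[n - 1 - j] / c[n] for j in range(n)]
    # Ones on the subdiagonal, zeros elsewhere
    C[1:, :-1] = np.eye(n - 1)
    return C

c = [-6.0, 11.0, -6.0, 1.0]              # coefficients c_0, c_1, c_2, c_3
C = build_companion(c)
print(np.sort(np.linalg.eigvals(C)))     # approximately [1. 2. 3.]
print(np.sort(np.roots(c[::-1])))        # same roots from NumPy's own root finder
```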
The companion matrix for $f(x) = \sum^n _ {i = 0} c _ i x^i$ is defined as
\[C = \begin{bmatrix} -\frac{c_{n-1} }{c_n} & -\frac{c_{n-2} }{c_n} & \cdots & -\frac{c_1}{c_n} & -\frac{c_0}{c_n} \\\\ 1 & 0 & \cdots & 0 & 0 \\\\ 0 & 1 & \cdots & 0 & 0 \\\\ \vdots & \vdots & \ddots & \vdots & \vdots \\\\ 0 & 0 & \cdots & 1 & 0 \end{bmatrix}\]
How can you quickly justify that the eigenvalues of $C$ are the roots of $f$?
If $f(\lambda) = 0$, then $Cx = \lambda x$ where
\[x = \begin{bmatrix}\lambda^{n-1} \\\\ \lambda^{n-2} \\\\ \vdots \\\\ \lambda \\\\ 1\end{bmatrix}\]since each row below the top picks out the entry of $x$ directly above it, which raises the power of $\lambda$ by one (i.e. multiplies that entry by $\lambda$), and the top row gives $-\sum^{n-1} _ {i = 0} \frac{c _ {i} }{c _ n} \lambda^i = \lambda^n$, which is exactly the top entry of $\lambda x$; the equality holds because $f(\lambda) = 0$ gives $c _ n \lambda^n = -\sum^{n-1} _ {i = 0} c _ i \lambda^i$.
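For a concrete check (the quadratic here is chosen just for illustration), take $f(x) = x^2 - 3x + 2 = (x - 1)(x - 2)$, so $C = \begin{bmatrix} 3 & -2 \\\\ 1 & 0 \end{bmatrix}$. With $\lambda = 2$ the vector above is $x = \begin{bmatrix} 2 \\\\ 1 \end{bmatrix}$, and indeed
\[Cx = \begin{bmatrix} 3 & -2 \\\\ 1 & 0 \end{bmatrix} \begin{bmatrix} 2 \\\\ 1 \end{bmatrix} = \begin{bmatrix} 4 \\\\ 2 \end{bmatrix} = 2 \begin{bmatrix} 2 \\\\ 1 \end{bmatrix}.\]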