Notes - Linear Algebra MT23, Minimal and characteristic polynomials


Flashcards

Let $A$ be a square matrix over a field $\mathbb F$. What is the minimal polynomial for $A$, denoted $m _ A(x)$?


The monic polynomial $m _ A(x)$ of least degree such that $m _ A(A) = 0$.
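
For example, if $A = \operatorname{diag}(1, 1, 2)$ then $\chi _ A(x) = (x-1)^2(x-2)$ but $m _ A(x) = (x-1)(x-2)$: we have $(A - I)(A - 2I) = 0$, and no monic polynomial of degree one annihilates $A$ since $A$ is not a scalar matrix.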

Quickly prove that for any linear transformation $A \in \text{End}(V)$, there exists an annihilating polynomial $f(x)$.


Note that $\dim _ {\mathbb F} \text{End}(V) = n^2$ where $n = \dim V$, so the $n^2 + 1$ elements $I, A, A^2, \ldots, A^{n^2}$ are linearly dependent. A non-trivial linear combination of them equal to $0$ then gives an annihilating polynomial (of degree at most $n^2$).
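
This argument is constructive enough to compute with. Here is a minimal SymPy sketch (the helper `min_poly_coeffs` and the example matrix are my own illustration, not from the notes) that finds the first linear dependence among $I, A, A^2, \ldots$ and thereby recovers $m _ A$:

```python
import sympy as sp

def min_poly_coeffs(A):
    """Coefficients [c_0, ..., c_{k-1}, 1] of the minimal polynomial of A,
    found as the first linear dependence among the vectorised powers I, A, A^2, ..."""
    n = A.rows
    powers = [sp.eye(n).vec()]                    # vectorised I
    for k in range(1, n * n + 1):
        powers.append((A ** k).vec())             # vectorised A^k
        null = sp.Matrix.hstack(*powers).nullspace()
        if null:                                  # first dependence found
            v = null[0]
            return [c / v[k] for c in v]          # normalise to a monic polynomial
    return None                                   # unreachable: dependence occurs by degree n^2

A = sp.Matrix([[2, 1, 0], [0, 2, 0], [0, 0, 3]])
print(min_poly_coeffs(A))   # [-12, 16, -7, 1], i.e. m_A(x) = (x - 2)^2 (x - 3)
```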

Quickly prove that if $f(x)$ annihilates $A$, then it also annihilates all matrices conjugate to $A$.


\[(P^{-1}AP)^i = P^{-1}A^iP\]

so

\[f(P^{-1}AP) = P^{-1}f(A)P = 0\]
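
As a concrete check (the specific $A$, $P$ and $f$ below are arbitrary choices of mine), SymPy confirms both identities:

```python
import sympy as sp

A = sp.Matrix([[1, 2], [3, 4]])
P = sp.Matrix([[1, 1], [0, 1]])               # any invertible P
B = P.inv() * A * P

# (P^{-1} A P)^i = P^{-1} A^i P for each power i
assert all(B ** i == P.inv() * A ** i * P for i in range(5))

# hence f(P^{-1} A P) = P^{-1} f(A) P; here f(x) = x^2 - 5x - 2 = chi_A(x)
f = lambda M: M ** 2 - 5 * M - 2 * sp.eye(2)
assert f(B) == P.inv() * f(A) * P
assert f(A) == sp.zeros(2, 2) and f(B) == sp.zeros(2, 2)   # both annihilated
```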

Suppose $A \in \text{End}(V)$ and $f(x)$ annihilates $A$. What can you say about the relationship between $f(x)$ and $m _ A(x)$?


\[m_A(x) \mid f(x)\]

Quickly prove that if $A \in \text{End}(V)$ and $f(x)$ is an annihilating polynomial for $A$, then $m _ A(x) \mid f(x)$.


Use the division algorithm to write $f(x) = q(x) m _ A(x) + r(x)$ with $\deg r < \deg m _ A$. Evaluating at $A$,

\[0 = f(A) = q(A)m_A(A) + r(A) = r(A)\]

and since $\deg r < \deg m _ A$, minimality of $m _ A$ forces $r(x) = 0$, so $m _ A(x) \mid f(x)$.

Suppose $A \in \text{End}(V)$. The polynomials that annihilate $A$ form a principal ideal, namely the kernel of which homomorphism?


\[\mathbb F[x] \to M_{n \times n} (\mathbb F)\]

where

\[f(x) \mapsto f(A)\]
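
A small SymPy sketch of this evaluation map (the helper name `evaluate_at` and the example matrix are mine): the polynomials in the kernel are exactly the multiples of $m _ A$.

```python
import sympy as sp

x = sp.symbols('x')

def evaluate_at(f, A):
    """The evaluation homomorphism F[x] -> M_n(F): substitute the matrix A for x."""
    coeffs = sp.Poly(f, x).all_coeffs()          # highest-degree coefficient first
    result = sp.zeros(A.rows, A.rows)
    for c in coeffs:                             # Horner's scheme
        result = result * A + c * sp.eye(A.rows)
    return result

A = sp.Matrix([[0, 1], [-1, 0]])                 # m_A(x) = x^2 + 1
print(evaluate_at(x**2 + 1, A))                  # zero matrix: in the kernel
print(evaluate_at((x**2 + 1) * (x - 3), A))      # still zero: any multiple of m_A
print(evaluate_at(x - 3, A))                     # non-zero: not in the kernel
```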

What relationship links the roots of $\chi _ A(t)$ and $m _ A(t)$?


\[\lambda \text{ root of } \chi_A(t) \iff \lambda \text{ root of } m_A(t)\]

Quickly prove that for $A : V \to V$

\[\lambda \text{ root of } \chi_A(x) \iff \lambda \text{ root of } m_A(x)\]

Forward direction: since $\lambda$ is a root of $\chi _ A(x)$, it is an eigenvalue of $A$, so there exists $v \ne 0$ such that $Av = \lambda v$. Applying the polynomial $m _ A$ and using $A^k v = \lambda^k v$, we get $m _ A(A)v = m _ A(\lambda)v$. But $m _ A(A) = 0$, so $m _ A(\lambda)v = 0$, and since $v \ne 0$ this forces $m _ A(\lambda) = 0$.

Backward direction: suppose $\lambda$ is a root of $m _ A(x)$. Then $m _ A(x) = (x - \lambda) g(x)$ where $g(A) \ne 0$ (by minimality of $m _ A$), so there exists $w$ with $g(A)w \ne 0$. Then note

\[(A - \lambda I) g(A)w = m_A(A)w = 0\]

so $g(A)w$ is an eigenvector of $A$ with eigenvalue $\lambda$, and hence $\lambda$ is a root of $\chi _ A(x)$.

Suppose $\lambda$ is an eigenvalue of $A$, i.e. $\exists v \ne 0$ where $Av = \lambda v$. Then, for a polynomial $f(x)$, what can you say about the relationship between $f(A)$ and $f(\lambda)$?


\[f(A)v = f(\lambda)v\]
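
A quick SymPy check on an arbitrary example (the matrix, eigenvector and $f$ are my own choices):

```python
import sympy as sp

A = sp.Matrix([[2, 1], [0, 3]])
v = sp.Matrix([1, 0])                   # A v = 2 v, so lambda = 2
assert A * v == 2 * v

fA = A ** 2 + sp.eye(2)                 # f(x) = x^2 + 1 evaluated at A
assert fA * v == (2 ** 2 + 1) * v       # f(A) v = f(lambda) v = 5 v
```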

Suppose:

  • $A$ is a square matrix
  • $f(x)$ is a polynomial

What can you deduce about $f(A^\top)$, and what fact does this let you deduce about the minimal polynomial?


\[f(A^\top) = (f(A))^\top\]

Thus

\[f(A) = 0 \iff f(A^\top) = 0\]

Hence

\[m_A(x) = m_{A^\top}(x)\]
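
A one-line SymPy confirmation on a sample matrix and polynomial (both chosen arbitrarily here):

```python
import sympy as sp

A = sp.Matrix([[1, 2], [0, 3]])
f = lambda M: M ** 3 - 2 * M + 5 * sp.eye(2)    # f(x) = x^3 - 2x + 5

# transposition commutes with polynomial evaluation,
# so f(A) = 0 iff f(A^T) = 0, and hence m_A = m_{A^T}
assert f(A.T) == f(A).T
```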

Quickly prove that the companion matrix of $f(x) = a _ 0 + \cdots + a _ {n-1}x^{n-1} + x^n$, given by

\[C = \left(\begin{array}{ccccc} 0 & 1 & 0 & \cdots & 0 \\\\ 0 & 0 & 1 & \cdots & 0 \\\\ \vdots & \vdots & \vdots & \ddots & \vdots \\\\ 0 & 0 & 0 & \cdots & 1 \\\\ -a_0 & -a_1 & -a_2 & \cdots & -a_{n-1} \end{array}\right)\]

has minimal polynomial $f(x)$.


Consider

\[\begin{pmatrix}1 & 0 & \cdots & 0 \end{pmatrix} \sum^{n-1}_ {k=0} b_ k C^k = \begin{pmatrix}b_ 0 & b_ 1 & \cdots & b_ {n-1} \end{pmatrix}\]

so $g(C) \ne 0$ for every non-zero polynomial $g$ with $\deg g < n$; in particular $\deg m _ C \ge n$. Since $m _ C \mid \chi _ C$ and $\deg \chi _ C = n$, the minimal polynomial is monic of degree exactly $n$, say $m _ C(x) = b _ 0 + b _ 1 x + \cdots + b _ {n-1}x^{n-1} + x^n$. Since

\[\begin{pmatrix}1 & 0 & \cdots & 0 \end{pmatrix} C^n = \begin{pmatrix}-a_ 0& -a_ 1 & \cdots & -a_ {n-1} \end{pmatrix}\]

applying $\begin{pmatrix}1 & 0 & \cdots & 0 \end{pmatrix}$ to $m _ C(C) = 0$ gives $b _ i = a _ i$ for each $i$, so the minimal polynomial and $f$ agree.
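
Here is a short SymPy illustration for a particular cubic (the `companion` helper and the choice of $f$ are mine): $f(C) = 0$, while the row vector $e _ 1$ applied to $I, C, C^2$ gives the standard basis vectors, so no non-zero polynomial of degree less than $3$ can annihilate $C$.

```python
import sympy as sp

def companion(coeffs):
    """Companion matrix of f(x) = x^n + a_{n-1} x^{n-1} + ... + a_0,
    with coeffs = [a_0, ..., a_{n-1}], laid out as in the question."""
    n = len(coeffs)
    C = sp.zeros(n, n)
    for i in range(n - 1):
        C[i, i + 1] = 1                      # superdiagonal of ones
    for j, a in enumerate(coeffs):
        C[n - 1, j] = -a                     # last row: -a_0, -a_1, ..., -a_{n-1}
    return C

a0, a1, a2 = 6, -5, -2                       # f(x) = x^3 - 2x^2 - 5x + 6
C = companion([a0, a1, a2])

# f annihilates C ...
assert C ** 3 + a2 * C ** 2 + a1 * C + a0 * sp.eye(3) == sp.zeros(3, 3)

# ... but nothing of lower degree does: e_1 C^k = e_{k+1}
e1 = sp.Matrix([[1, 0, 0]])
print(e1, e1 * C, e1 * C ** 2)               # (1 0 0), (0 1 0), (0 0 1)
```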

Suppose:

  • $V$ is a finite dimensional vector space over $\mathbb C$
  • $T _ 1 : V \to V$, $T _ 2 : V \to V$ are diagonalisable linear transformations
  • $T _ 1 T _ 2 = T _ 2 T _ 1$

Quickly prove that $T _ 1 + T _ 2$ is also diagonalisable.


Let $\lambda _ 1, \cdots, \lambda _ k$ be the eigenvalues of $T _ 1$. Then we have the decomposition

\[V = V_{\lambda_1} \oplus \cdots \oplus V_{\lambda_k}\]

If $v \in V _ {\lambda _ i}$, then $T _ 1(T _ 2 v) = T _ 2(T _ 1 v) = \lambda _ i T _ 2(v)$.

In other words, $T _ 2(V _ {\lambda _ i}) \subseteq V _ {\lambda _ i}$. So we may consider the restriction

\[T_2 |_{V_{\lambda_i} } : V_{\lambda_i} \to V_{\lambda_i}\]

But this restriction is also diagonalisable: its minimal polynomial divides $m _ {T _ 2}$, which is a product of distinct linear factors, so the restriction's minimal polynomial is too. Hence $T _ 2$ is diagonalisable on $V _ {\lambda _ i}$.

On $V _ {\lambda _ i}$, $T _ 1$ acts as $\lambda _ i I$, so $T _ 1 + T _ 2$ restricts to $\lambda _ i I + T _ 2 | _ {V _ {\lambda _ i} }$. Any basis of $V _ {\lambda _ i}$ that diagonalises $T _ 2$ therefore also diagonalises $T _ 1 + T _ 2$ on $V _ {\lambda _ i}$.

By doing this for each subspace in the decomposition $V = V _ {\lambda _ 1} \oplus \cdots \oplus V _ {\lambda _ k}$, we see that $T _ 1 + T _ 2$ is diagonalisable.
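
A quick SymPy sanity check on a concrete pair (the two matrices are arbitrary commuting diagonalisable examples, not from the notes):

```python
import sympy as sp

T1 = sp.Matrix([[0, 1], [1, 0]])        # eigenvalues +1 and -1, diagonalisable
T2 = sp.Matrix([[2, 3], [3, 2]])        # real symmetric, diagonalisable

assert T1 * T2 == T2 * T1               # they commute
assert T1.is_diagonalizable() and T2.is_diagonalizable()
assert (T1 + T2).is_diagonalizable()    # so the sum is diagonalisable

P, D = (T1 + T2).diagonalize()
print(D)                                # diagonal with eigenvalues -2 and 6
```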

Suppose that we have a matrix

\[\begin{pmatrix} i & 1 \\\\ 0 & i \end{pmatrix}\]

over a basis $\{u, v\}$ which, when viewed as a real matrix over the basis $\{u, iu, v, iv\}$, is given by

\[\begin{pmatrix} 0 & -1 & 1 & 0 \\\\ 1 & 0 & 0 & 1 \\\\ 0 & 0 & 0 & -1 \\\\ 0 & 0 & 1 & 0 \end{pmatrix}\]

The minimal polynomial of the small matrix is $(x - i)^2$. How can you quickly deduce the minimal polynomial of the big matrix over $\mathbb R$?


Note that $(x-i)^2 (x+i)^2 = (x^2 + 1)^2$ is a real polynomial that annihilates $T$, since it is a multiple of the complex minimal polynomial $(x - i)^2$. So the real minimal polynomial must divide $(x^2 + 1)^2$, whose only monic real divisors are $1$, $x^2 + 1$ and $(x^2 + 1)^2$. It cannot be $x^2 + 1$, since $T^2 + I \ne 0$, so it equals $(x^2 + 1)^2$.
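
This is easy to verify directly in SymPy for the $4 \times 4$ real matrix above:

```python
import sympy as sp

T = sp.Matrix([
    [0, -1, 1,  0],
    [1,  0, 0,  1],
    [0,  0, 0, -1],
    [0,  0, 1,  0],
])
I = sp.eye(4)

assert (T ** 2 + I) ** 2 == sp.zeros(4, 4)       # (x^2 + 1)^2 annihilates T
assert T ** 2 + I != sp.zeros(4, 4)              # but x^2 + 1 does not
print(T.charpoly(sp.Symbol('x')).as_expr())      # x**4 + 2*x**2 + 1 = (x^2 + 1)^2
```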

Quickly prove that if $f(A) = 0$ then $m _ A \mid f$ and show that $m _ A$ is unique.


  • The first part follows from the division algorithm: write $f = q m _ A + r$ with $\deg r < \deg m _ A$; then $r(A) = f(A) - q(A)m _ A(A) = 0$, so minimality of $m _ A$ forces $r = 0$.
  • For uniqueness, note that if $m _ A$ and $m _ A'$ are two minimal polynomials, then $m _ A \mid m _ A'$ and $m _ A' \mid m _ A$, and since both are monic they are equal.


