Notes - Linear Algebra II HT23, Eigenvectors and eigenvalues
Flashcards
Let $V$ be a vector space over $\mathbb R$ and let $T : V \to V$ be a linear map. What is the definition of an eigenvector of $T$?
A nonzero vector $v \in V$ is an eigenvector of $T$ if
\[Tv = \lambda v\]for some $\lambda \in \mathbb R$.
Let $V$ be a vector space over $\mathbb R$ and let $T : V \to V$ be a linear map. What is the definition of an eigenvalue of $T$?
$\lambda \in \mathbb R$ is an eigenvalue of $T$ if there is a $v \in V$ such that $v \ne 0$ and $Tv = \lambda v$.
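For example (an illustrative matrix, not one of the original cards), if $T$ is given by
\[A = \begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix}, \qquad A \begin{pmatrix} 1 \\ 0 \end{pmatrix} = \begin{pmatrix} 2 \\ 0 \end{pmatrix} = 2 \begin{pmatrix} 1 \\ 0 \end{pmatrix},\]
then $2$ is an eigenvalue of $T$ with eigenvector $(1, 0)^T$.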
What’s a quick proof that $\lambda$ is an eigenvalue of $T$ if and only if $\ker (T - \lambda I)\ne \{ 0 \}$?
$\lambda$ is an eigenvalue of $T$ iff there is some $v \ne 0$ with $Tv = \lambda v$, i.e. $(T - \lambda I)v = 0$; such a nonzero $v$ exists exactly when $\ker (T - \lambda I) \ne \{ 0 \}$.
What is the characteristic polynomial of a matrix $A \in \mathbb R^{n \times n}$?
$\chi _ A (t) = \det (A - t I)$.
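As a quick worked example (added for illustration):
\[A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}, \qquad \chi_A(t) = \det \begin{pmatrix} 2 - t & 1 \\ 1 & 2 - t \end{pmatrix} = (2 - t)^2 - 1 = (t - 1)(t - 3),\]
so the eigenvalues of $A$ are $1$ and $3$.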
What’s the standard way of finding the eigenvalues of a linear transformation/matrix $T$?
Finding the roots of the characteristic polynomial $\chi _ T(t)$.
What are the roots of the characteristic polynomial $\chi _ A(t)$?
The eigenvalues of $A$.
If $\lambda _ 1, \ldots, \lambda _ n$ are the eigenvalues of $A$, then what is $\det A$?
\[\det A = \prod_{i=1}^n \lambda_i\]
If $\lambda _ 1, \ldots, \lambda _ n$ are the eigenvalues of $A$, then what is $\text{Tr}(A)$?
\[\text{Tr}(A) = \sum_{i=1}^n \lambda_i\]
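Continuing the worked example above, with eigenvalues $1$ and $3$:
\[\text{Tr}(A) = 2 + 2 = 4 = 1 + 3, \qquad \det A = 2 \cdot 2 - 1 \cdot 1 = 3 = 1 \cdot 3.\]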
What do the first two and last terms of the characteristic polynomial $\chi _ A(t)$ look like for some matrix $A \in \mathbb R^{n \times n}$?
\[\chi_A(t) = (-1)^n t^n + (-1)^{n-1}\text{Tr}(A)\, t^{n-1} + \cdots + \det A\]
If $V$ has a basis consisting of eigenvectors of a linear map $T : V \to V$, what does the matrix for $T$ look like with respect to that basis?
It is diagonal, with the corresponding eigenvalues along the diagonal.
Can you state what it means for a linear map $T : V \to V$ to be diagonalisable?
$V$ has a basis consisting of eigenvectors of $T$.
Can you compare the condition for a matrix $A$ to be diagonalisable or invertible in terms of its eigenvalues?
- Invertible: all eigenvalues are nonzero.
- Diagonalisable: there are enough linearly independent eigenvectors to form a basis for $V$ (neither condition implies the other; see the examples below).
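For instance (standard examples, added here for illustration):
\[\begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} \text{ is invertible but not diagonalisable,} \qquad \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} \text{ is diagonalisable but not invertible.}\]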
If a matrix $A \in \mathbb R^{n \times n}$ has $m$ distinct eigenvalues $\lambda _ 1, \ldots, \lambda _ m$ ($m \le n$), then what is true about the corresponding eigenvectors?
They are linearly independent.
If a matrix $A \in \mathbb{R}^{n\times n}$ is diagonalisable, what notation is commonly used for the diagonal matrix consisting of the eigenvalues of $A$ on the diagonal?
$\Lambda$, i.e. $\Lambda = \text{diag}(\lambda _ 1, \ldots, \lambda _ n)$.
Let $A \in \mathbb R^{n\times n}$ be a diagonalisable matrix (i.e. it has enough eigenvectors to form a basis for $\mathbb R^n$), $S$ be the matrix consisting of the corresponding eigenvectors as columns, and $\Lambda$ be the matrix with the eigenvalues of $A$ along the diagonal. How can you relate $\Lambda$ and $A$?
\[\Lambda = S^{-1}AS, \quad \text{equivalently} \quad A = S \Lambda S^{-1}\]
Let $A \in \mathbb R^{n\times n}$ be a diagonalisable matrix (i.e. it has enough eigenvectors to form a basis for $\mathbb R^n$), $S$ be the matrix consisting of the corresponding eigenvectors as columns, and $\Lambda$ be the matrix with the eigenvalues of $A$ along the diagonal. What’s a quick proof that
\[\Lambda = S^{-1}AS\]
If $v _ i$ is the $i$-th column of $S$, then the $i$-th column of $AS$ is $Av _ i = \lambda _ i v _ i$, which is also the $i$-th column of $S\Lambda$. Hence $AS = S\Lambda$, and since the eigenvector columns form a basis, $S$ is invertible, giving $\Lambda = S^{-1}AS$.
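Continuing the worked example $A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$ from above, the eigenvectors $(1, -1)^T$ and $(1, 1)^T$ (for $\lambda = 1$ and $\lambda = 3$) give
\[S = \begin{pmatrix} 1 & 1 \\ -1 & 1 \end{pmatrix}, \qquad AS = \begin{pmatrix} 1 & 3 \\ -1 & 3 \end{pmatrix} = S \begin{pmatrix} 1 & 0 \\ 0 & 3 \end{pmatrix} = S\Lambda.\]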
What is the definition of $E _ \lambda$, the eigenspace corresponding to $\lambda$?
\[E _ \lambda = \ker (T - \lambda I) = \{ v \in V : Tv = \lambda v \}\]
What is the geometric multiplicity $g _ \lambda$ of an eigenvalue $\lambda$?
The dimension of $E _ \lambda$, the eigenspace of $\lambda$.
What is the algebraic multiplicity $a _ \lambda$ of an eigenvalue $\lambda$?
The multiplicity of $\lambda$ as a root of $\chi _ A(t)$.
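A standard example (included here for illustration) where the two multiplicities differ:
\[A = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}, \qquad \chi_A(t) = (1 - t)^2, \qquad E_1 = \ker(A - I) = \text{span}\left\{ \begin{pmatrix} 1 \\ 0 \end{pmatrix} \right\},\]
so $a _ 1 = 2$ but $g _ 1 = 1$.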
What are the two ways that a matrix can fail to be diagonalisable over $\mathbb F$?
- $\chi _ A(t)$ doesn’t have $n$ roots (counted with multiplicity) over $\mathbb F$.
- $g _ \lambda < a _ \lambda$ for some eigenvalue $\lambda$ (this failure does not depend on $\mathbb F$); see the examples below.
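For instance (illustrative examples): the rotation matrix $\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$ has
\[\chi(t) = t^2 + 1,\]
which has no roots over $\mathbb R$ (though it has roots $\pm i$ over $\mathbb C$), while $\begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$ has $g _ 1 = 1 < 2 = a _ 1$ and so is not diagonalisable over any field.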
What’s the only way a matrix can fail to be diagonalisable over $\mathbb C$?
$g _ \lambda < a _ \lambda$ for some eigenvalue $\lambda$.
When proving that if a matrix/linear transformation $T$ has eigenvectors $v _ 1, \ldots, v _ m$ corresponding to $m$ distinct eigenvalues $\lambda _ 1, \ldots, \lambda _ m$, then $v _ 1, \ldots, v _ m$ are linearly independent, you argue by contradiction. What set of linearly dependent vectors do you apply the linear transformation $T - \lambda _ k I$ to?
\[\{ v _ 1, \ldots, v _ k \}\]where $k \le m$ is the smallest $k$ such that this set is linearly dependent.
What two facts allow you to prove that if a linear transformation is diagonalisable, then for each eigenvalue $\lambda$, the geometric multiplicity $g _ \lambda$ is equal to the algebraic multiplicity $a _ \lambda$?
$g _ \lambda \le a _ \lambda$ for each eigenvalue $\lambda$, and
\[g_{\lambda_1} + \cdots + g_{\lambda_n} = a_{\lambda_1} + \cdots + a_{\lambda_n} = \dim V\]
What’s the basic proof idea for proving that for any eigenvalue of $T$, the geometric multiplicity $g _ \lambda$ is less than or equal to the algebraic multiplicity $a _ \lambda$?
Consider a basis for $E _ \lambda$, extend it to a basis for $V$, and then consider the matrix of $T$ with respect to this basis and the characteristic polynomial $\det([T] - tI)$.
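Sketching the key step: if $g = g _ \lambda$ and the first $g$ basis vectors span $E _ \lambda$, the matrix of $T$ in this basis is block upper triangular,
\[[T] = \begin{pmatrix} \lambda I_g & B \\ 0 & C \end{pmatrix}, \qquad \chi_T(t) = \det([T] - tI) = (\lambda - t)^g \chi_C(t),\]
so $(\lambda - t)^{g _ \lambda}$ divides $\chi _ T(t)$ and hence $g _ \lambda \le a _ \lambda$.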
Proofs
Prove that if $\ker (T - \lambda I) \ne \{ 0 \}$, then $T - \lambda I$ is not invertible.
Todo.
Prove that the characteristic polynomial $\chi _ T(t)$ of a linear transformation $T$ is well-defined, i.e. does not depend on the choice of basis used to represent $T$ as a matrix.
Todo.
Prove that if $A \in \mathbb{R}^{n \times n}$, then $\chi _ A(t)$ has the form
\[\chi_A(t) = (-1)^n t^n + (-1)^{n-1}\text{Tr}(A)\, t^{n-1} + \cdots + \det A\]
Todo.
Prove that if a matrix $A$ has eigenvalues $\lambda _ 1, \ldots, \lambda _ n$, then
\[\text{Tr}(A) = \sum_{i=1}^n \lambda_i\]
Todo.
Prove that if a matrix $A$ has eigenvalues $\lambda _ 1, \ldots, \lambda _ n$, then
\[\det A = \prod_{i=1}^n \lambda_i\]
Todo.
Prove that if a matrix/linear transformation $T$ has eigenvectors $v _ 1, \ldots, v _ m$ corresponding to $m$ distinct eigenvalues $\lambda _ 1, \ldots, \lambda _ m$, then $v _ 1, \ldots, v _ m$ are linearly independent.
Todo.
Prove that a matrix $A \in \mathbb R^{n \times n}$ is diagonalisable if and only if there exists an invertible matrix $S$ such that $\Lambda = S^{-1}AS$ where $\Lambda$ is diagonal.
Todo.
Prove that the geometric multiplicity $g _ \lambda$ of an eigenvalue $\lambda$ is always less than or equal to the algebraic multiplicity, $a _ \lambda$.
Todo.
Prove that if a linear transformation $T$ is diagonalisable, then for each eigenvalue $\lambda$, the geometric multiplicity $g _ \lambda$ is equal to the algebraic multiplicity $a _ \lambda$.
Todo.