Linear Algebra II HT23, Spectral theorem
Flashcards
Let $A$ be an $n\times n$ real symmetric matrix. How many roots does $\chi _ A(t)$ have, counted with multiplicity, and hence how many eigenvalues?
$\chi _ A(t)$ has degree $n$, so it has $n$ roots counted with multiplicity; since $A$ is symmetric, these roots are all real, and hence $A$ has $n$ real eigenvalues counted with multiplicity.
When proving that $A \in \mathcal{S}^n$ has a characteristic polynomial with $n$ real roots, you argue by contradiction. How do you rearrange $\overline{\mathbf v} \cdot \mathbf A \overline{ \mathbf v }$, using the fact that $\mathbf A$ is symmetric?
Can you state the spectral theorem for real symmetric matrices?
A real symmetric matrix $A \in \mathcal{S}^n$ has $n$ real eigenvalues and there exists an orthonormal basis for $\mathbb R^n$ consisting of eigenvectors for $A$.
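As a concrete sanity check (an illustrative example, not part of the original card): for
\[A = \begin{pmatrix}2 & 1 \\\\1 & 2\end{pmatrix}, \qquad \chi _ A(t) = (t-2)^2 - 1 = (t-1)(t-3)\]
the eigenvalues $1$ and $3$ are both real, with orthonormal eigenvectors
\[v _ 1 = \frac{1}{\sqrt 2}\begin{pmatrix}1 \\\\-1\end{pmatrix}, \qquad v _ 2 = \frac{1}{\sqrt 2}\begin{pmatrix}1 \\\\1\end{pmatrix}\]
so $v _ 1, v _ 2$ form an orthonormal eigenbasis of $\mathbb R^2$, as the theorem predicts.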
If $A \in \mathcal S^n$ is a symmetric $n \times n$ matrix, then the spectral theorem states that $\mathbb R^n$ has an orthonormal basis consisting of eigenvectors of $A$. If you let the matrix whose columns are these eigenvectors be $P$, so that $\Lambda = P^{-1}AP$ is diagonal, what is true about $P$, since its columns are orthonormal vectors?
$P$ is orthogonal: $P^\intercal P = I$, so $P^{-1} = P^\intercal$ and hence $\Lambda = P^\intercal A P$.
Let $V$ be a real vector space with inner product $\langle \cdot, \cdot \rangle$. What does it mean for a linear map $T$ to be self-adjoint (or symmetric)?
For all $u, v \in V$
\[\langle Tu, v \rangle = \langle u, T v \rangle\]
Can you state the spectral theorem for self-adjoint operators on a real inner product space?
A self-adjoint map $T$ on a finite dimensional real inner product space $V$ has real eigenvalues and there exists an orthonormal basis for $V$ consisting of eigenvectors of $T$.
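With the standard dot product on $\mathbb R^n$, this recovers the matrix statement: the map $x \mapsto Ax$ is self-adjoint precisely when $A$ is symmetric, since
\[\langle Au, v \rangle = (Au)^\intercal v = u^\intercal A^\intercal v = u^\intercal A v = \langle u, Av \rangle\]
where the third equality uses $A^\intercal = A$.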
When proving that if $\mathbf A \in \mathbb{R}^{n\times n}$ is a symmetric matrix, then it has real eigenvalues, what are the two ways you rearrange
\[(Av)^\intercal \overline{v}\]
when $v$ is an eigenvector with eigenvalue $\lambda$?
On one hand
\[\begin{aligned} (Av)^\intercal \overline{v} &= v^\intercal A^\intercal \overline{v} \\\\ &= v^\intercal A \overline{v} \\\\ &= v^\intercal \overline{Av} \\\\ &= \overline{\lambda} v^\intercal \overline{v} \end{aligned}\]
but also
\[(Av)^\intercal \overline{v} = \lambda v^\intercal \overline{v}\]
Equating the two gives $\lambda v^\intercal \overline{v} = \overline{\lambda} v^\intercal \overline{v}$, and since $v^\intercal \overline{v} = \sum _ i \lvert v _ i \rvert^2 > 0$, this forces $\lambda = \overline{\lambda}$, i.e. $\lambda \in \mathbb R$.
When proving the spectral theorem for a real symmetric matrix $A$, you need to show that
- $A$ has real eigenvalues
- There exists an orthonormal basis of $\mathbb R^n$ consisting of eigenvectors of $A$.
What equivalent condition to (2) do we use when proving the spectral theorem?
There exists an orthogonal matrix $R$ such that $R^{-1} A R$ is diagonal.
When proving the spectral theorem for a real symmetric matrix $A$ in an inner product space $V$, how do you construct an orthonormal basis $v _ 1, \ldots, v _ n$ that you then go on to show actually consists of eigenvectors?
Pick any eigenvalue $\lambda _ 1$ of $A$ and a corresponding unit eigenvector $v _ 1$. Then arbitrarily extend $v _ 1$ to a basis of $V$ and apply the Gram-Schmidt procedure to get an orthonormal basis.
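The extend-then-orthonormalise step can be sketched in code. A minimal stdlib-only Python sketch (the helper names `dot` and `gram_schmidt` are illustrative, not from the course):

```python
from math import sqrt

def dot(u, v):
    """Standard inner product on R^n."""
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    """Turn a basis (a list of lists) into an orthonormal basis."""
    basis = []
    for v in vectors:
        # Subtract the projections onto the vectors found so far.
        w = list(v)
        for e in basis:
            c = dot(w, e)
            w = [wi - c * ei for wi, ei in zip(w, e)]
        norm = sqrt(dot(w, w))
        basis.append([wi / norm for wi in w])
    return basis

# Extend the unit eigenvector (1/sqrt(2), 1/sqrt(2)) to a basis of R^2
# by appending an arbitrary vector, then orthonormalise.
v1 = [1 / sqrt(2), 1 / sqrt(2)]
e1, e2 = gram_schmidt([v1, [1.0, 0.0]])
```

Because $v _ 1$ is already a unit vector and is listed first, Gram-Schmidt leaves it unchanged, so the resulting orthonormal basis still begins with the eigenvector.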
When proving the spectral theorem for a symmetric matrix $A$ in an inner product space $V$, you can construct an orthonormal basis $v _ 1, \ldots, v _ n$ for $V$ where $v _ 1$ is a unit eigenvector of $A$. How do you define $B$, and why does this help you prove the theorem by induction?
\[B = P^\intercal A P = \begin{pmatrix}\lambda_1 & 0 \\\\0 & C\end{pmatrix}\]
where
\[P = [\pmb v_1, \ldots, \pmb v_n]\]
and $C$ is an $(n-1) \times (n-1)$ symmetric matrix we wish to diagonalise. By induction, we can assume that there exists an orthogonal matrix $Q$ such that $Q^{-1} C Q$ is diagonal, and then use this to construct an orthogonal matrix $R$ such that $R^{-1} A R$ is diagonal.
When proving the spectral theorem for a symmetric matrix $A$, you define $B$ as
\[B = P^\intercal A P = \begin{pmatrix}\lambda_1 & 0 \\\\0 & C\end{pmatrix}\]
where
\[P = [\pmb v_1, \ldots, \pmb v_n] \text{ (orthonormal extension of the unit eigenvector } \pmb v_1 \text{)}\]
and $C$ is an $(n-1) \times (n-1)$ symmetric matrix we wish to diagonalise. By the induction hypothesis (the spectral theorem in dimension $n-1$), what can we assume about $C$?
There exists an orthogonal matrix $Q$ such that
\[Q^{-1} C Q = D\]
where $D$ is diagonal.
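Why $B$ has this block shape (a short justification along the lines of the proof): the first column of $B$ is
\[B e_1 = P^\intercal A P e_1 = P^\intercal A \pmb v_1 = \lambda_1 P^\intercal \pmb v_1 = \lambda_1 e_1\]
using that the columns of $P$ are orthonormal, and $B^\intercal = (P^\intercal A P)^\intercal = P^\intercal A^\intercal P = B$, so the first row is $(\lambda_1, 0)$ as well. In particular $C$ is symmetric, which is what lets the inductive hypothesis apply to it.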
When proving the spectral theorem for a symmetric matrix $A$, you define $B$ as
\[B = P^\intercal A P = \begin{pmatrix}\lambda_1 & 0 \\\\0 & C\end{pmatrix}\]
where
\[P = [\pmb v_1, \ldots, \pmb v_n] \text{ (orthonormal extension of the unit eigenvector } \pmb v_1 \text{)}\]
and $C$ is an $(n-1) \times (n-1)$ symmetric matrix we wish to diagonalise. By induction, we can assume that there exists an orthogonal matrix $Q$ such that $Q^{-1} C Q$ is a diagonal matrix. How do we then construct the final orthogonal matrix $R$ such that $R^{-1}AR$ is a diagonal matrix, and what two things do we need to check?
Define
\[R = P \begin{pmatrix}1 & 0 \\\\0 & Q\end{pmatrix}\]
Need to check
- $R^{-1} = R^\intercal$ (i.e. $R$ is orthogonal)
- $R^{-1} A R$ is diagonal
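With $R = P \begin{pmatrix}1 & 0 \\ 0 & Q\end{pmatrix}$, the second check is a blockwise computation (using $P^{-1} = P^\intercal$):
\[R^{-1} A R = \begin{pmatrix}1 & 0 \\\\0 & Q^{-1}\end{pmatrix} P^\intercal A P \begin{pmatrix}1 & 0 \\\\0 & Q\end{pmatrix} = \begin{pmatrix}1 & 0 \\\\0 & Q^{-1}\end{pmatrix} \begin{pmatrix}\lambda_1 & 0 \\\\0 & C\end{pmatrix} \begin{pmatrix}1 & 0 \\\\0 & Q\end{pmatrix} = \begin{pmatrix}\lambda_1 & 0 \\\\0 & Q^{-1} C Q\end{pmatrix} = \begin{pmatrix}\lambda_1 & 0 \\\\0 & D\end{pmatrix}\]
which is diagonal.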
Say you want to prove the spectral theorem for a real symmetric matrix $A$. You’ve done the step where you show it has real eigenvalues. Then what are the “ingredients” for the rest of the proof?
- $P = [\pmb v _ 1, \cdots, \pmb v _ n]$ (the matrix whose columns are an arbitrary orthonormal extension of a unit eigenvector $\pmb v _ 1$)
- $P^\intercal A P = \begin{pmatrix}\lambda _ 1 & 0 \\ 0 & C\end{pmatrix}$
- $Q^\intercal C Q = D$ (the inductive step)
- $R = P \begin{pmatrix}1 & 0 \\ 0 & Q\end{pmatrix}$
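For a $2 \times 2$ symmetric matrix the ingredients can be run numerically. A stdlib-only Python sketch (the variable names are illustrative, not from the course): it finds $\lambda _ 1$ from the characteristic polynomial, builds $P = [\pmb v _ 1, \pmb v _ 2]$ by completing a unit eigenvector to an orthonormal basis, and checks that $P^\intercal A P$ is diagonal (for $n = 2$ the block $C$ is $1 \times 1$, so no inductive step is needed).

```python
from math import sqrt

# Symmetric 2x2 matrix A = [[a, b], [b, d]].
a, b, d = 2.0, 1.0, 2.0

# Eigenvalues are roots of t^2 - (a+d)t + (ad - b^2); the discriminant
# (a-d)^2 + 4b^2 >= 0 confirms both roots are real.
disc = (a - d) ** 2 + 4 * b ** 2
lam1 = ((a + d) + sqrt(disc)) / 2

# A unit eigenvector v1 for lam1 (the b == 0 case is handled separately).
if b != 0:
    x, y = b, lam1 - a
else:
    x, y = 1.0, 0.0
n = sqrt(x * x + y * y)
v1 = (x / n, y / n)

# Complete v1 to an orthonormal basis of R^2: v2 is v1 rotated by 90 degrees.
v2 = (-v1[1], v1[0])

# P has columns v1, v2; compute B = P^T A P entrywise.
P = [[v1[0], v2[0]], [v1[1], v2[1]]]
A = [[a, b], [b, d]]
AP = [[sum(A[i][k] * P[k][j] for k in range(2)) for j in range(2)] for i in range(2)]
B = [[sum(P[k][i] * AP[k][j] for k in range(2)) for j in range(2)] for i in range(2)]
```

For this $A$, $B$ comes out as $\operatorname{diag}(3, 1)$: the top-left entry is $\lambda _ 1$ and the $1 \times 1$ block $C$ is already diagonal, exactly as in the proof.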
Proofs
Prove that if $A \in \mathcal{S}^n$ (where $\mathcal S^n$ denotes the set of real $n \times n$ symmetric matrices), then $A$ has $n$ real eigenvalues and there exists an orthonormal basis for $\mathbb R^n$ consisting of eigenvectors for $A$ (the spectral theorem).