NLA MT25, Singular value decomposition
- [[Course - Numerical Linear Algebra MT25]]
- [[Notes - NLA MT25, Eigenvalue decomposition]]
- The majority of results about SVD are contained in:
- [[Notes - Numerical Analysis HT24, Singular value decomposition]]
- See also:
- [[Notes - Machine Learning MT23, Singular value decomposition]]
Flashcards
@State the SVD theorem.
Suppose $A \in \mathbb R^{m \times n}$ is a matrix with $m \ge n$. Then $A$ has the decomposition
\[A = U \Sigma V^\top\]where $U \in \mathbb R^{m \times n}$, $\Sigma \in \mathbb R^{n \times n}$, $V \in \mathbb R^{n \times n}$, and:
- The columns of $U$ are the “left singular vectors”
- The columns of $V$ are the “right singular vectors”
- $U^\top U = V^\top V = I _ n$
- $\Sigma = \text{diag}(\sigma _ 1, \ldots, \sigma _ n)$
- $\sigma _ 1 \ge \cdots \ge \sigma _ n \ge 0$ (the “singular values”)
The ∆svd-theorem states that every matrix $A \in \mathbb R^{m \times n}$ with $m \ge n$ has the decomposition
\[A = U \Sigma V^\top\]
where:
- The columns of $U$ are the “left singular vectors”
- The columns of $V$ are the “right singular vectors”
- $U^\top U = V^\top V = I _ n$
- $\Sigma = \text{diag}(\sigma _ 1, \ldots, \sigma _ n)$
- $\sigma _ 1 \ge \cdots \ge \sigma _ n \ge 0$ (the “singular values”)
How can you interpret this geometrically?
Any matrix can be thought of as a rotation / reflection, followed by a magnification / shrinkage, followed by another rotation / reflection.
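A quick numerical sanity check of the stated properties (a sketch added here, not from the course; NumPy's `np.linalg.svd` with `full_matrices=False` returns exactly this reduced factorisation):
```python
import numpy as np

# Sketch: check the stated SVD properties on a random tall matrix (m >= n)
rng = np.random.default_rng(0)
m, n = 5, 3
A = rng.standard_normal((m, n))

# full_matrices=False gives the reduced SVD: U is m x n, s has length n, Vt is n x n
U, s, Vt = np.linalg.svd(A, full_matrices=False)

assert np.allclose(U.T @ U, np.eye(n))               # U^T U = I_n
assert np.allclose(Vt @ Vt.T, np.eye(n))             # V^T V = I_n
assert np.all(np.diff(s) <= 0) and np.all(s >= 0)    # sigma_1 >= ... >= sigma_n >= 0
assert np.allclose(U @ np.diag(s) @ Vt, A)           # A = U Sigma V^T
```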
Example SVD calculations
Find the SVD of
\[A = \begin{bmatrix}
-1 & -2 \\
2 & 1 \\
1 & 0 \\
0 & 1
\end{bmatrix}\]
Compute the Gram matrix:
\[A^\top A = \begin{bmatrix} 6 & 4 \\ 4 & 6 \end{bmatrix}\]Calculate the eigenvalues:
\[\chi _ {A^\top A}(\lambda) = (\lambda - 6)^2 - 4^2 = 0\]so the eigenvalues are $10$ and $2$.
Calculate eigenvector matrix:
\[V = \frac{1}{\sqrt 2} \begin{bmatrix} 1 & -1 \\ 1 & 1 \end{bmatrix}\]Hence
\[A^\top A = V \Sigma^2 V^\top\]where
\[\Sigma^2 = \begin{bmatrix} 10 & \\ & 2 \end{bmatrix}\]so $\Sigma = \text{diag}(\sqrt{10}, \sqrt 2)$. Calculate $U$:
\[U = AV\Sigma^{-1} = \begin{bmatrix} -3/\sqrt{20} & -1/2 \\ \phantom{-}3/\sqrt{20} & -1/2 \\ \phantom{-}1/\sqrt{20} & -1/2 \\ \phantom{-}1/\sqrt{20} & \phantom{-}1/2 \end{bmatrix}\]Then $A = U \Sigma V^\top$ as required.@example~
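A minimal NumPy check of this worked example (added here as a sketch; signs of individual singular vectors may differ from NumPy's output, so only the reconstruction and the singular values are compared):
```python
import numpy as np

# Rebuild the hand-computed factors and verify them against numpy
A = np.array([[-1., -2.],
              [ 2.,  1.],
              [ 1.,  0.],
              [ 0.,  1.]])

V = np.array([[1., -1.],
              [1.,  1.]]) / np.sqrt(2)
Sigma = np.diag([np.sqrt(10), np.sqrt(2)])
U = A @ V @ np.linalg.inv(Sigma)          # U = A V Sigma^{-1}

assert np.allclose(U @ Sigma @ V.T, A)    # reconstruction: A = U Sigma V^T
assert np.allclose(np.linalg.svd(A, compute_uv=False), [np.sqrt(10), np.sqrt(2)])
```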
@Prove the ∆svd-properties, e.g. that if $A \in \mathbb R^{m \times n}$ has the full singular value decomposition
\[A = U\Sigma V^\top\]
(with $U \in \mathbb R^{m \times m}$, $V \in \mathbb R^{n \times n}$), then:
- The rank $r$ of $A$ is the number of non-zero singular values of $A$, which is the number of non-zero entries in $\Sigma$
- A basis for the column space of $A$ is given by the first $r$ columns of $U$
- A basis for the column space of $A^\top$ is given by the first $r$ columns of $V$
- A basis for the null space of $A$ is given by the last $n - r$ columns of $V$
- A basis for the null space of $A^\top$ is given by the last $m - r$ columns of $U$
@todo
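A numerical illustration of these properties (a sketch I have added, not the requested proof), using a deliberately rank-deficient matrix and the full SVD:
```python
import numpy as np

# Build a 6 x 4 matrix of rank 2 by construction
rng = np.random.default_rng(1)
m, n, r = 6, 4, 2
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))

U, s, Vt = np.linalg.svd(A)               # full SVD: U is m x m, Vt is n x n
rank = np.sum(s > 1e-10)
assert rank == r                          # rank = number of non-zero singular values

# First r columns of U span col(A); last n - r columns of V lie in null(A);
# last m - r columns of U lie in null(A^T)
assert np.allclose(U[:, :rank] @ U[:, :rank].T @ A, A)
assert np.allclose(A @ Vt[rank:, :].T, 0)
assert np.allclose(A.T @ U[:, rank:], 0)
```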
@Prove that if $A \in \mathbb R^{m \times n}$ with SVD $A = U \Sigma V^\top$, then:
- $V$ is an eigenvector matrix of $A^\top A$
- $U$ is an eigenvector matrix of $A A^\top$
- $\sigma _ i = \sqrt{\lambda _ i (A^\top A)}$
@todo
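Not the requested proof, just a hedged numerical check of these three statements:
```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 3))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# sigma_i = sqrt(lambda_i(A^T A)), eigenvalues taken in descending order
lam = np.linalg.eigvalsh(A.T @ A)[::-1]
assert np.allclose(s, np.sqrt(lam))

# Columns of V are eigenvectors of A^T A, columns of U are eigenvectors of A A^T
assert np.allclose((A.T @ A) @ Vt.T, Vt.T @ np.diag(s**2))
assert np.allclose((A @ A.T) @ U, U @ np.diag(s**2))
```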
Suppose that $A$ is symmetric. @State a result about the connection between its eigenvalues $\lambda _ i$ and its singular values $\sigma _ i$.
@Prove that if $A$ is symmetric, then
\[\sigma _ i (A) = \vert \lambda _ i(A) \vert\]
where the eigenvalues are ordered so that $\vert \lambda _ 1 \vert \ge \cdots \ge \vert \lambda _ n \vert$.
@todo
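A quick numerical check of this fact (added as a sketch, not the proof): sort both sides in descending order and compare.
```python
import numpy as np

rng = np.random.default_rng(3)
B = rng.standard_normal((4, 4))
A = B + B.T                                # symmetric test matrix

s = np.linalg.svd(A, compute_uv=False)     # singular values, descending
lam = np.linalg.eigvalsh(A)                # eigenvalues, ascending (may be negative)
assert np.allclose(s, np.sort(np.abs(lam))[::-1])
```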
Suppose that $A$ is unitary. @State a result about the connection between its eigenvalues and its singular values.
@todo, also @todo proof
Suppose that $A$ is skew-symmetric. @State a result about the connection between its eigenvalues and its singular values.
@todo, also @todo proof
Suppose that $A$ is normal. @State a result about the connection between its eigenvalues and its singular values.
@todo, also @todo proof
Suppose that $A$ is triangular. @State a result about the connection between its eigenvalues and its singular values.
@todo, also @todo proof
Suppose $A \in \mathbb R^{m \times n}$ is a matrix. @Define the Jordan-Wielandt matrix.
The Jordan-Wielandt matrix associated with $A$ is
\[\begin{bmatrix}
0 & A \\
A^\top & 0
\end{bmatrix}\]
@State a result concerning its eigenvalues.
Assuming $m \ge n$, this matrix has eigenvalues $\pm \sigma _ i(A)$ for $i = 1, \ldots, n$, together with $m - n$ further eigenvalues equal to $0$, and it has eigenvector matrix
\[\begin{bmatrix} U & U & U _ \perp \\ V & -V & 0 \end{bmatrix}\]
where $U _ \perp \in \mathbb R^{m \times (m - n)}$ completes $U$ to an orthogonal matrix.
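A hedged numerical check of this eigenvalue claim (added here; assumes $m \ge n$):
```python
import numpy as np

rng = np.random.default_rng(4)
m, n = 5, 3
A = rng.standard_normal((m, n))
s = np.linalg.svd(A, compute_uv=False)

# Jordan-Wielandt matrix [[0, A], [A^T, 0]] is symmetric, size (m + n) x (m + n)
JW = np.block([[np.zeros((m, m)), A],
               [A.T, np.zeros((n, n))]])

eig = np.sort(np.linalg.eigvalsh(JW))
expected = np.sort(np.concatenate([s, -s, np.zeros(m - n)]))
assert np.allclose(eig, expected)
```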