NLA MT25, Krylov subspace methods


Flashcards

Suppose we are solving linear systems $Ax = b$ or eigenvalue problems $Ax = \lambda x$. What is the general idea of Krylov subspace methods?


Look for approximate solutions $\hat x \in \mathcal K _ {r}(A, v)$, where

\[\mathcal K _ r(A, v) = \text{span}(v, Av, \ldots, A^{r-1} v)\]

where $v \in \mathbb R^n$; equivalently, $\hat x = p _ {r-1}(A)v$, where $p _ {r-1} \in \mathbb C _ {r-1}[t]$ is a polynomial of degree at most $r-1$.
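The equivalence between "linear combinations of $v, Av, \ldots, A^{r-1}v$" and "polynomials in $A$ applied to $v$" can be checked directly. A minimal sketch in NumPy (the dimensions and random data are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 6, 4
A = rng.standard_normal((n, n))
v = rng.standard_normal(n)

# Build the Krylov vectors v, Av, ..., A^{r-1} v by repeated
# matrix-vector products (never forming A^j explicitly).
K = np.empty((n, r))
K[:, 0] = v
for j in range(1, r):
    K[:, j] = A @ K[:, j - 1]

# Any x_hat in K_r(A, v) is p(A) v for a polynomial p of degree < r:
c = rng.standard_normal(r)     # coefficients of p
x_hat = K @ c                  # c[0] v + c[1] Av + ... + c[r-1] A^{r-1} v
p_of_A = sum(ci * np.linalg.matrix_power(A, i) for i, ci in enumerate(c))
assert np.allclose(x_hat, p_of_A @ v)
```

Note that building $K$ costs $r-1$ matrix-vector products, which is why Krylov methods only ever need $A$ through the map $x \mapsto Ax$.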

@Define the order-$r$ Krylov subspace $\mathcal K _ r(A, v)$.


\[\mathcal K _ r(A, v) = \text{span}(v, Av, \ldots, A^{r-1} v)\]

Orthonormal bases of Krylov subspaces

This is worth considering because orthonormal bases of Krylov subspaces are very useful in iterative algorithms, such as the [[Notes - NLA MT25, GMRES algorithm]]. A better approach than applying a generic orthogonalisation method to $\{v, Av, \ldots, A^{r-1}v\}$ is given in [[Notes - NLA MT25, Arnoldi iteration]].

Orthonormal bases of the Krylov subspaces $\mathcal K _ r(A, v)$ turn out to be very useful in iterative algorithms. What’s a naïve way of computing an orthonormal basis of $\mathcal K _ r(A, v)$, and why is this naïve?


Form the matrix

\[K = \Bigg[ v \Bigg \vert Av \Bigg \vert \cdots \Bigg \vert A^{r-1} v \Bigg]\]

and orthogonalise, e.g. by computing the QR decomposition $K = QR$. This is naïve because the columns $A^j v$ increasingly align with the dominant eigendirection of $A$, so $K$ is typically very poorly conditioned and the computed factor $Q$ can be far from orthogonal in floating-point arithmetic.
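The ill-conditioning is easy to see numerically. A minimal sketch, using a hypothetical diagonal test matrix with eigenvalues in $[1, 2]$ chosen purely to make the effect visible:

```python
import numpy as np

n, r = 100, 25
# Illustrative test matrix: diagonal, eigenvalues spread over [1, 2].
A = np.diag(np.linspace(1.0, 2.0, n))
v = np.ones(n)

# Krylov matrix K = [ v | Av | ... | A^{r-1} v ]
K = np.empty((n, r))
K[:, 0] = v
for j in range(1, r):
    K[:, j] = A @ K[:, j - 1]

# The columns A^j v align ever more closely with the dominant
# eigendirection, so K is numerically close to rank-deficient:
print(np.linalg.cond(K))   # astronomically large
```

Even for this tame matrix, the condition number of $K$ grows roughly exponentially in $r$, which is what makes "form $K$, then do QR" a poor algorithm and motivates orthogonalising as you go, as in Arnoldi iteration.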



