Notes - Numerical Analysis HT24, LU factorisation and Gaussian elimination


Flashcards

Suppose we have a system of equations

\[Lx = b\]

where $L$ is a lower-triangular matrix. What is the process of solving the system of equations just using substitutions called in this case?


Forward substitution.
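
As a quick illustration (a minimal sketch, not part of the notes), forward substitution solves the rows in order, each one using only the entries of $x$ already computed:

```python
import numpy as np

def forward_substitution(L, b):
    """Solve Lx = b for lower-triangular L by forward substitution."""
    n = len(b)
    x = np.zeros(n)
    for i in range(n):
        # Row i reads L[i, :i] @ x[:i] + L[i, i] * x[i] = b[i]; everything
        # except x[i] is already known, so one division finishes the row.
        x[i] = (b[i] - L[i, :i] @ x[:i]) / L[i, i]
    return x

L = np.array([[2.0, 0.0], [1.0, 3.0]])
b = np.array([4.0, 5.0])
print(forward_substitution(L, b))  # [2. 1.]
```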

What is meant by the “pivot” in Gaussian elimination?


The diagonal entry $a _ {jj}$ which is used to annihilate everything below it.

Suppose we have a matrix $A$ on which we are performing Gaussian elimination. We have chosen a pivot $a _ {jj}$ and are trying to annihilate row $i$. What is meant by the multiplier in this case?


\[l_{ij} = \frac{a_{ij} }{a_{jj} }\]

Gaussian elimination can fail when a pivot is $0$. What is the idea of partial pivoting when creating zeros in the $j$-th column?


Find the row $k \ge j$ such that

\[|a_{kj}| = \max(|a_{jj}|, |a_{j+1,j}|, \ldots, |a_{nj}|)\]

and swap rows $k$ and $j$.

When performing Gaussian elimination, partial pivoting is the idea of swapping rows in order to put the candidate pivot of largest absolute value on the diagonal, i.e. choosing the row $k$ with

\[|a_{kj}| = \max(|a_{jj}|, |a_{j+1,j}|, \ldots, |a_{nj}|)\]

Do you do this when the current diagonal element is zero, or at every step?


At every step.

When does a matrix fail to have an $LU$ factorisation, and what factorisation does it have in this case?


If Gaussian elimination meets a pivot that is $0$ (with nonzero entries still below it), $A$ need not have an $LU$ factorisation. In this case we premultiply by a permutation matrix that reorders the rows so that we never divide by zero, i.e.

\[PA = LU\]

But if every candidate pivot at some step is $0$, the matrix is singular and even row swaps cannot produce a nonzero pivot, so elimination with pivoting can still break down.

What is the $LU$ factorisation of a square matrix $A$?


\[A = LU\]

such that $L$ is lower triangular and $U$ is upper triangular.

How can you use $LU$ factorisation to solve a system of equations

\[Ax = b\]

?


Factorise $A = LU$. Then solve $Ly = b$, and then $Ux = y$ to get a solution to the original equation.
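
As a sketch of this in practice (assuming scipy is available; the matrix here is just an example), `scipy.linalg.lu_factor` does the factorisation once and `lu_solve` performs the two triangular solves:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

A = np.array([[2.0, 1.0, 1.0],
              [1.0, 0.0, 2.0],
              [1.0, 1.0, -4.0]])
b = np.array([1.0, 2.0, 3.0])

# Factorise A once (L and U are stored packed together, plus pivot data) ...
lu, piv = lu_factor(A)
# ... then each right-hand side costs only a forward and a backward solve.
x = lu_solve((lu, piv), b)
print(np.allclose(A @ x, b))  # True
```

The point of the split is that the expensive $O(n^3)$ factorisation is done once, after which each new right-hand side costs only $O(n^2)$.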

How can you compute the $LU$ factorisation of a matrix?


Perform Gaussian elimination while keeping track of the row operations performed in a matrix.
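
A minimal sketch of that bookkeeping (no pivoting, so it assumes every pivot it meets is nonzero):

```python
import numpy as np

def lu_no_pivoting(A):
    """LU factorisation via Gaussian elimination, recording multipliers in L.
    Assumes every pivot U[j, j] encountered is nonzero."""
    n = A.shape[0]
    U = A.astype(float).copy()
    L = np.eye(n)
    for j in range(n - 1):              # pivot column
        for i in range(j + 1, n):       # rows below the pivot
            L[i, j] = U[i, j] / U[j, j]       # the multiplier l_ij
            U[i, j:] -= L[i, j] * U[j, j:]    # row i <- row i - l_ij * row j
    return L, U

A = np.array([[2.0, 1.0, 1.0], [1.0, 0.0, 2.0], [1.0, 1.0, -4.0]])
L, U = lu_no_pivoting(A)
print(np.allclose(L @ U, A))  # True
```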

Suppose we have the $LU$ factorisation of a matrix $A$. $L$ is lower-triangular and $U$ is upper-triangular, but what additional property do we have for $L$?


It has $1$s on the diagonal.

When performing Gaussian elimination, what matrix achieves the operation of

\[\text{row }i \leftarrow \text{row } i + \lambda\cdot \text{row }j\]

?


\[M(i, j, \lambda) = I + \lambda E_{ij}\]

where $E _ {ij}$ is the matrix with a $1$ in position $(i, j)$ and $0$ elsewhere.
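
A two-line numerical check of this (the size and $\lambda$ are arbitrary, chosen only for illustration):

```python
import numpy as np

n, i, j, lam = 3, 2, 0, 5.0          # add 5 times row 0 to row 2
E = np.zeros((n, n))
E[i, j] = 1.0
M = np.eye(n) + lam * E              # M(i, j, lambda) = I + lambda * E_ij

A = np.arange(9.0).reshape(n, n)
print(np.allclose((M @ A)[i], A[i] + lam * A[j]))  # True: row i updated
print(np.allclose(np.delete(M @ A, i, axis=0),
                  np.delete(A, i, axis=0)))        # True: other rows untouched
```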

Suppose we had the matrix

\[\left(\begin{matrix} 2 & 1 & 1 \\\\ 1 & 0 & 2 \\\\ 1 & 1 & -4 \end{matrix}\right)\]

When calculating the $LU$ factorisation, what does the first bit of the $L$ matrix look like, which effectively clears the $1$s from the first column, and why do you have to be careful about the signs here?


\[\left(\begin{matrix} 1 & 0 & 0 \\\\ \frac{1}{2} & 1 & 0 \\\\ \frac{1}{2} & 0 & 1 \end{matrix}\right)\]

Note that it’s $+\frac 1 2$, not $-\frac 1 2$, even though the elimination step subtracts $\frac 1 2 \cdot$ row $1$. This is because $L$ records how to “create” $A$ back from $U$ (it undoes the elimination steps), rather than the elimination steps themselves.
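
Completing the elimination by hand (the second step has pivot $-\frac 1 2$ and multiplier $-1$; these numbers are my working, not given above), we can check that $LU$ really does recreate $A$:

```python
import numpy as np

A = np.array([[2.0, 1.0, 1.0],
              [1.0, 0.0, 2.0],
              [1.0, 1.0, -4.0]])

# The first column of L holds the multipliers +1/2 from the step above;
# the second elimination step contributes the multiplier -1.
L = np.array([[1.0, 0.0, 0.0],
              [0.5, 1.0, 0.0],
              [0.5, -1.0, 1.0]])
U = np.array([[2.0, 1.0, 1.0],
              [0.0, -0.5, 1.5],
              [0.0, 0.0, -3.0]])

print(np.allclose(L @ U, A))  # True: the positive multipliers rebuild A
```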

What is the flop cost of forward/backward substitution on an $n \times n$ lower/upper triangular system (including the dominant term)?


\[n^2 + O(n)\]

Quick justification: the first row requires $1$ flop, the second row requires $3$ flops, and in general the $i$-th row requires $2i-1$ flops ($i-1$ multiplications, $i-1$ subtractions and $1$ division). Summing up:

\[\sum^n_{i = 1} (2i-1) = \left( 2\sum^n_{i = 1} i \right) - n = n(n+1) - n = n^2\]
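
A one-line check that the sum really is exactly $n^2$:

```python
n = 100
flops = sum(2 * i - 1 for i in range(1, n + 1))  # 2i - 1 flops for row i
print(flops == n**2)  # True
```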

What is the flop cost of Gaussian elimination on an $n \times n$ system (including the dominant term)?


\[\frac 2 3 n^3 + O(n^2)\]

Quick justification: Gaussian elimination successively introduces $0$s into the 1st column, then the 2nd, and so on up to the $(n-1)$-th column (the $n$-th column has no entries below the diagonal left to clear).

Introducing $0$s into the $j$-th column requires $n-j$ row operations, and each row operation

\[\text{row }i \leftarrow \text{row }i - \frac{a_{ij} }{a_{jj} } \text{row }j\]

calculates for each column $k = j+1, j+2, \cdots, n$:

\[a_{ik} \leftarrow a_{ik} - \frac{a_{ij} }{a_{jj} } a_{jk}\]

which is approximately $2(n-j)$ flops (we only need to calculate the division once). So the cost of GE is approximately

\[\sum^{n-1}_{j = 1} 2(n-j)^2 = \frac{n(n-1)(2n-1)}{3} = \frac 2 3 n^3 + O(n^2)\]
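
A quick sanity check of the closed form and the $\frac 2 3 n^3$ growth rate:

```python
for n in [5, 50, 500]:
    total = sum(2 * (n - j) ** 2 for j in range(1, n))
    closed = n * (n - 1) * (2 * n - 1) // 3   # exact closed form
    print(total == closed, total / (2 * n**3 / 3))  # True, ratio -> 1
```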

Quickly prove that, if it exists, the LU decomposition of an invertible matrix $A \in \mathbb R^{n \times n}$ is unique (under the assumption that the diagonal entries of $L$ are $1$).


Suppose $A = L _ 1 U _ 1$ and $A = L _ 2 U _ 2$. Since $A$ is invertible, all four factors are invertible, and

\[L_2^{-1} L_1 = U_2 U_1^{-1}\]

The left-hand side is lower triangular with $1$s on the diagonal, while the right-hand side is upper triangular. Being equal, this matrix is simultaneously lower and upper triangular, hence diagonal, and its diagonal entries are all $1$. So both sides equal the identity, giving $L_1 = L_2$ and $U_1 = U_2$.

What’s the idea behind proving the LU factorisation cannot fail if $A$ is nonsingular?


Consider the determinant of $A$, and relate it to the submatrix under consideration. The fact that $\det A \ne 0$ implies that at each step there is some candidate pivot that is nonzero.



