Numerical Analysis HT24, LU decomposition


Flashcards

forward-substitution-definition

Suppose we have a system of equations \(Lx = b\) where $L$ is a lower-triangular matrix. What is the process of solving the system of equations just using substitutions called in this case?


Forward substitution.
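As a concrete sketch (NumPy, not from the notes): each $x _ i$ is computed from the already-known $x _ 1, \ldots, x _ {i-1}$, working down the rows.

```python
import numpy as np

def forward_substitution(L, b):
    """Solve Lx = b for lower-triangular L, one row at a time."""
    n = L.shape[0]
    x = np.zeros(n)
    for i in range(n):
        # x_i = (b_i - sum_{j<i} L_ij x_j) / L_ii
        x[i] = (b[i] - L[i, :i] @ x[:i]) / L[i, i]
    return x

L = np.array([[2.0, 0.0],
              [1.0, 3.0]])
b = np.array([4.0, 5.0])
print(forward_substitution(L, b))  # → [2. 1.]
```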

pivot-definition

What is meant by the “pivot” in Gaussian elimination?


The diagonal entry $a _ {jj}$ which is used to annihilate everything below it.

multiplier-definition

Suppose we have a matrix $A$ on which we are performing Gaussian elimination. We have chosen a pivot $a _ {jj}$ and are trying to annihilate row $i$. What is meant by the multiplier in this case?


\[l _ {ij} = \frac{a _ {ij}}{a _ {jj}}\]

partial-pivoting-idea

Gaussian elimination can fail when a pivot is $0$. What is the idea of partial pivoting when creating zeros in the $j$-th column?


Find \(\vert a _ {kj} \vert = \max( \vert a _ {jj} \vert , \vert a _ {j+1 j} \vert , \cdots, \vert a _ {nj} \vert )\) and swap rows $k$ and $j$.

partial-pivoting-every-step

When performing Gaussian elimination, partial pivoting is the idea of swapping the rows of the matrix in order to put the maximum possible pivot on the diagonal, i.e. \(\vert a _ {kj} \vert = \max( \vert a _ {jj} \vert , \vert a _ {j+1 j} \vert , \cdots, \vert a _ {nj} \vert )\) Do you do this when the current diagonal element is zero, or at every step?


At every step.

when-lu-fails

When does a matrix fail to have an $LU$ factorisation, and what factorisation does it have in this case?


If any of the pivots is $0$, we have to premultiply by a permutation matrix to ensure we don’t divide by zero, i.e. \(PA = LU\) But if all possible pivots at a given step are $0$ (which forces $A$ to be singular), the matrix might still fail to have an $LU$ factorisation even with row swaps.
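A hedged sketch of $PA = LU$ with partial pivoting (NumPy, 0-based indices; not from the notes): at each step the row with the largest available pivot is swapped up, and the stored multipliers in $L$ are swapped along with it.

```python
import numpy as np

def plu(A):
    """PA = LU with partial pivoting: swap up the row with the largest
    |entry| in the current column before eliminating below the pivot."""
    n = A.shape[0]
    U = A.astype(float).copy()
    L = np.eye(n)
    P = np.eye(n)
    for j in range(n - 1):
        k = j + np.argmax(np.abs(U[j:, j]))   # row with the best pivot
        if k != j:                            # swap rows j and k ...
            U[[j, k], :] = U[[k, j], :]
            P[[j, k], :] = P[[k, j], :]
            L[[j, k], :j] = L[[k, j], :j]     # ... including stored multipliers
        for i in range(j + 1, n):
            L[i, j] = U[i, j] / U[j, j]       # multipliers, |l_ij| <= 1
            U[i, j:] -= L[i, j] * U[j, j:]
    return P, L, U

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])   # has no plain LU factorisation
P, L, U = plu(A)
assert np.allclose(P @ A, L @ U)
```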

lu-factorisation-definition-na

What is the $LU$ factorisation of a square matrix $A$?


\(A = LU\) such that $L$ is lower triangular and $U$ is upper triangular.

solving-via-lu

How can you use $LU$ factorisation to solve a system of equations \(Ax = b\) ?


Factorise $A = LU$. Then solve $Ly = b$, and then $Ux = y$ to get a solution to the original equation.
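For illustration (NumPy sketch, with a hand-written $L$ and $U$ standing in for the output of a factorisation routine), the two triangular stages look like:

```python
import numpy as np

L = np.array([[1.0, 0.0],
              [0.5, 1.0]])
U = np.array([[2.0, 1.0],
              [0.0, 1.5]])
b = np.array([3.0, 4.0])

y = np.linalg.solve(L, b)   # stage 1: Ly = b (forward substitution)
x = np.linalg.solve(U, y)   # stage 2: Ux = y (back substitution)

assert np.allclose(L @ U @ x, b)   # x solves the original system Ax = b
```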

computing-lu-via-gaussian-elimination

How can you compute the $LU$ factorisation of a matrix?


Perform Gaussian elimination while keeping track of the row operations performed in a matrix.
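A minimal sketch of this (NumPy, no pivoting, so a zero pivot will fail; not from the notes): the multipliers $a _ {ij} / a _ {jj}$ used to eliminate each entry are stored directly as the entries of $L$.

```python
import numpy as np

def lu_no_pivoting(A):
    """LU by Gaussian elimination: eliminate below each pivot,
    recording the multiplier a_ij / a_jj as L[i, j]."""
    n = A.shape[0]
    U = A.astype(float).copy()
    L = np.eye(n)
    for j in range(n - 1):          # pivot column
        for i in range(j + 1, n):   # rows below the pivot
            L[i, j] = U[i, j] / U[j, j]       # the multiplier
            U[i, j:] -= L[i, j] * U[j, j:]    # row_i <- row_i - l_ij * row_j
    return L, U

A = np.array([[2.0, 1.0, 1.0],
              [1.0, 0.0, 2.0],
              [1.0, 1.0, -4.0]])
L, U = lu_no_pivoting(A)
assert np.allclose(L @ U, A)
```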

l-has-unit-diagonal

Suppose we have the $LU$ factorisation of a matrix $A$. $L$ is lower-triangular and $U$ is upper-triangular, but what additional property do we have for $L$?


It has $1$s on the diagonal.

row-operation-matrix

When performing Gaussian elimination, what matrix achieves the operation of \(\text{row }i \leftarrow \text{row } i + \lambda\cdot \text{row }j\) ?


\(M(i, j, \lambda) = I + \lambda E _ {ij}\) where $E _ {ij}$ is the matrix with a $1$ in position $(i, j)$ and $0$ elsewhere.
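A quick numerical check (NumPy, 0-based indices; not from the notes) that left-multiplying by $M(i, j, \lambda) = I + \lambda E _ {ij}$ performs the row operation:

```python
import numpy as np

def M(n, i, j, lam):
    """Elementary row-operation matrix I + lam * E_ij (0-based indices).
    Left-multiplying performs row_i <- row_i + lam * row_j."""
    E = np.zeros((n, n))
    E[i, j] = 1.0
    return np.eye(n) + lam * E

A = np.array([[2.0, 1.0],
              [4.0, 3.0]])
B = M(2, 1, 0, -2.0) @ A     # row 2 <- row 2 - 2 * row 1
print(B)  # second row becomes [0., 1.]
```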

lu-sign-care-example

Suppose we had the matrix \(\left(\begin{matrix} 2 & 1 & 1 \\\\ 1 & 0 & 2 \\\\ 1 & 1 & -4 \end{matrix}\right)\) When calculating the $LU$ factorisation, what does the first bit of the $L$ matrix look like, which effectively clears the $1$s from the first column, and why do you have to be careful about the signs here?


\[\left(\begin{matrix} 1 & 0 & 0 \\\\ \frac{1}{2} & 1 & 0 \\\\ \frac{1}{2} & 0 & 1 \end{matrix}\right)\]

Note that it’s $+\frac 1 2$, not $-\frac 1 2$, even though we’re subtracting and not adding. This is because the $LU$ factorisation is “creating” the matrix, not “destroying” it.
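One can check the sign convention numerically (NumPy sketch): the elimination matrix $M _ 1$ carries $-\frac 1 2$ entries, while its inverse, which is the block of $L$, carries $+\frac 1 2$:

```python
import numpy as np

A = np.array([[2.0, 1.0, 1.0],
              [1.0, 0.0, 2.0],
              [1.0, 1.0, -4.0]])

# First elimination step: subtract (1/2) * row 1 from rows 2 and 3.
M1 = np.array([[1.0, 0.0, 0.0],
               [-0.5, 1.0, 0.0],
               [-0.5, 0.0, 1.0]])

# The L factor stores the *negated* multipliers: L1 = M1^{-1}.
L1 = np.array([[1.0, 0.0, 0.0],
               [0.5, 1.0, 0.0],
               [0.5, 0.0, 1.0]])

assert np.allclose(L1 @ (M1 @ A), A)   # L1 "creates" what M1 "destroys"
```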

What is the flop cost of forward/backward substitution on an $n \times n$ lower/upper triangular system (including the dominant term)?::

\[n^2 + O(n)\]

Quick justification: the first row requires $1$ flop, the second row requires $3$ flops, and in general the $i$-th row requires $2i-1$ flops. Summing up:

\[\sum^n_{i = 1} (2i-1) = \left( 2\sum^n_{i = 1} i \right) - n = n^2 + O(n)\]

gaussian-elimination-flop-cost

What is the flop cost of Gaussian elimination on an $n \times n$ system (including the dominant term)?


\[\frac 2 3 n^3 + O(n^2)\]

Quick justification: Gaussian elimination successively introduces $0$s into the 1st column, then the 2nd, and so on up to the $(n-1)$-th column (no zeros need to be introduced in the $n$-th column).

Introducing $0$s into the $j$-th column requires $n-j$ row operations, and each row operation \(\text{row }i \leftarrow \text{row }i - \frac{a _ {ij} }{a _ {jj} } \text{row }j\) calculates for each column $k = j+1, j+2, \cdots, n$: \(a _ {ik} \leftarrow a _ {ik} - \frac{a _ {ij} }{a _ {jj} } a _ {jk}\) which is approximately $2(n-j)$ flops (the division is computed only once per row). So the cost of GE is approximately \(\sum^{n-1} _ {j = 1} 2(n-j)^2 = \frac{2n(n-1)(2n-1)}{6} = \frac 2 3 n^3 + O(n^2)\)
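A quick sanity check of this count (plain Python sketch, counting one division per multiplier and two flops per updated entry):

```python
def ge_flops(n):
    """Count flops in Gaussian elimination on an n x n system:
    for each pivot column j there are n - j rows to eliminate, each
    costing 1 division (the multiplier) + 2(n - j) flops for the update."""
    flops = 0
    for j in range(1, n):   # pivot columns 1 .. n-1
        flops += (n - j) * (1 + 2 * (n - j))
    return flops

n = 100
print(ge_flops(n) / (2 * n**3 / 3))  # close to 1 for large n
```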

lu-uniqueness-proof-na

@Prove that, if it exists, the LU decomposition of an invertible matrix $A \in \mathbb R^{n \times n}$ is unique (under the assumption that the diagonal entries of $L$ are $1$).


Suppose $A = L _ 1 U _ 1$ and $A = L _ 2 U _ 2$. Then \(L _ 2^{-1} L _ 1 = U _ 2 U _ 1^{-1}\) The left-hand side is lower triangular with $1$s on the diagonal, while the right-hand side is upper triangular. Since they are equal, the common matrix is simultaneously upper and lower triangular with unit diagonal, hence the identity. So $L _ 1 = L _ 2$ and $U _ 1 = U _ 2$.

lu-cannot-fail-nonsingular-idea

What’s the idea behind proving the LU factorisation cannot fail if $A$ is nonsingular?


Consider the determinant of $A$, and relate it to the submatrix under consideration. The fact $\det A \ne 0$ implies that there exists some pivot that is nonzero.

Bite-sized

Cost of LU on a square $n \times n$ matrix: $\tfrac{2}{3} n^3$ flops. Forward / back substitution on the resulting triangular systems costs $O(n^2)$ each.

Source: Existing card ^gaussian-elimination-flop-cost in this entry; NLA MT25 §5.1 of the lecture notes.

@bite~

bite-na-pivoting-existence-and-stability

What does partial pivoting buy us — existence, stability, or both?


Both. Without pivoting, LU may simply fail to exist (e.g. $\begin{bmatrix} 0 & 1 \\\\ 1 & 0 \end{bmatrix}$ has no LU). With partial pivoting we get $PA = LU$ for any nonsingular $A$. Pivoting also keeps the multipliers $ \vert l _ {ij} \vert \le 1$, dramatically improving stability, though the worst-case growth factor $\max _ {ij} \vert u _ {ij} \vert / \max _ {ij} \vert a _ {ij} \vert$ can still be exponential.

Source: NLA MT25 §5.2 and §7.5.2 of the lecture notes.

@bite~

$PA = LU$ exists for any nonsingular $A$ — partial pivoting is sufficient to handle the zero-pivot obstruction.

Source: Existing card ^lu-cannot-fail-nonsingular-idea in this entry; NLA MT25 §5.2 of the lecture notes.

@bite~



