Notes - Numerical Analysis HT24, Multi-step methods


Flashcards

Suppose we have the IVP:

  • an interval $I = [x _ 0, X] \subset \mathbb R$
  • an open subset $D \subset \mathbb R^d$
  • $\pmb y'(x) = \pmb f(x, \pmb y)$
  • $\pmb y(x _ 0) = \pmb y _ 0$

We can split up the domain via $N, k \in \mathbb N$ and $h := (X - x _ 0)/N$ so that $x _ n = x _ 0 + hn$. In this context, what is a linear $k$-step method for approximating $\pmb y(x)$?


Computing the approximation $\pmb y _ {n + k}$ via the equation

\[\sum^k_{j = 0} \alpha_j \pmb y_{n + j} = h \sum^k_{j = 0} \beta_j \pmb f(x_{n + j}, \pmb y_{n + j})\]

where $\alpha _ j$ and $\beta _ j$ are real coefficients with $\alpha _ k \ne 0$; the method is explicit if $\beta _ k = 0$ and implicit otherwise. A minimal sketch of the explicit case is given below.
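As a minimal sketch of the explicit case $\beta _ k = 0$ in Python (the normalisation $\alpha _ k = 1$, the two-step Adams–Bashforth coefficients, and the test problem $y' = -y$ are illustrative assumptions, not from the notes):

```python
import numpy as np

def linear_multistep_explicit(alpha, beta, f, x0, start, h, N):
    """Run an explicit linear k-step method (assumes beta[k] == 0, alpha[k] == 1).

    alpha, beta -- length k+1 coefficient lists
    start       -- the k starting values y_0, ..., y_{k-1}
    """
    k = len(alpha) - 1
    ys = list(start)
    for n in range(N - k + 1):
        # alpha[k] y_{n+k} = -sum_{j<k} alpha_j y_{n+j} + h sum_{j<k} beta_j f(x_{n+j}, y_{n+j})
        xs = [x0 + (n + j) * h for j in range(k)]
        rhs = h * sum(beta[j] * f(xs[j], ys[n + j]) for j in range(k))
        lhs = sum(alpha[j] * ys[n + j] for j in range(k))
        ys.append(rhs - lhs)
    return np.array(ys)

# Two-step Adams-Bashforth: y_{n+2} - y_{n+1} = h(3/2 f_{n+1} - 1/2 f_n)
alpha, beta = [0.0, -1.0, 1.0], [-0.5, 1.5, 0.0]
h, N = 0.1, 50
ys = linear_multistep_explicit(alpha, beta, lambda x, y: -y, 0.0, [1.0, np.exp(-h)], h, N)
print(abs(ys[-1] - np.exp(-N * h)))   # small global error on y' = -y, y(0) = 1
```

If $\beta _ k \ne 0$ the method is implicit and each step requires solving an equation for $\pmb y _ {n + k}$; the Simpson method further down is an example.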

Suppose we have the IVP:

  • an interval $I = [x _ 0, X] \subset \mathbb R$
  • an open subset $D \subset \mathbb R^d$
  • $\pmb y'(x) = \pmb f(x, \pmb y)$
  • $\pmb y(x _ 0) = \pmb y _ 0$

A linear $k$-step method comes from splitting up the domain via $N, k \in \mathbb N$ and $h := (X - x _ 0)/N$ so that $x _ n = x _ 0 + hn$, and then approximating $\pmb y _ {n + k}$ via

\[\sum^k_{j = 0} \alpha_j \pmb y_{n + j} = h \sum^k_{j = 0} \beta_j \pmb f(x_{n + j}, \pmb y_{n + j})\]

In this context, how are the first and second characteristic polynomials $\rho$ and $\sigma$ defined?


\[\begin{aligned} \rho(z) &:= \sum^k_{j = 0} \alpha_j z^j \\\\ \sigma(z) &:= \sum^k_{j = 0} \beta_j z^j \end{aligned}\]
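As a small sketch in code, `numpy.polynomial.Polynomial` stores coefficients in increasing degree, which matches $\sum_j \alpha_j z^j$ directly (the two-step Adams–Bashforth coefficients are the same illustrative assumption as above):

```python
from numpy.polynomial import Polynomial

# Two-step Adams-Bashforth (illustrative): alpha = (0, -1, 1), beta = (-1/2, 3/2, 0)
rho = Polynomial([0.0, -1.0, 1.0])     # rho(z) = z^2 - z
sigma = Polynomial([-0.5, 1.5, 0.0])   # sigma(z) = (3z - 1)/2

print(rho.roots(), sigma(1.0))         # roots [0., 1.]; sigma(1) = 1
```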

Suppose we have the IVP:

  • an interval $I = [x _ 0, X] \subset \mathbb R$
  • an open subset $D \subset \mathbb R^d$
  • $\pmb y'(x) = \pmb f(x, \pmb y)$
  • $\pmb y(x _ 0) = \pmb y _ 0$

A linear $k$-step method comes from splitting up the domain via $N, k \in \mathbb N$ and $h := (X - x _ 0)/N$ so that $x _ n = x _ 0 + hn$, and then approximating $\pmb y _ {n + k}$ via

\[\sum^k_{j = 0} \alpha_j \pmb y_{n + j} = h \sum^k_{j = 0} \beta_j \pmb f(x_{n + j}, \pmb y_{n + j})\]

By considering Simpson’s rule

\[\int^{x_ 2}_ {x_ 0} f(x) \text d x \approx \frac{x_2 - x_0}{6} \left(f(x_0) + 4f\left(\frac{x_0 + x_2}{2}\right) + f(x_2)\right)\]

can you derive a $2$-step method and give its characteristic polynomials?


\[\begin{aligned} \pmb y(x_ {n+1}) &= \pmb y(x_ {n - 1}) + \int^{x_ {n+1} }_ {x_ {n-1} } \pmb f(x, \pmb y(x)) \text d x \\\\ &\approx \pmb y(x_ {n-1}) + \frac{2h}{6}\big( \pmb f(x_ {n-1}, \pmb y(x_ {n-1})) + 4\pmb f(x_ n, \pmb y(x_ n)) + \pmb f(x_ {n+1}, \pmb y(x_ {n+1})) \big) \end{aligned}\]

Shifting the index by one and replacing exact values with approximations $\pmb y _ j \approx \pmb y(x _ j)$, this motivates

\[\pmb y_{n+2} - \pmb y_{n} = h \left( \frac 1 3 \pmb f(x_{n}, \pmb y_{n}) + \frac 4 3 \pmb f(x_{n+1}, \pmb y_{n+1}) + \frac 1 3 \pmb f(x_{n+2}, \pmb y_{n+2}) \right)\]

This has characteristic polynomials

\[\begin{aligned} \rho(z) &= z^2 - 1 \\\\ \sigma(z) &= \frac 1 3 (z^2 + 4z + 1) \end{aligned}\]
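A sketch of this method in code; since $\beta _ 2 = \frac 1 3 \ne 0$ it is implicit, and the sketch resolves each step by fixed-point iteration (the iteration count, the exact-solution starting value, and the test problem $y' = -y$ are illustrative assumptions):

```python
import numpy as np

def simpson_two_step(f, x0, y0, y1, h, N, iters=20):
    """2-step Simpson method; the implicit stage y_{n+2} = known + (h/3) f(x_{n+2}, y_{n+2})
    is resolved by fixed-point iteration (valid for small h; Newton would also work)."""
    ys = [y0, y1]
    for n in range(N - 1):
        xn, xn1, xn2 = (x0 + (n + j) * h for j in range(3))
        known = ys[n] + h * (f(xn, ys[n]) + 4 * f(xn1, ys[n + 1])) / 3
        y_next = ys[n + 1]                    # initial guess for y_{n+2}
        for _ in range(iters):
            y_next = known + h * f(xn2, y_next) / 3
        ys.append(y_next)
    return np.array(ys)

# y' = -y, y(0) = 1, with the second starting value taken from the exact solution
h, N = 0.1, 50
ys = simpson_two_step(lambda x, y: -y, 0.0, 1.0, np.exp(-h), h, N)
print(abs(ys[-1] - np.exp(-N * h)))   # small: the method is fourth-order accurate
```

Note both roots of $\rho(z) = z^2 - 1$ lie on the unit circle, so the method is only weakly stable; this resurfaces in the stability-domain discussion below.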

Suppose we have the IVP:

  • an interval $I = [x _ 0, X] \subset \mathbb R$
  • an open subset $D \subset \mathbb R^d$
  • $\pmb y'(x) = \pmb f(x, \pmb y)$
  • $\pmb y(x _ 0) = \pmb y _ 0$

A linear $k$-step method comes from splitting up the domain via $N, k \in \mathbb N$ and $h := (X - x _ 0)/N$ so that $x _ n = x _ 0 + hn$, and then approximating $\pmb y _ {n + k}$ via

\[\sum^k_{j = 0} \alpha_j \pmb y_{n + j} = h \sum^k_{j = 0} \beta_j \pmb f(x_{n + j}, \pmb y_{n + j})\]

The first and second characteristic polynomials $\rho$ and $\sigma$ are then defined

\[\begin{aligned} \rho(z) &:= \sum^k_{j = 0} \alpha_j z^j \\\\ \sigma(z) &:= \sum^k_{j = 0} \beta_j z^j \end{aligned}\]

What is the root condition, and can you briefly quote a theorem that demonstrates why this is useful?


The root condition is that

All roots of $\rho(z)$ lie in the closed unit disc, and any root on the unit circle is simple

We have a theorem (Dahlquist’s equivalence theorem) that says

\[\text{consistency} + \text{root condition} \iff \text{convergence}\]
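A quick numerical check of the root condition (a sketch: the tolerance and the use of `np.roots` in floating point are pragmatic assumptions):

```python
import numpy as np

def satisfies_root_condition(alpha, tol=1e-7):
    """All roots of rho(z) = sum_j alpha[j] z^j in the closed unit disc,
    and any root on the unit circle simple."""
    roots = np.roots(alpha[::-1])              # np.roots wants highest degree first
    for i, r in enumerate(roots):
        if abs(r) > 1 + tol:
            return False                       # root outside the closed unit disc
        if abs(abs(r) - 1) <= tol:             # root on the unit circle: must be simple
            if np.any(np.abs(np.delete(roots, i) - r) <= tol):
                return False
    return True

print(satisfies_root_condition([-1.0, 0.0, 1.0]))   # rho(z) = z^2 - 1 -> True
print(satisfies_root_condition([1.0, -2.0, 1.0]))   # rho(z) = (z - 1)^2 -> False
```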

Suppose we have the IVP:

  • an interval $I = [x _ 0, X] \subset \mathbb R$
  • an open subset $D \subset \mathbb R^d$
  • $\pmb y'(x) = \pmb f(x, \pmb y)$
  • $\pmb y(x _ 0) = \pmb y _ 0$

A linear $k$-step method comes from splitting up the domain via $N, k \in \mathbb N$ and $h := (X - x _ 0)/N$ so that $x _ n = x _ 0 + hn$, and then approximating $\pmb y _ {n + k}$ via

\[\sum^k_{j = 0} \alpha_j \pmb y_{n + j} = h \sum^k_{j = 0} \beta_j \pmb f(x_{n + j}, \pmb y_{n + j})\]

How is the consistency error $\pmb \tau(h)$ of such a method defined?


\[\pmb \tau(h) = \frac{\sum^k_{j = 0} \alpha_j \pmb y(x_{n + j}) - h \sum^k_{j = 0} \beta_j \pmb y'(x_{n + j})}{h \sum^k_{j = 0}\beta_j}\]

(we require $\sigma(1) = \sum^k _ {j = 0} \beta _ j \ne 0$).
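To see the definition and its order in action, here is a sketch evaluating $\pmb \tau(h)$ at $n = 0$ for the Simpson $2$-step method (the test solution $y(x) = e^x$, for which $\pmb y' = \pmb y$, is an illustrative assumption):

```python
import numpy as np

alpha = [-1.0, 0.0, 1.0]     # Simpson 2-step method from the earlier card
beta = [1/3, 4/3, 1/3]
y = np.exp                   # test solution y(x) = e^x, so y'(x) = y(x)

def tau(h, x0=0.0):
    xs = [x0 + j * h for j in range(3)]
    num = sum(a * y(x) for a, x in zip(alpha, xs)) - h * sum(b * y(x) for b, x in zip(beta, xs))
    return num / (h * sum(beta))

for h in [0.1, 0.05, 0.025]:
    print(h, tau(h))         # shrinks ~16x per halving of h, i.e. tau(h) = O(h^4)
```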

Suppose we have the IVP:

  • an interval $I = [x _ 0, X] \subset \mathbb R$
  • an open subset $D \subset \mathbb R^d$
  • $\pmb y'(x) = \pmb f(x, \pmb y)$
  • $\pmb y(x _ 0) = \pmb y _ 0$

A linear $k$-step method comes from splitting up the domain via $N, k \in \mathbb N$ and $h := (X - x _ 0)/N$ so that $x _ n = x _ 0 + hn$, and then approximating $\pmb y _ {n + k}$ via

\[\sum^k_{j = 0} \alpha_j \pmb y_{n + j} = h \sum^k_{j = 0} \beta_j \pmb f(x_{n + j}, \pmb y_{n + j})\]

The consistency error $\pmb \tau(h)$ of such a method is defined

\[\pmb \tau(h) = \frac{\sum^k_{j = 0} \alpha_j \pmb y(x_{n + j}) - h \sum^k_{j = 0} \beta_j \pmb y'(x_{n + j})}{h \sum^k_{j = 0}\beta_j}\]

(we require $\sigma(1) \ne 0$). Can you give necessary and sufficient conditions for such a method to be consistent (i.e. we have at least $\pmb \tau(h) = O(h)$)?


\[\rho(1) = 0 \quad \text{and} \quad \rho'(1) = \sigma(1) \ne 0\]
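These conditions translate directly into code; a sketch (backward Euler, with $\alpha = (-1, 1)$ and $\beta = (0, 1)$, is a second illustrative example):

```python
from numpy.polynomial import Polynomial

def is_consistent(alpha, beta, tol=1e-12):
    """Check rho(1) = 0 and rho'(1) = sigma(1) != 0."""
    rho, sigma = Polynomial(alpha), Polynomial(beta)
    return (abs(rho(1.0)) <= tol
            and abs(rho.deriv()(1.0) - sigma(1.0)) <= tol
            and abs(sigma(1.0)) > tol)

print(is_consistent([-1.0, 0.0, 1.0], [1/3, 4/3, 1/3]))   # Simpson 2-step -> True
print(is_consistent([-1.0, 1.0], [0.0, 1.0]))             # backward Euler -> True
```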

Suppose we have the IVP:

  • an interval $I = [x _ 0, X] \subset \mathbb R$
  • an open subset $D \subset \mathbb R^d$
  • $\pmb y'(x) = \pmb f(x, \pmb y)$
  • $\pmb y(x _ 0) = \pmb y _ 0$

A linear $k$-step method comes from splitting up the domain via $N, k \in \mathbb N$ and $h := (X - x _ 0)/N$ so that $x _ n = x _ 0 + hn$, and then approximating $\pmb y _ {n + k}$ via

\[\sum^k_{j = 0} \alpha_j \pmb y_{n + j} = h \sum^k_{j = 0} \beta_j \pmb f(x_{n + j}, \pmb y_{n + j})\]

The consistency error $\pmb \tau(h)$ of such a method is defined

\[\pmb \tau(h) = \frac{\sum^k_{j = 0} \alpha_j \pmb y(x_{n + j}) - h \sum^k_{j = 0} \beta_j \pmb y'(x_{n + j})}{h \sum^k_{j = 0}\beta_j}\]

In this context, can you informally state Dahlquist’s equivalence theorem?


\[\text{consistency} + \text{root condition} \iff \text{convergence}\]

Suppose we have the IVP:

  • an interval $I = [x _ 0, X] \subset \mathbb R$
  • an open subset $D \subset \mathbb R^d$
  • $\pmb y'(x) = \pmb f(x, \pmb y)$
  • $\pmb y(x _ 0) = \pmb y _ 0$

A linear $k$-step method comes from splitting up the domain via $N, k \in \mathbb N$ and $h := (X - x _ 0)/N$ so that $x _ n = x _ 0 + hn$, and then approximating $\pmb y _ {n + k}$ via

\[\sum^k_{j = 0} \alpha_j \pmb y_{n + j} = h \sum^k_{j = 0} \beta_j \pmb f(x_{n + j}, \pmb y_{n + j})\]

In this context, how is the stability polynomial defined? (Recall that this depends on $z = h\lambda$, the “test value” from the Dahlquist test equation $y' = \lambda y$.)


\[\pi(x) = \pi(x; z) := \sum^k_{j = 0} (\alpha_j - z \beta_j) x^j = \rho(x) - z\sigma(x)\]
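A sketch constructing $\pi(\,\cdot\,; z)$ with `numpy` (the Simpson $2$-step coefficients and the sample value $z = -0.1$ are illustrative assumptions):

```python
from numpy.polynomial import Polynomial

def stability_polynomial(alpha, beta, z):
    """pi(x; z) = rho(x) - z * sigma(x) as a Polynomial in x."""
    return Polynomial(alpha) - z * Polynomial(beta)

# Simpson 2-step at z = -0.1
pi = stability_polynomial([-1.0, 0.0, 1.0], [1/3, 4/3, 1/3], -0.1)
print(pi.roots())   # roughly -1.034 and 0.905: one root outside the unit disc
```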

Suppose we have the IVP:

  • an interval $I = [x _ 0, X] \subset \mathbb R$
  • an open subset $D \subset \mathbb R^d$
  • $\pmb y'(x) = \pmb f(x, \pmb y)$
  • $\pmb y(x _ 0) = \pmb y _ 0$

A linear $k$-step method comes from splitting up the domain via $N, k \in \mathbb N$ and $h := (X - x _ 0)/N$ so that $x _ n = x _ 0 + hn$, and then approximating $\pmb y _ {n + k}$ via

\[\sum^k_{j = 0} \alpha_j \pmb y_{n + j} = h \sum^k_{j = 0} \beta_j \pmb f(x_{n + j}, \pmb y_{n + j})\]

The stability polynomial is then defined

\[\pi(x) = \pi(x; z) := \sum^k_{j = 0} (\alpha_j - z \beta_j) x^j = \rho(x) - z\sigma(x)\]

(this depends on $z = h\lambda$, the “test value” from the Dahlquist test equation). How is the stability domain then defined?


\[S := \\{z \in \mathbb C \mid \pi(x; z) = 0 \implies |x| \le 1, \text{ and if }x\text{ is not a simple root, then }|x| < 1\\}\]
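A sketch of a membership test for $S$, again using the Simpson $2$-step method (the floating-point tolerance is a pragmatic assumption):

```python
import numpy as np

alpha = np.array([-1.0, 0.0, 1.0])    # Simpson 2-step method once more
beta = np.array([1/3, 4/3, 1/3])

def in_stability_domain(z, tol=1e-7):
    roots = np.roots((alpha - z * beta)[::-1])   # highest degree first for np.roots
    for i, r in enumerate(roots):
        if abs(r) > 1 + tol:
            return False                          # root outside the closed unit disc
        if abs(abs(r) - 1) <= tol:                # on the circle: must be simple
            if np.any(np.abs(np.delete(roots, i) - r) <= tol):
                return False
    return True

print(in_stability_domain(0.0))    # True: pi(x; 0) = rho(x) has simple roots +/-1
print(in_stability_domain(-0.1))   # False: a parasitic root leaves the unit disc
```

For this method it turns out that $S$ contains no $z$ with $\operatorname{Re}(z) < 0$ at all (it is essentially a segment of the imaginary axis), which is the price of the weak stability noted earlier.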

Suppose we have the IVP:

  • an interval $I = [x _ 0, X] \subset \mathbb R$
  • an open subset $D \subset \mathbb R^d$
  • $\pmb y'(x) = \pmb f(x, \pmb y)$
  • $\pmb y(x _ 0) = \pmb y _ 0$

A linear $k$-step method comes from splitting up the domain via $N, k \in \mathbb N$ and $h := (X - x _ 0)/N$ so that $x _ n = x _ 0 + hn$, and then approximating $\pmb y _ {n + k}$ via

\[\sum^k_{j = 0} \alpha_j \pmb y_{n + j} = h \sum^k_{j = 0} \beta_j \pmb f(x_{n + j}, \pmb y_{n + j})\]

The stability polynomial is then defined

\[\pi(x) = \pi(x; z) := \sum^k_{j = 0} (\alpha_j - z \beta_j) x^j = \rho(x) - z\sigma(x)\]

(this depends on $z = h\lambda$, the “test value” from the Dahlquist test equation). The stability domain is then defined

\[S := \\{z \in \mathbb C \mid \pi(x; z) = 0 \implies |x| \le 1, \text{ and if }x\text{ is not a simple root, then }|x| < 1\\}\]

Can you state Dahlquist’s second barrier?


  • If a linear multi-step method is $A$-stable (i.e. $\mathbb C^{-} \subseteq S$), then it must be implicit and have order $p \le 2$ (equivalently: there are no explicit $A$-stable multistep methods, and no $A$-stable multistep methods of order greater than $2$).
  • The trapezium rule is the second-order $A$-stable linear multistep method with the smallest error constant (i.e. the constant $C$ in $e _ N \le Ch^p$); a numerical sanity check of its $A$-stability is sketched below.
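As a sketch, a random sampling of $\mathbb C^-$ illustrating (not proving) that the trapezium rule, with $\alpha = (-1, 1)$ and $\beta = (\frac 1 2, \frac 1 2)$, is $A$-stable:

```python
import numpy as np

# Trapezium rule: pi(x; z) = (1 - z/2) x - (1 + z/2), a single root in x
root = lambda z: (1 + z / 2) / (1 - z / 2)

rng = np.random.default_rng(0)
zs = -rng.uniform(0, 100, 1000) + 1j * rng.uniform(-100, 100, 1000)   # Re z < 0
print(np.all(np.abs(root(zs)) <= 1))   # True: every sampled z lies in S
```

Indeed $|1 + z/2| \le |1 - z/2|$ precisely when $\operatorname{Re}(z) \le 0$, so $\mathbb C^- \subseteq S$ holds exactly.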

Suppose we have the IVP:

  • an interval $I = [x _ 0, X] \subset \mathbb R$
  • an open subset $D \subset \mathbb R^d$
  • $\pmb y'(x) = \pmb f(x, \pmb y)$
  • $\pmb y(x _ 0) = \pmb y _ 0$

A linear $k$-step method comes from splitting up the domain via $N, k \in \mathbb N$ and $h := (X - x _ 0)/N$ so that $x _ n = x _ 0 + hn$, and then approximating $\pmb y _ {n + k}$ via

\[\sum^k_{j = 0} \alpha_j \pmb y_{n + j} = h \sum^k_{j = 0} \beta_j \pmb f(x_{n + j}, \pmb y_{n + j})\]

The stability polynomial is then defined

\[\pi(x) = \pi(x; z) := \sum^k_{j = 0} (\alpha_j - z \beta_j) x^j = \rho(x) - z\sigma(x)\]

(this depends on $z = h\lambda$, the “test value” from the Dahlquist test equation). The stability domain is then defined

\[S := \\{z \in \mathbb C \mid \pi(x; z) = 0 \implies |x| \le 1, \text{ and if }x\text{ is not a simple root, then }|x| < 1\\}\]

What’s the motivation behind these definitions?


Applying the method to the Dahlquist test equation $y' = \lambda y$ and writing $z = h\lambda$, the method becomes

\[\sum^k_{j = 0} \alpha_j y_{n + j} = \sum^k_{j = 0} \beta_j z y_{n+j}\]

or equivalently

\[\sum^k_{j = 0} (\alpha_j - \beta_j z) y_{n+j} = 0\]

This is a linear recurrence with constant coefficients. The stability polynomial is its characteristic polynomial, and it follows that the solution $y _ n$ has the form

\[y_n = p_1(n) r^n_1 + \cdots + p_\ell(n) r^n_\ell\]

where the $r _ j$ are the distinct roots (in $x$) of the polynomial $\pi(x; z)$, and each $p _ j$ is a polynomial of degree at most $m _ j - 1$, where $m _ j$ is the multiplicity of $r _ j$.

Then:

  • If $\pi(x; z)$ has a root outside the closed unit disc, $y _ n$ grows exponentially
  • If any $r _ j$ lies on the unit circle and is not simple, then $y _ n$ grows polynomially, $y _ n \sim n^{m _ j - 1}$
  • Otherwise (all roots lie in the closed unit disc, and any on the unit circle are simple), $y _ n$ remains bounded, and converges to $0$ if every root is strictly inside the disc

So the stability domain is the set of test values $z$ for which the approximation $y _ n$ produced by the method remains bounded, matching the fact that the exact solution of the corresponding Dahlquist test equation does not blow up when $\operatorname{Re}(z) \le 0$.
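A sketch simulating this recurrence for the Simpson $2$-step method at $z = -0.1$, which lies outside $S$ as checked earlier (the starting values $1, e^z$, taken from the exact solution of the test equation, are illustrative):

```python
import numpy as np

alpha = np.array([-1.0, 0.0, 1.0])    # Simpson 2-step method
beta = np.array([1/3, 4/3, 1/3])

def run_recurrence(z, y0, y1, N):
    c = alpha - z * beta               # the recurrence sum_j c_j y_{n+j} = 0
    ys = [y0, y1]
    for n in range(N - 1):
        ys.append(-(c[0] * ys[n] + c[1] * ys[n + 1]) / c[2])
    return np.array(ys)

z = -0.1                               # outside S for this method
ys = run_recurrence(z, 1.0, np.exp(z), 2000)
print(abs(ys[-1]))                     # huge, although the exact e^{zn} decays to 0
```

The parasitic root of modulus $\approx 1.034$ dominates, so $|y _ n|$ blows up even though the exact solution decays; this is exactly the behaviour the stability domain is designed to rule out.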

Suppose we have the IVP:

  • an interval $I = [x _ 0, X] \subset \mathbb R$
  • an open subset $D \subset \mathbb R^d$
  • $\pmb y'(x) = \pmb f(x, \pmb y)$
  • $\pmb y(x _ 0) = \pmb y _ 0$

A linear $k$-step method comes from splitting up the domain via $N, k \in \mathbb N$ and $h := (X - x _ 0)/N$ so that $x _ n = x _ 0 + hn$, and then approximating $\pmb y _ {n + k}$ via

\[\sum^k_{j = 0} \alpha_j \pmb y_{n + j} = h \sum^k_{j = 0} \beta_j \pmb f(x_{n + j}, \pmb y_{n + j})\]

The stability polynomial is then defined

\[\pi(x) = \pi(x; z) := \sum^k_{j = 0} (\alpha_j - z \beta_j) x^j = \rho(x) - z\sigma(x)\]

(this depends on $z = h\lambda$, the “test value” from the Dahlquist test equation). The stability domain is then defined

\[S := \\{z \in \mathbb C \mid \pi(x; z) = 0 \implies |x| \le 1, \text{ and if }x\text{ is not a simple root, then }|x| < 1\\}\]

What does it mean for such a method to be $A$-stable?


\[\mathbb C^- \subseteq S\]

(the same as for Runge-Kutta methods).



