15  Runge-Kutta Methods

Runge-Kutta methods generalize numerical integration by polynomial interpolation to differential equations. Unlike Taylor series methods, they do not require calculating extra derivatives.

Indeed, by the fundamental theorem of calculus, we know \[x(t+\Delta t) = x(t) + \int_t^{t+\Delta t} x'(\tau)\, d\tau,\] where \(x'\) is given by the usual differential equation \[x'(t) = f(x,t).\]

15.1 Second Order

Let’s work out the case of linear interpolation for a differential equation \[x'(t) = f(x,t).\]

We can substitute it into the above to obtain \[x(t+\Delta t) = x(t) + \int_t^{t+\Delta t} f(x,\tau) d\tau.\]

The trapezoid rule would estimate the integral \[\int_t^{t+\Delta t} f(x,\tau) d\tau \approx \frac {f(x(t),t) + f(x(t+\Delta t),t+\Delta t)}2 \Delta t,\] but we immediately see an issue: this requires \(x(t+\Delta t)\), which is exactly the value we’re trying to find!

To get around this, we will estimate \(x(t+\Delta t)\) using the BLA – Euler’s method – and then use that estimate in the expression above: \[x(t + \Delta t) \approx x(t) + x'(t)\Delta t = x(t) + f(x,t)\Delta t,\] resulting in a final estimate

\[\begin{align*} \int_t^{t+\Delta t} f(x,\tau) d\tau &\approx \frac {f(x(t),t) + f(x(t+\Delta t),t+\Delta t)}2 \Delta t\\ &\approx\frac {f(x(t),t) + f(x(t) + f(x(t),t)\Delta t,t+\Delta t)}2 \Delta t \end{align*}\]

This is typically written in the following form: \[\begin{align*} F_1 &= f(x,t)\Delta t\\ F_2 &= f(x+F_1, t+\Delta t) \Delta t\\ x(t+\Delta t) &\approx x(t) + \frac{F_1 + F_2}{2} \end{align*}\]
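As a concrete illustration, here is a minimal Python sketch of this two-stage step (the function name rk2_step and the test problem are my own choices, not from the text):

```python
def rk2_step(f, x, t, dt):
    """One second-order Runge-Kutta step for x'(t) = f(x, t),
    following the F1/F2 form above (the explicit trapezoid method)."""
    F1 = f(x, t) * dt            # Euler increment
    F2 = f(x + F1, t + dt) * dt  # slope at the Euler-predicted endpoint
    return x + (F1 + F2) / 2


# Example: x' = x with x(0) = 1, stepped to t = 1; the answer should be near e.
x, t, dt = 1.0, 0.0, 0.1
for _ in range(10):
    x = rk2_step(lambda x, t: x, x, t, dt)
    t += dt
print(x)  # ≈ 2.714, versus e ≈ 2.71828
```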

The name “second order” is in reference to the error: the local error from \(t\) to \(t+\Delta t\) is on the order of \(\Delta t^3\), so the expected global error from the initial condition to the value you’re looking for is on the order of \(\Delta t^2\).
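As a quick numerical sanity check of this claim (a sketch, using the same x' = x test problem as above), halving \(\Delta t\) should cut the global error at \(t = 1\) by roughly a factor of 4:

```python
import math

def rk2_solve(f, x0, t0, t1, n):
    """Integrate x'(t) = f(x, t) from t0 to t1 with n second-order RK steps."""
    x, t, dt = x0, t0, (t1 - t0) / n
    for _ in range(n):
        F1 = f(x, t) * dt
        F2 = f(x + F1, t + dt) * dt
        x, t = x + (F1 + F2) / 2, t + dt
    return x

err_coarse = abs(rk2_solve(lambda x, t: x, 1.0, 0.0, 1.0, 10) - math.e)
err_fine = abs(rk2_solve(lambda x, t: x, 1.0, 0.0, 1.0, 20) - math.e)
print(err_coarse / err_fine)  # ≈ 4, consistent with O(Δt²) global error
```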

15.2 Higher Order

In general, the scheme is as follows:

  • Select nodes \(\tilde t_k\) in \((t,t+\Delta t)\).
  • Estimate each \(x(\tilde t_k)\) by applying Euler’s method (or even a Taylor method). Call that approximation \(\tilde x_k\).
  • Interpolate the points \((\tilde t_k,f(\tilde x_k,\tilde t_k))\) by a polynomial \(P\).
  • Integrate \(\int_t^{t+\Delta t} P(\tau)d\tau\).

When the nodes are equally spaced, one can replace the interpolation and integration steps by a quadrature rule. The formulas you see in books and online are all of this form; the “Butcher tableaus” which appear in the most general forms of the Runge-Kutta method essentially summarize a quadrature scheme.
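To make this concrete, here is a rough sketch of one interpolate-and-integrate step, assuming equally spaced nodes and NumPy's polynomial fitting (the function name and node count are my own choices; real Runge-Kutta formulas bake this computation into fixed coefficients):

```python
import numpy as np

def interp_rk_step(f, x, t, dt, num_nodes=3):
    """One step of the interpolate-and-integrate scheme described above."""
    nodes = np.linspace(t, t + dt, num_nodes)               # the nodes t~_k
    x_est = [x + f(x, t) * (tk - t) for tk in nodes]        # Euler estimates x~_k
    samples = [f(xk, tk) for xk, tk in zip(x_est, nodes)]   # f(x~_k, t~_k)
    coeffs = np.polyfit(nodes, samples, num_nodes - 1)      # interpolating polynomial P
    antideriv = np.polyint(coeffs)                          # antiderivative of P
    return x + np.polyval(antideriv, t + dt) - np.polyval(antideriv, t)

# One step for x' = x from x(0) = 1:
print(interp_rk_step(lambda x, t: x, 1.0, 0.0, 0.1))  # ≈ 1.105, vs e^0.1 ≈ 1.10517
```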

The most common version is 4th order Runge-Kutta, usually abbreviated RK4. The formula for the step \(t\to t+\Delta t\) for \(x'(t) = f(x,t)\) is as follows:

\[\begin{align*} f_1 &= f(x,t)\\ f_2 &= f\left(x + f_1 \frac{\Delta t}{2}, t + \frac{\Delta t}{2}\right)\\ f_3 &= f\left(x + f_2\frac{\Delta t}{2}, t + \frac{\Delta t}{2}\right)\\ f_4 &= f(x + f_3 \Delta t, t + \Delta t) \end{align*}\]

and then

\[x(t+\Delta t) \approx x(t) + (f_1 + 2f_2 + 2f_3 + f_4) \frac{\Delta t}{3!}.\]

Compare this to Simpson’s rule! When \(f\) does not depend on \(x\), the step is exactly Simpson’s rule applied to \(\int_t^{t+\Delta t} f(\tau)\,d\tau\).
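Here is a sketch of RK4 in code, together with a numerical check of the Simpson’s rule connection when \(f\) depends only on \(t\) (the names rk4_step and g are my own choices):

```python
import math

def rk4_step(f, x, t, dt):
    """One classical RK4 step for x'(t) = f(x, t), following the formulas above."""
    f1 = f(x, t)
    f2 = f(x + f1 * dt / 2, t + dt / 2)
    f3 = f(x + f2 * dt / 2, t + dt / 2)
    f4 = f(x + f3 * dt, t + dt)
    return x + (f1 + 2 * f2 + 2 * f3 + f4) * dt / 6

# When f depends only on t, the RK4 step is Simpson's rule for the integral of f.
g = lambda x, t: math.cos(t)
dt = 0.3
rk4 = rk4_step(g, 0.0, 0.0, dt)
simpson = (g(0, 0) + 4 * g(0, dt / 2) + g(0, dt)) * dt / 6
print(abs(rk4 - simpson))  # 0.0 up to rounding
```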

Going to higher order seems rare, but different quadratures aren’t uncommon. The most general expression for fourth order is

\[\begin{align*} f_1 &= f(x,t)\\ f_2 &= f\left(x + \beta_1 f_1 \Delta t, t + \alpha_1 \Delta t\right)\\ f_3 &= f\left(x + (\beta_2 f_1 + \beta_3 f_2)\Delta t, t + \alpha_2 \Delta t\right)\\ f_4 &= f(x + (\beta_4 f_1 + \beta_5 f_2 + \beta_6 f_3) \Delta t, t + \alpha_3 \Delta t) \end{align*}\]

\[x(t+\Delta t) \approx x(t) + (\gamma_1f_1 + \gamma_2f_2 + \gamma_3f_3 + \gamma_4f_4) \frac{\Delta t}{3!},\]

subject to some rather intricate constraints on the \(\alpha_i\), \(\beta_i\), \(\gamma_i\), which are essentially imposed by how they arise from a concrete choice of nodes and the attendant interpolating polynomial/quadrature. For example, \(\sum \gamma_i = 6\), but not all the constraints are linear.
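To make the tableau remark concrete, here is a sketch of a generic explicit Runge-Kutta step driven by Butcher-tableau data (the function name is my own; the weights below are normalized to sum to 1, i.e. weights[i] plays the role of \(\gamma_i/3!\) in the notation above). Instantiating it with nodes \(0, \tfrac12, \tfrac12, 1\) and weights \((1,2,2,1)/6\) recovers RK4:

```python
def explicit_rk_step(f, x, t, dt, alphas, betas, weights):
    """One explicit Runge-Kutta step described by Butcher-tableau data.

    alphas[i]  -- node offset of stage i, as a fraction of dt
    betas[i]   -- coefficients of the earlier stages used to build stage i
    weights[i] -- quadrature weights, normalized to sum to 1
    """
    stages = []
    for a, b_row in zip(alphas, betas):
        xi = x + dt * sum(b * s for b, s in zip(b_row, stages))
        stages.append(f(xi, t + a * dt))
    return x + dt * sum(w * s for w, s in zip(weights, stages))


# The classic RK4 tableau.
alphas = [0, 0.5, 0.5, 1]
betas = [[], [0.5], [0, 0.5], [0, 0, 1]]
weights = [1 / 6, 2 / 6, 2 / 6, 1 / 6]
print(explicit_rk_step(lambda x, t: x, 1.0, 0.0, 0.1, alphas, betas, weights))
# ≈ 1.10517, matching one RK4 step for x' = x
```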