Brain Dump

Convergence

Tags
math

A sequence diverges when its terms move away from any fixed solution, giving ever larger and less accurate values. A sequence converges when it stabilises towards a fixed number (such as a root) as the iterations progress. Continuing the sequence only improves the approximation of the root, so we usually stop once the value is accurate enough.

For example, consider the first few values of the recurrence relation \( x_{n+1} = \frac{x_n^5+3}{5} \) beginning with \( x_0 = -1.5 \).

\begin{align*} x_0 &= -1.5 \\
x_1 &= -0.91875 \\
x_2 &= 0.46907 \\
x_3 &= 0.60454 \\
x_4 &= 0.61614 \\
x_5 &= 0.61776 \\
x_6 &= 0.61799 \\
x_7 &= 0.61802 \\
x_8 &= 0.61803 \end{align*}

After 8 iterations the value of the sequence converges (stabilises) to \( 0.6180 \) to 4 decimal places.
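
To make the iteration concrete, here's a minimal Python sketch of the same fixed-point iteration. The function name `fixed_point_iterate`, the tolerance `1e-5`, and the iteration cap are my own illustrative choices, not part of the original example:

```python
def fixed_point_iterate(f, x0, tol=1e-5, max_iters=100):
    """Iterate x_{n+1} = f(x_n) until successive values differ by less than tol."""
    x = x0
    for n in range(1, max_iters + 1):
        x_next = f(x)
        if abs(x_next - x) < tol:
            return x_next, n  # converged: the sequence has stabilised
        x = x_next
    raise RuntimeError("sequence did not converge within max_iters iterations")

# The recurrence from the example above: x_{n+1} = (x_n^5 + 3) / 5
root, iters = fixed_point_iterate(lambda x: (x**5 + 3) / 5, x0=-1.5)
print(f"converged to {root:.5f} after {iters} iterations")  # -> roughly 0.61803
```

The same recurrence also illustrates divergence: starting from \( x_0 = 2 \) instead, the first few iterates are \( 7 \), \( 3362 \), and so on, growing without bound rather than settling on a root.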

For another example, in gradient descent, when the gradient at the current point reaches (approximately) \( 0 \), we say the method has converged to a local minimum or maximum.
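
As a sketch of that idea, here's a minimal Python gradient descent on \( f(x) = (x-2)^2 \), stopping once the gradient is numerically zero. The learning rate, tolerance, and names are illustrative assumptions:

```python
def gradient_descent(grad, x0, lr=0.1, tol=1e-6, max_iters=10_000):
    """Step downhill until the gradient is numerically zero."""
    x = x0
    for n in range(1, max_iters + 1):
        g = grad(x)
        if abs(g) < tol:
            return x, n  # gradient ~ 0: a stationary point (a local minimum here)
        x -= lr * g
    raise RuntimeError("did not converge within max_iters iterations")

# Minimise f(x) = (x - 2)^2, whose gradient is f'(x) = 2(x - 2).
minimum, iters = gradient_descent(lambda x: 2 * (x - 2), x0=-5.0)
print(f"converged to x = {minimum:.4f} after {iters} iterations")  # -> roughly 2.0
```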