Modified Richardson iteration

Modified Richardson iteration is an iterative method for solving a system of linear equations. It is similar to the Jacobi and Gauss–Seidel methods.

We seek the solution to a set of linear equations, expressed in matrix terms as

A x = b .

The Richardson iteration is

 
x^{(k+1)}  = x^{(k)} + \omega \left( b - A x^{(k)} \right),

where ω > 0 is a scalar parameter that has to be chosen so that the iteration converges; when the eigenvalues of A are real and positive, it suffices to take \omega < 2/\varrho(A), where \varrho(A) is the spectral radius of A.
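
A minimal sketch of the update rule in NumPy (the function name, stopping rule and tolerance below are illustrative choices, not part of the method itself):

import numpy as np

def richardson(A, b, omega, x0=None, tol=1e-10, max_iter=10000):
    # Modified Richardson iteration: repeat x <- x + omega * (b - A x).
    x = np.zeros_like(b, dtype=float) if x0 is None else x0.astype(float)
    for k in range(max_iter):
        r = b - A @ x                    # residual of the current iterate
        if np.linalg.norm(r) < tol:      # stop once the residual is negligible
            break
        x = x + omega * r                # the Richardson update
    return x, k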

It is easy to see that the method is consistent: if the iteration converges, i.e. if x^{(k+1)} = x^{(k)}, then \omega \left( b - A x^{(k)} \right) = 0, and since ω > 0, x^{(k)} has to be a solution of A x = b.
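
For instance, applying the sketch above to a small symmetric positive definite system (the matrix, right-hand side and ω are made-up illustrative values) and checking that the returned iterate indeed solves the system:

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])         # eigenvalues ~ 1.38 and ~ 3.62
b = np.array([1.0, 2.0])
x, k = richardson(A, b, omega=0.5)  # 0 < 0.5 < 2/varrho(A) ~ 0.55
print(np.allclose(A @ x, b))        # True: the limit is a solution of A x = b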

Convergence

Suppose that A is diagonalizable and that (\lambda_j, v_j) are the eigenvalues and eigenvectors of A. If we express b, x^{(k)} and the exact solution x^{\ast} in the eigenvector basis


b = \sum_j b_j v_j , \quad \quad x^{(k)} = \sum_j x_j^{(k)} v_j \quad \text{ and } \quad x^{\ast} = \sum_j x_j^{\ast} v_j ,

then we can rewrite the iteration as


x_j^{(k+1)} = x_j^{(k)} + \omega ( b_j - \lambda_j x_j^{(k)} ) = (1 - \omega \lambda_j) x_j^{(k)} +  \omega b_j .

Since \lambda_j x_j^{\ast} = b_j, we obtain the following formula for the error in each component:


x_j^{(k+1)} -x_j^{\ast} =  (1 - \omega \lambda_j) x_j^{(k)} + \omega \lambda_j x_j^{\ast} - x_j^{\ast} = (1 - \omega \lambda_j) (x_j^{(k)} - x_j^{\ast}) .

This error converges to 0 if |1 - \omega \lambda_j| < 1 for all eigenvalues \lambda_j. When the eigenvalues are real and positive, this is guaranteed if \omega is chosen such that 0 < \omega < 2/\varrho(A).
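
The analysis can be checked numerically (again using the made-up example matrix from above): compute the contraction factors |1 - \omega \lambda_j| and watch the error decay geometrically.

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])
lam = np.linalg.eigvalsh(A)              # real eigenvalues of the symmetric A
omega = 1.0 / np.abs(lam).max()          # lies inside (0, 2/varrho(A))
print(np.abs(1.0 - omega * lam))         # contraction factors, all below 1

x_star = np.linalg.solve(A, b)           # reference solution
x = np.zeros(2)
for k in range(10):
    x = x + omega * (b - A @ x)
    print(k, np.linalg.norm(x - x_star)) # error decays geometrically

For real positive eigenvalues, the largest contraction factor is minimized by taking \omega = 2/(\lambda_{\min} + \lambda_{\max}), although any ω in the interval above converges.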