A solution of the finite difference equations can be obtained
iteratively. An initial value of the field u_{i,j} is guessed at all
interior points of the domain. The value of the field at grid point
(i,j) is then solved for, assuming that the field at the other grid
locations is known:
u_{i,j} = ( u_{i+1,j} + u_{i-1,j} + u_{i,j+1} + u_{i,j-1} ) / 4        (14)
The Jacobi method uses a separate ``array'' to store the new values of the field u_{i,j} at the interior grid points.
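As a minimal sketch (assuming the Laplace equation on a uniform grid with fixed Dirichlet boundary values; all names and the example boundary data are illustrative), one Jacobi sweep might look like:

```python
# A minimal Jacobi sketch for Laplace's equation on an n x n grid
# (assumes Dirichlet boundary values held fixed; names are illustrative).

def jacobi_step(u):
    """One Jacobi sweep: new values go into a separate array."""
    n = len(u)
    u_new = [row[:] for row in u]          # the new ``array''
    for i in range(1, n - 1):
        for j in range(1, n - 1):
            u_new[i][j] = 0.25 * (u[i + 1][j] + u[i - 1][j]
                                  + u[i][j + 1] + u[i][j - 1])
    return u_new

# Example: 5 x 5 grid, top boundary held at 1, all other values 0.
n = 5
u = [[0.0] * n for _ in range(n)]
u[0] = [1.0] * n
u = jacobi_step(u)
```

Because every right-hand-side value is read from the old array, the order in which the interior points are visited does not matter.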
The Gauss-Seidel method instead replaces the new value of the
field within the same ``array''. This corresponds to the
following form:
u^{new}_{i,j} = ( u^{old}_{i+1,j} + u^{new}_{i-1,j} + u^{old}_{i,j+1} + u^{new}_{i,j-1} ) / 4        (15)
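A sketch of the corresponding in-place sweep (again assuming the Laplace equation with fixed Dirichlet boundary values; names and example data are illustrative):

```python
# A minimal Gauss-Seidel sketch for Laplace's equation on an n x n grid
# (assumes fixed Dirichlet boundary values; names are illustrative).

def gauss_seidel_step(u):
    """One sweep updating u in place, so u[i-1][j] and u[i][j-1]
    already hold the new values when u[i][j] is computed."""
    n = len(u)
    for i in range(1, n - 1):
        for j in range(1, n - 1):
            u[i][j] = 0.25 * (u[i + 1][j] + u[i - 1][j]
                              + u[i][j + 1] + u[i][j - 1])
    return u

# Example: 5 x 5 grid with the top boundary held at 1.
n = 5
u = [[0.0] * n for _ in range(n)]
u[0] = [1.0] * n
u = gauss_seidel_step(u)
```

Note that, unlike Jacobi, the result now depends on the sweep order, since points below and to the left have already been updated.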
The iterations are repeated until convergence, which is checked by
calculating the largest change between the old and new fields
at all interior points of the domain at each iteration:
max_{i,j} | u^{new}_{i,j} - u^{old}_{i,j} | < ε        (16)
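The convergence test can be written as a small helper (a sketch; the name is illustrative):

```python
def max_change(u_old, u_new):
    """Largest change between old and new fields over the interior points."""
    n = len(u_old)
    return max(abs(u_new[i][j] - u_old[i][j])
               for i in range(1, n - 1)
               for j in range(1, n - 1))

# Iteration would stop once max_change(u_old, u_new) < eps
# for some chosen tolerance eps.
```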
This numerical solution is unconditionally stable, i.e., it will converge to a solution. The accuracy of the solution is a separate issue; it depends on the adequacy of the numerical grid to describe the regions of high curvature of the field and on the ability of the numerical grid to cover the domain.
The Gauss-Seidel method converges faster than the Jacobi method. Yet
it is a notoriously slow method, often requiring a very large number
of iterations to achieve convergence. A better rate of convergence
is achieved by the Successive Over-Relaxation
(SOR) method. In this approach, the old and new fields are
mixed via a parameter ω:
u^{*}_{i,j} = ( u^{old}_{i+1,j} + u^{new}_{i-1,j} + u^{old}_{i,j+1} + u^{new}_{i,j-1} ) / 4        (17)

u^{new}_{i,j} = (1 - ω) u^{old}_{i,j} + ω u^{*}_{i,j}        (18)
ω may change as a function of the iteration number. It is taken to be small for the first few iterations, increased to a value close to 1 for many iterations, and eventually taken larger than 1 to ``accelerate'' convergence in the latter stages of the iteration process. The choice of the best value for ω is discussed in ``Numerical Recipes'' in the section on SOR.
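Putting the pieces together, a complete SOR solve might be sketched as follows (assuming the Laplace equation with fixed Dirichlet boundary values; a constant ω is used here for simplicity rather than the iteration-dependent schedule described above; names and example data are illustrative):

```python
# A minimal SOR sketch for Laplace's equation with a fixed relaxation
# parameter omega (the notes suggest varying it with the iteration
# number; a constant value is used here for simplicity).

def sor_solve(u, omega=1.5, eps=1e-6, max_iter=10_000):
    """Sweep until the largest interior change falls below eps."""
    n = len(u)
    for it in range(1, max_iter + 1):
        delta = 0.0
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                # Gauss-Seidel value: old values above/right, new below/left
                u_star = 0.25 * (u[i + 1][j] + u[i - 1][j]
                                 + u[i][j + 1] + u[i][j - 1])
                # mix old and Gauss-Seidel values via omega
                u_new = (1.0 - omega) * u[i][j] + omega * u_star
                delta = max(delta, abs(u_new - u[i][j]))
                u[i][j] = u_new
        if delta < eps:            # convergence test on the largest change
            return u, it
    return u, max_iter

# Example: 5 x 5 grid, top boundary held at 1.
n = 5
u = [[0.0] * n for _ in range(n)]
u[0] = [1.0] * n
u, iters = sor_solve(u)
```

SOR converges for 0 < ω < 2 on this problem; ω = 1 reduces to plain Gauss-Seidel.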
Based on notes by Guillermo Marshall, University of Buenos Aires