MtxVec VCL
Solve the system A*X = B using one of the iterative methods described below.
The term "iterative method" refers to a wide range of techniques that use successive approximations to obtain more accurate solutions to a linear system at each step. There are two major groups of iterative methods: stationary and nonstationary methods. Stationary methods are older, simpler to understand and implement, but usually not as effective. Nonstationary methods are a relatively recent development; their analysis is usually harder to understand, but they can be highly effective. The nonstationary methods are based on the idea of sequences of orthogonal vectors.
The rate at which an iterative method converges depends greatly on the spectrum of the coefficient matrix. Hence, iterative methods usually involve a second matrix that transforms the coefficient matrix into one with a more favorable spectrum. The transformation matrix is called a preconditioner. A good preconditioner improves the convergence of the iterative method sufficiently to overcome the extra cost of constructing and applying it. Indeed, without a preconditioner the iterative method may even fail to converge.
Nonstationary methods:
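As a rough illustration of this idea (a generic Python/NumPy sketch with a made-up test matrix, not MtxVec code), the following compares the condition number of a matrix before and after applying a simple Jacobi (diagonal) preconditioner:

import numpy as np

# Hypothetical ill-conditioned SPD test matrix (illustration only).
n = 200
A = np.diag(np.linspace(1.0, 1e4, n)) + 0.01 * np.ones((n, n))

# Jacobi (diagonal) preconditioner: M = diag(A), applied as M^-1 * A.
M_inv = np.diag(1.0 / np.diag(A))
A_prec = M_inv @ A

# The preconditioned matrix has a much smaller condition number,
# which is what drives faster convergence of the iterative solvers below.
print("cond(A)        =", np.linalg.cond(A))
print("cond(M^-1 * A) =", np.linalg.cond(A_prec))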
Conjugate Gradient (CG)
The conjugate gradient method derives its name from the fact that it generates a sequence of conjugate (or orthogonal) vectors. These vectors are the residuals of the iterates. They are also the gradients of a quadratic functional, the minimization of which is equivalent to solving the linear system. CG is an extremely effective method when the coefficient matrix is symmetric positive definite, since storage for only a limited number of vectors is required.
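The following is a minimal Python/NumPy sketch of the unpreconditioned CG iteration, shown only to illustrate the algorithm (it is not the MtxVec implementation); the test matrix and tolerance are made up for the example:

import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    # Unpreconditioned CG for symmetric positive definite A.
    x = np.zeros_like(b)
    r = b - A @ x                      # residual
    p = r.copy()                       # first search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)      # step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # new direction, conjugate to the previous ones
        rs_old = rs_new
    return x

# Small SPD example
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))   # -> approx [0.0909, 0.6364]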
Conjugate Gradient on the Normal Equations
These methods are based on the application of the CG method to one of two forms of the normal equations for A*x = b. CGNE solves the system (A*A^T)y = b for y and then computes the solution x = A^T*y. When the coefficient matrix A is nonsymmetric and nonsingular, the normal equations matrices A*A^T and A^T*A will be symmetric and positive definite, and hence CG can be applied. The convergence may be slow, since the spectrum of the normal equations matrices will be less favorable than the spectrum of A.
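A minimal Python/NumPy sketch of CGNE follows, assuming only matrix-vector products with A and A^T are available; it illustrates the idea rather than the MtxVec implementation, and the test matrix is hypothetical:

import numpy as np

def cgne(A, b, tol=1e-10, max_iter=1000):
    # CG applied to the normal equations (A A^T) y = b, then x = A^T y.
    # A A^T is never formed explicitly; only two matvecs per iteration are used.
    y = np.zeros_like(b)
    r = b.copy()                       # residual of (A A^T) y = b with y = 0
    p = r.copy()
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ (A.T @ p)             # (A A^T) p via two matrix-vector products
        alpha = rs_old / (p @ Ap)
        y += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return A.T @ y                     # recover x from y

# Nonsymmetric, nonsingular example
A = np.array([[2.0, 1.0], [0.0, 3.0]])
b = np.array([3.0, 6.0])
print(cgne(A, b))   # -> approx [0.5, 2.0]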
Generalized Minimal Residual (GMRES)
The Generalized Minimal Residual method computes a sequence of orthogonal vectors (like MINRES) and combines these through a least-squares solve and update. However, unlike MINRES (and CG), it requires storing the whole sequence, so a large amount of storage is needed. For this reason, restarted versions of this method are used: computation and storage costs are limited by specifying a fixed number of vectors to be generated before the method is restarted. This method is useful for general nonsymmetric matrices.
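The sketch below illustrates restarted GMRES(m) in Python/NumPy: an Arnoldi process builds up to m orthogonal vectors, a small least-squares problem is solved, and the method restarts. It is an illustration only, not the MtxVec implementation, and the test matrix and parameters are made up:

import numpy as np

def gmres_restarted(A, b, m=20, tol=1e-10, max_restarts=50):
    # Restarted GMRES(m): Arnoldi + small least-squares solve per cycle.
    n = b.size
    x = np.zeros(n)
    for _ in range(max_restarts):
        r = b - A @ x
        beta = np.linalg.norm(r)
        if beta < tol:
            break
        Q = np.zeros((n, m + 1))       # orthonormal Krylov basis
        H = np.zeros((m + 1, m))       # upper Hessenberg matrix
        Q[:, 0] = r / beta
        k = 0
        for j in range(m):
            w = A @ Q[:, j]
            for i in range(j + 1):     # modified Gram-Schmidt orthogonalization
                H[i, j] = Q[:, i] @ w
                w -= H[i, j] * Q[:, i]
            H[j + 1, j] = np.linalg.norm(w)
            k = j + 1
            if H[j + 1, j] < 1e-14:    # happy breakdown: solution lies in this subspace
                break
            Q[:, j + 1] = w / H[j + 1, j]
        # Solve the small least-squares problem min || beta*e1 - H*y ||
        e1 = np.zeros(k + 1)
        e1[0] = beta
        y, *_ = np.linalg.lstsq(H[:k + 1, :k], e1, rcond=None)
        x += Q[:, :k] @ y
    return x

A = np.array([[4.0, 1.0, 0.0], [2.0, 5.0, 1.0], [0.0, 1.0, 3.0]])
b = np.array([5.0, 8.0, 4.0])
print(gmres_restarted(A, b, m=3))   # -> approx [1.0, 1.0, 1.0]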
BiConjugate Gradient (BiCG)
The Biconjugate Gradient method generates two CG-like sequences of vectors, one based on a system with the original coefficient matrix A, and one on A^T. Instead of orthogonalizing each sequence, they are made mutually orthogonal, or "bi-orthogonal". This method, like CG, uses limited storage. It is useful when the matrix is nonsymmetric and nonsingular; however, convergence may be irregular, and there is a possibility that the method will break down. BiCG requires a multiplication with the coefficient matrix and with its transpose at each iteration.
Stationary methods (these converge very slowly and are provided only for reference):
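A minimal Python/NumPy sketch of the BiCG iteration follows, without the breakdown safeguards a production solver would need and not representing the MtxVec implementation; the test matrix is hypothetical:

import numpy as np

def bicg(A, b, tol=1e-10, max_iter=1000):
    # BiConjugate Gradient for a general nonsingular A.
    x = np.zeros_like(b)
    r = b - A @ x
    r_tilde = r.copy()                 # shadow residual for the A^T sequence
    p = r.copy()
    p_tilde = r_tilde.copy()
    rho_old = r_tilde @ r
    for _ in range(max_iter):
        q = A @ p
        q_tilde = A.T @ p_tilde        # one multiplication with A^T per iteration
        alpha = rho_old / (p_tilde @ q)
        x += alpha * p
        r -= alpha * q
        r_tilde -= alpha * q_tilde
        if np.linalg.norm(r) < tol:
            break
        rho_new = r_tilde @ r
        if rho_new == 0.0:             # breakdown: the bi-orthogonal sequences collapse
            break
        beta = rho_new / rho_old
        p = r + beta * p
        p_tilde = r_tilde + beta * p_tilde
        rho_old = rho_new
    return x

A = np.array([[3.0, 2.0], [1.0, 4.0]])
b = np.array([7.0, 9.0])
print(bicg(A, b))   # -> approx [1.0, 2.0]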
Jacobi
The Jacobi method is based on solving for every variable locally with respect to the other variables; one iteration of the method corresponds to solving for every variable once. The resulting method is easy to understand and implement, but convergence is slow.
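A minimal Python/NumPy sketch of the Jacobi iteration, assuming a nonzero diagonal (illustration only, not the MtxVec implementation; the test matrix is made up):

import numpy as np

def jacobi(A, b, tol=1e-10, max_iter=10000):
    # Jacobi iteration: solve each variable from its own equation,
    # using the previous iterate for all other variables.
    D = np.diag(A)                  # diagonal of A
    R = A - np.diagflat(D)          # off-diagonal part
    x = np.zeros_like(b)
    for _ in range(max_iter):
        x_new = (b - R @ x) / D     # every component updated from the old x
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Diagonally dominant example, for which Jacobi is guaranteed to converge
A = np.array([[10.0, 1.0], [2.0, 10.0]])
b = np.array([12.0, 22.0])
print(jacobi(A, b))   # -> approx [1.0, 2.0]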
Gauss-Seidel
The Gauss-Seidel method is like the Jacobi method, except that it uses updated values as soon as they are available. In general, if the Jacobi method converges, the Gauss-Seidel method will converge faster than the Jacobi method, though still relatively slowly.
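The same example solved with Gauss-Seidel, again as a generic Python/NumPy sketch rather than the MtxVec implementation; note how each updated component is used immediately within the same sweep:

import numpy as np

def gauss_seidel(A, b, tol=1e-10, max_iter=10000):
    # Gauss-Seidel: like Jacobi, but updated values are used as soon as available.
    n = b.size
    x = np.zeros_like(b)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            # x[:i] already holds the new values from this sweep
            s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.linalg.norm(x - x_old) < tol:
            break
    return x

A = np.array([[10.0, 1.0], [2.0, 10.0]])
b = np.array([12.0, 22.0])
print(gauss_seidel(A, b))   # -> approx [1.0, 2.0]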
Reference: R. Barrett, M. Berry, T. F. Chan, J. Demmel, J. Donato, J. Dongarra, V. Eijkhout, R. Pozo, C. Romine, and H. van der Vorst, Templates for the Solution of Linear Systems: Building Blocks for Iterative Methods, SIAM, Philadelphia, PA, 1994.
Copyright (c) 1999-2025 by Dew Research. All rights reserved.