We describe new algorithms of the Locally Optimal Block Preconditioned Conjugate Gradient (LOBPCG) method for symmetric eigenvalue problems, based on local optimization of a three-term recurrence. To be able to compare different methods in this class numerically, with different preconditioners, we suggest a common system of model tests using random preconditioners and initial guesses. As the "ideal" control algorithm, we propose the standard preconditioned conjugate gradient method for finding an eigenvector as an element of the null space of the corresponding homogeneous system of linear equations, under the assumption that the eigenvalue is known. We recommend that every new preconditioned eigensolver be compared with this "ideal" algorithm on our model test problems in terms of speed of convergence, cost of every iteration, and memory requirements. We provide such a comparison for our LOBPCG method. Numerical results establish that our algorithm is practically as efficient as the "ideal" algorithm when the same preconditioner is used in both methods. We also show numerically that the LOBPCG method provides approximations to the first eigenpairs of about the same quality as those obtained by the much more expensive global optimization method on the same generalized block Krylov subspace. Finally, direct numerical comparisons with the Jacobi-Davidson method show that our method is more robust and converges almost twice as fast.
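To make the local optimization of the three-term recurrence concrete, here is a minimal single-vector sketch (not the authors' block MATLAB implementation): at each step the new iterate minimizes the Rayleigh quotient over the three-dimensional subspace spanned by the current iterate, the preconditioned residual, and the previous iterate, via a small Rayleigh-Ritz procedure. All names and the default identity preconditioner are illustrative assumptions.

```python
import numpy as np

def lobpcg_single(A, x0, T=None, maxit=200, tol=1e-8):
    """Single-vector locally optimal PCG sketch for the smallest
    eigenpair of a symmetric matrix A.

    Each iterate minimizes the Rayleigh quotient over
    span{x_k, T r_k, x_{k-1}} (a three-term recurrence),
    computed by Rayleigh-Ritz on that small subspace.
    T is an optional preconditioner, applied as a function."""
    if T is None:
        T = lambda r: r  # identity preconditioner (assumption)
    x = x0 / np.linalg.norm(x0)
    x_prev = None
    lam = x @ A @ x  # Rayleigh quotient of the initial guess
    for _ in range(maxit):
        r = A @ x - lam * x          # eigenvalue residual
        if np.linalg.norm(r) < tol:
            break
        w = T(r)                     # preconditioned residual
        # Trial subspace: current iterate, preconditioned residual,
        # and (after the first step) the previous iterate.
        cols = [x, w] if x_prev is None else [x, w, x_prev]
        Q, _ = np.linalg.qr(np.column_stack(cols))  # orthonormal basis
        # Rayleigh-Ritz on the (at most 3-dimensional) subspace.
        vals, vecs = np.linalg.eigh(Q.T @ A @ Q)
        lam = vals[0]
        x_new = Q @ vecs[:, 0]
        x_prev, x = x, x_new / np.linalg.norm(x_new)
    return lam, x
```

For example, on a diagonal matrix with well-separated eigenvalues the recurrence locates the smallest eigenpair without any preconditioning; supplying a good preconditioner `T` accelerates convergence on ill-conditioned problems.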
MATLAB code for the LOBPCG method and the Preconditioned Eigensolvers Benchmarking is available at http://www-math.cudenver.edu/~aknyazev/software/CG/
The talk is based on the paper "Toward the Optimal Preconditioned Eigensolver: Locally Optimal Block Preconditioned Conjugate Gradient Method," published as Technical Report UCD-CCM 149, 2000, at the Center for Computational Mathematics, University of Colorado at Denver. Submitted to SIAM.