We are interested in computing a few smallest eigenvalues and the corresponding eigenvectors of a large symmetric positive definite matrix A arising from finite element approximations of linear elasticity problems. Typically, the matrix A is large and sparse. To solve the problem in a reasonable computing time, one must use optimal (or nearly optimal) techniques, which have been actively developed over the last two decades.
The usual methods for solving eigenvalue problems are often based on the amplification of the components associated with the smallest eigenvalues by repeated multiplication of a vector by the inverse matrix. This applies to such popular techniques as subspace iteration, Rayleigh quotient iteration, and the Lanczos method. However, in large-scale finite element problems it is often desirable to avoid a costly inversion or, more precisely, an exact factorization of the matrix A. The simplest remedy is to use a preconditioned iterative procedure instead of a direct method whenever the algorithm requires solving a linear system with A. In particular, multigrid methods allow us to construct optimal preconditioners for a sufficiently wide range of industrial applications.
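To illustrate the idea of replacing an exact factorization with an iterative solve, the following minimal sketch runs inverse iteration for the smallest eigenpair, using SciPy's conjugate gradient method wherever a linear system with A must be solved. The 1D Laplacian is only a hypothetical stand-in for a finite element stiffness matrix; the iteration counts are illustrative, not tuned.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg

# Model SPD matrix: 1D Laplacian standing in for a FE stiffness matrix.
n = 200
A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")

# Inverse iteration: repeatedly apply A^{-1} to a vector, but solve
# A y = x approximately with CG instead of factorizing A.
x = np.random.default_rng(0).standard_normal(n)
x /= np.linalg.norm(x)
for _ in range(30):
    y, info = cg(A, x)          # preconditioner M could be passed here
    x = y / np.linalg.norm(y)

lam = x @ (A @ x)               # Rayleigh quotient approximates lambda_min
exact = 2.0 - 2.0 * np.cos(np.pi / (n + 1))  # known smallest eigenvalue
print(lam, exact)
```

In practice the unpreconditioned CG call would be replaced by one with a multigrid preconditioner (the `M` argument of `cg`), which is exactly the setting the text describes.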
In the early 1980s, Brandt, McCormick and Ruge proposed the direct application of a multigrid technique to the partial eigenvalue problem of computing a few smallest eigenvalues and the corresponding eigenvectors of a large symmetric positive definite matrix. Their method solves the eigenvalue problem on a sequence of nested grids, using an interpolant of the solution on each grid as the initial guess for the next one and improving it with the Full Approximation Scheme (FAS), applied as an inner nonlinear multigrid method.
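The nested-grid part of this strategy can be sketched as follows. This is not the Brandt-McCormick-Ruge method itself: the inner FAS multigrid correction is replaced here by a few CG-based inverse iteration steps, and the 1D Laplacian, grid sizes, and level count are all illustrative assumptions. The sketch only shows how a coarse-grid eigenvector, interpolated upward, serves as the initial guess on each finer grid.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg

def laplacian(n):
    # 1D model stiffness matrix with n interior points (Dirichlet b.c.)
    return diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")

def interpolate(x):
    # Linear interpolation from n interior points to 2n+1 interior points
    fine = np.zeros(2 * len(x) + 1)
    fine[1::2] = x                            # coarse points carry over
    fine[2:-1:2] = 0.5 * (x[:-1] + x[1:])     # midpoints between them
    fine[0] = 0.5 * x[0]                      # midpoints next to the
    fine[-1] = 0.5 * x[-1]                    # zero boundary values
    return fine

# Coarsest grid: solve the small eigenproblem directly.
n = 24
A = laplacian(n)
w, V = np.linalg.eigh(A.toarray())
x = V[:, 0]

# Refine through three nested grids; on each, the interpolant of the
# previous solution is improved by inexact inverse iteration (a stand-in
# for the inner FAS multigrid correction).
for _ in range(3):
    x = interpolate(x)
    n = 2 * n + 1
    A = laplacian(n)
    for _ in range(5):
        y, info = cg(A, x)
        x = y / np.linalg.norm(y)

lam = x @ (A @ x)
exact = 2.0 - 2.0 * np.cos(np.pi / (n + 1))
print(lam, exact)
```

Because the interpolated coarse-grid eigenvector is already a good approximation, only a few correction steps per level are needed, which is the source of the method's near-optimal cost.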
In the present study we carry out an experimental investigation of this method on standard linear elasticity test problems. Based on these results, we give practical advice on the optimal choice of the user-defined parameters.