Truncation Strategies for (Nested) Krylov Methods
Eric de Sturler
Swiss Center for Scientific Computing (SCSC-ETHZ)
Swiss Federal Institute of Technology Zurich
ETH Zentrum, RZ F-11
CH-8092 Zurich, Switzerland
In optimal methods based on the Arnoldi iteration, if a large number
of iterations is necessary, the storage requirements become excessive
and the iterations too expensive. Therefore, in practice one either
restarts after a given number of iterations m (as in GMRES(m)) or
truncates the set of Arnoldi vectors, typically by keeping only the
last m vectors. However, this often leads to a loss of superlinear
convergence or even to stagnation. Nested iterative methods and/or
more general truncation schemes offer the possibility of near-optimal
convergence with reasonable memory requirements and low iteration
cost, provided that the right vectors to keep for orthogonalization in
future iterations can be selected.
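As a minimal illustration of the restart idea, the sketch below solves a small hypothetical test system with SciPy's restarted GMRES; the matrix, dimensions, and restart length are assumptions chosen only for the example, not taken from the work described here.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import gmres

# Hypothetical nonsymmetric, diagonally dominant tridiagonal test matrix.
n = 200
A = diags([-1.0, 3.0, -1.2], offsets=[-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

# Full GMRES stores one Arnoldi vector per iteration; GMRES(m) caps the
# memory at m vectors by restarting, at the possible cost of losing the
# superlinear convergence of the full method.
x_restarted, info = gmres(A, b, restart=20, maxiter=500)

# info == 0 signals convergence to the default relative tolerance.
print(info, np.linalg.norm(b - A @ x_restarted))
```

For harder problems the choice of m matters: too small a restart length can make the restarted iteration stagnate even when full GMRES converges quickly.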
Recently proposed strategies seem to focus on removing certain parts of the
spectrum. However, these strategies generally assume that the spectrum lies
in the positive real half-plane, and are therefore not generally applicable.
Moreover, recent papers by Z. Strakos and A. Greenbaum show that even
for a favourable spectrum the convergence of full GMRES (which is optimal) can
still be arbitrarily bad (stagnating), depending on the eigenvectors of the
matrix; i.e., non-normality plays a large role as well.
Therefore, we propose a different strategy.
We compute the singular value decomposition of a small matrix Z
constructed from the iteration parameters, that is, from the
Hessenberg matrix in GMRES(m) or from the product of the matrices
that describe the projections in the nested method. This singular
value decomposition reveals exactly what role the orthogonalization on
any subspace of the space spanned by the Arnoldi vectors has played
in the convergence so far. Here we use the term Arnoldi vectors for
both the search vectors generated by GMRES or GCR and the search
vectors of the outer iteration when a nested method is used.
To be more precise, consider one or more vectors defined as products
of the matrix whose columns are the Arnoldi vectors with left singular
vectors of Z. A function of the corresponding singular values then
determines how much worse the convergence would have been had this
vector, or the space spanned by these vectors, not been projected out
in the previous iterations; this generalizes to arbitrary subspaces.
This function of the singular values thus quantifies the deterioration
from the optimal convergence, and this information can be used to
select the dimension and the basis of the space to be projected out
(orthogonalized upon) in future iterations.
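The mechanics of this construction can be sketched as follows: run m steps of the Arnoldi iteration, form the small (m+1) x m Hessenberg matrix, and compute its SVD. This is only an illustration of the pipeline under assumed data (a random test matrix); the particular function of the singular values used to rank subspaces is the subject of the work itself and is not reproduced here.

```python
import numpy as np

def arnoldi(A, b, m):
    """Arnoldi iteration: returns V (n x (m+1)) with orthonormal columns
    and the Hessenberg matrix H ((m+1) x m) with A @ V[:, :m] = V @ H."""
    n = b.size
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    V[:, 0] = b / np.linalg.norm(b)
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):          # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        V[:, j + 1] = w / H[j + 1, j]
    return V, H

# Hypothetical test problem: a well-separated spectrum plus a small
# nonsymmetric perturbation.
rng = np.random.default_rng(0)
n, m = 120, 20
A = np.diag(np.linspace(1.0, 10.0, n)) + 0.1 * rng.standard_normal((n, n))
b = rng.standard_normal(n)

V, H = arnoldi(A, b, m)

# SVD of the small matrix built from the iteration parameters; the
# singular values carry the information used to judge how much each
# direction has contributed to the convergence so far.
U, s, Wt = np.linalg.svd(H)
print(s)
```

Because Z is only (m+1) x m, the SVD is cheap relative to the iteration itself, which is what makes such a selection criterion practical.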
Preliminary numerical tests indicate that this approach leads to a very
effective truncation strategy.
E. de Sturler, "Nested Krylov methods based on GCR",
Technical Report 93-50, Faculty of Technical Mathematics and Informatics,
Delft University of Technology, Delft, The Netherlands, 1993
(accepted for publication in the Journal of Computational and Applied
Mathematics, North-Holland).