Enlarged Krylov Subspace Conjugate Gradient Methods for Reducing Communication

Journal Articles · SIAM Journal on Matrix Analysis and Applications · Year: 2016

Abstract

In this paper we introduce a new approach for reducing communication in Krylov subspace methods that consists of enlarging the Krylov subspace by a maximum of $t$ vectors per iteration, based on a domain decomposition of the graph of $A$. The obtained enlarged Krylov subspace $\mathscr{K}_{k,t}(A,r_0)$ is a superset of the Krylov subspace $\mathcal{K}_k(A,r_0)$, that is, $\mathcal{K}_k(A,r_0) \subset \mathscr{K}_{k,t}(A,r_0)$. Thus, we search for the solution of the system $Ax=b$ in $\mathscr{K}_{k,t}(A,r_0)$ instead of $\mathcal{K}_k(A,r_0)$. Moreover, we show that the enlarged Krylov projection subspace methods converge in fewer iterations and lead to parallelizable algorithms that require less communication than classical Krylov methods.
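To make the inclusion concrete, here is a brief sketch of the construction; the operator notation $T_i$ is used here for illustration only, and the paper gives the precise definition. If the domain decomposition of the graph of $A$ splits the initial residual $r_0$ into $t$ vectors $T_1(r_0), \dots, T_t(r_0)$ with $r_0 = \sum_{i=1}^{t} T_i(r_0)$, the enlarged Krylov subspace can be written as
$$\mathscr{K}_{k,t}(A,r_0) = \operatorname{span}\{\, T_i(r_0),\; A\,T_i(r_0),\; \dots,\; A^{k-1} T_i(r_0) \;:\; 1 \le i \le t \,\}.$$
Since $r_0$ is the sum of the $T_i(r_0)$, every vector $A^{j} r_0$ with $j \le k-1$ lies in this span, which yields the inclusion $\mathcal{K}_k(A,r_0) \subset \mathscr{K}_{k,t}(A,r_0)$ stated above.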

Cite

Laura Grigori, Sophie Moufawad, Frédéric Nataf. Enlarged Krylov Subspace Conjugate Gradient Methods for Reducing Communication. SIAM Journal on Matrix Analysis and Applications, 2016, 37 (2), pp.744-773. ⟨10.1137/140989492⟩. ⟨hal-01357899⟩