Convergence Analysis of Block Majorize-Minimize Subspace Approach
Abstract
We consider the minimization of a differentiable function F defined on R^N that has a Lipschitz-continuous gradient but is not necessarily convex. We propose an accelerated gradient descent approach combining three strategies, namely (i) a variable metric derived from the majorization-minimization principle; (ii) a subspace strategy incorporating information from past iterates; (iii) a block alternating update. Under the assumption that F satisfies the Kurdyka-Łojasiewicz property, we give conditions under which the sequence generated by the resulting block majorize-minimize subspace algorithm converges to a critical point of the objective function, and we exhibit convergence rates for its iterates.
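To make the three ingredients concrete, the sketch below shows one generic block majorize-minimize subspace iteration in Python. It is an illustrative reading of the abstract, not the authors' algorithm: the helpers grad, metric and blocks, and the memory-gradient choice of subspace (current block gradient plus previous block direction), are assumptions introduced here.

```python
import numpy as np

# Minimal illustrative sketch of one block MM subspace iteration, assuming:
#   grad(x, j)   -> gradient of F restricted to block j          (assumed helper)
#   metric(x, j) -> SPD majorant matrix A_j(x) for block j       (assumed helper)
#   blocks       -> list of index arrays partitioning {0,...,N-1} (assumed)
# The subspace is spanned by the current block gradient and the previous
# direction used for that block (a memory-gradient choice, assumed here).

def block_mm_subspace_step(x, j, blocks, grad, metric, prev_dir):
    idx = blocks[j]
    g = grad(x, j)                      # gradient of F w.r.t. block j
    A = metric(x, j)                    # MM majorant metric on block j
    # Columns of D span the search subspace for block j
    if prev_dir is None:
        D = -g[:, None]
    else:
        D = np.column_stack((-g, prev_dir))
    # Minimize the quadratic majorant over the subspace:
    #   u* = argmin_u  g^T D u + 0.5 * u^T (D^T A D) u
    B = D.T @ A @ D
    u = np.linalg.lstsq(B, -D.T @ g, rcond=None)[0]
    d = D @ u                           # update direction for block j
    x_new = x.copy()
    x_new[idx] = x[idx] + d
    return x_new, d
```

In a full scheme, one would cycle over the blocks and store the last direction of each block so that it can enter that block's subspace at the next pass.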