Worst-Case Convergence Analysis of Inexact Gradient and Newton Methods Through Semidefinite Programming Performance Estimation
Abstract
We provide new tools for worst-case performance analysis of the gradient (or steepest descent) method of Cauchy for smooth strongly convex functions, and of Newton's method for self-concordant functions, including the case of inexact search directions. The analysis uses semidefinite programming performance estimation, as pioneered by Drori and Teboulle [Math. Program., 145 (2014), pp. 451--482], and extends recent performance estimation results for the method of Cauchy by the authors [Optim. Lett., 11 (2017), pp. 1185--1199]. To illustrate the applicability of the tools, we demonstrate a novel complexity analysis of short-step interior point methods using inexact search directions. As an example in this framework, we sketch how to give a rigorous worst-case complexity analysis of a recent interior point method by Abernethy and Hazan [PMLR, 48 (2016), pp. 2520--2528].
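To give a flavor of the performance estimation approach mentioned above, the following is a minimal sketch (not the paper's own formulation) of how the worst-case contraction of a single exact gradient step x1 = x0 - gamma*grad f(x0), over all L-smooth and mu-strongly convex functions, can be computed as a small semidefinite program in the Gram matrix of (x0 - x*, grad f(x0)), using the standard interpolation conditions for that function class. The parameter values mu, L, gamma and the use of cvxpy are illustrative assumptions, not taken from the paper.

```python
# Illustrative performance estimation SDP for one exact gradient step on an
# L-smooth, mu-strongly convex function; a sketch, not the paper's method.
import cvxpy as cp

mu, L, gamma = 1.0, 10.0, 2.0 / 11.0   # assumed values; gamma = 2/(L+mu)

G = cp.Variable((2, 2), PSD=True)      # Gram matrix of [x0 - x*, g0]
d = cp.Variable()                      # f(x0) - f(x*)

# Quadratic term shared by the two interpolation constraints for the class
# of L-smooth, mu-strongly convex functions (points x0 and x*, with g* = 0).
q = (1.0 / L) * G[1, 1] + mu * G[0, 0] - (2.0 * mu / L) * G[0, 1]
c = 1.0 / (2.0 * (1.0 - mu / L))

constraints = [
    G[0, 0] <= 1,                      # normalization ||x0 - x*||^2 <= 1
    d >= c * q,                        # interpolation condition at (x0, x*)
    G[0, 1] - d >= c * q,              # interpolation condition at (x*, x0)
]

# ||x1 - x*||^2 after one gradient step, written linearly in G.
obj = G[0, 0] - 2 * gamma * G[0, 1] + gamma ** 2 * G[1, 1]
prob = cp.Problem(cp.Maximize(obj), constraints)
prob.solve()

print("SDP worst-case contraction   :", prob.value)
print("known bound ((L-mu)/(L+mu))^2:", ((L - mu) / (L + mu)) ** 2)
```

For the step size gamma = 2/(L+mu), the SDP value should match the classical contraction factor ((L-mu)/(L+mu))^2, attained by a quadratic whose Hessian has eigenvalue mu along x0 - x*.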