Preprint, 2013

Fast Convergence of Stochastic Gradient Descent under a Strong Growth Condition

Abstract

We consider optimizing a smooth convex function $f$ that is the average of a set of differentiable functions $f_i$, under the assumption considered by Solodov [1998] and Tseng [1998] that the norm of each gradient $f_i'$ is bounded by a linear function of the norm of the average gradient $f'$. We show that under this assumption the basic stochastic gradient method with a sufficiently small constant step size has an $O(1/k)$ convergence rate, and has a linear convergence rate if $f$ is strongly convex.
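To make the setting concrete, here is a minimal sketch (not the authors' code, and all names here are illustrative) of constant step-size SGD in NumPy. It uses a consistent least-squares problem, where every component $f_i$ shares the minimizer, so the strong growth condition holds and linear convergence is expected for this strongly convex $f$; the step size is a heuristic based on the largest per-component Lipschitz constant.

import numpy as np

# Sketch: constant step-size SGD on f(x) = (1/n) * sum_i 0.5*(a_i^T x - b_i)^2.
# The system A x = b is consistent (b = A x_star exactly), so every f_i'
# vanishes at the solution and the strong growth condition
# ||f_i'(x)|| <= B ||f'(x)|| is satisfied.
rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.standard_normal((n, d))
x_star = rng.standard_normal(d)
b = A @ x_star                              # consistent: all f_i share x_star

step = 0.5 / np.max(np.sum(A**2, axis=1))   # heuristic constant step size
x = np.zeros(d)
for k in range(20000):
    i = rng.integers(n)                     # sample one component f_i
    grad_i = (A[i] @ x - b[i]) * A[i]       # f_i'(x)
    x -= step * grad_i

print("distance to minimizer:", np.linalg.norm(x - x_star))

On this interpolation-style problem the iterates approach $x^\star$ at a linear rate despite the constant step size; with an inconsistent system (noise at the optimum), the same method would only converge to a neighborhood of the solution.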

Dates and versions

hal-00855113, version 1 (28-08-2013)

Cite

Mark Schmidt, Nicolas Le Roux. Fast Convergence of Stochastic Gradient Descent under a Strong Growth Condition. 2013. ⟨hal-00855113⟩