Preprint, 2019

Beyond Least-Squares: Fast Rates for Regularized Empirical Risk Minimization through Self-Concordance

Abstract

We consider learning methods based on the regularization of a convex empirical risk by a squared Hilbertian norm, a setting that includes linear predictors and non-linear predictors through positive-definite kernels. In order to go beyond the generic analysis leading to convergence rates of the excess risk as $O(1/\sqrt{n})$ from $n$ observations, we assume that the individual losses are self-concordant, that is, their third-order derivatives are bounded by their second-order derivatives. This setting includes least-squares, as well as all generalized linear models such as logistic and softmax regression. For this class of losses, we provide a bias-variance decomposition and show that the assumptions commonly made in least-squares regression, such as the source and capacity conditions, can be adapted to obtain fast non-asymptotic rates of convergence by improving the bias terms, the variance terms, or both.
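
To make the setting concrete, here is an illustrative formulation (the notation below is ours, not quoted from the paper). The regularized estimator minimizes the penalized empirical risk

$$\hat{f}_\lambda \in \arg\min_{f \in \mathcal{H}} \; \frac{1}{n} \sum_{i=1}^{n} \ell\bigl(y_i, f(x_i)\bigr) + \frac{\lambda}{2} \|f\|_{\mathcal{H}}^2,$$

where $\mathcal{H}$ is a Hilbert space of predictors and $\lambda > 0$ is the regularization parameter. The self-concordance assumption can be checked directly for the logistic loss: writing it through the softplus function $\varphi(t) = \log(1 + e^{t})$, so that $\ell(y, f(x)) = \varphi(-y\, f(x))$ for labels $y \in \{-1, 1\}$, and letting $\sigma(t) = 1/(1 + e^{-t})$, one has

$$\varphi''(t) = \sigma(t)\,\bigl(1 - \sigma(t)\bigr), \qquad \varphi'''(t) = \sigma(t)\,\bigl(1 - \sigma(t)\bigr)\,\bigl(1 - 2\sigma(t)\bigr),$$

so $|\varphi'''(t)| \le \varphi''(t)$ for all $t$: the third derivative is bounded by the second, as required.

For the linear-predictor instance of this setting, a minimal numerical sketch follows (a toy example under our own notation; the data, variable names, and choice of optimizer are illustrative, not the authors' code):

    import numpy as np
    from scipy.optimize import minimize

    def penalized_risk(w, X, y, lam):
        """Empirical logistic risk of the linear predictor x -> <w, x>,
        plus the squared-norm penalty (lam / 2) * ||w||^2."""
        margins = -y * (X @ w)
        losses = np.logaddexp(0.0, margins)  # numerically stable log(1 + e^t)
        return losses.mean() + 0.5 * lam * np.dot(w, w)

    # Illustrative synthetic data: n = 200 observations in dimension 5.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = np.where(X @ rng.normal(size=5) + 0.5 * rng.normal(size=200) >= 0, 1.0, -1.0)

    result = minimize(penalized_risk, np.zeros(5), args=(X, y, 0.1), method="L-BFGS-B")
    print(result.x)  # the regularized empirical risk minimizer

Since the objective is convex and smooth, any off-the-shelf solver recovers the unique minimizer; the paper's contribution is the statistical analysis of this estimator, not the optimization.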

Dates and versions

hal-02011895, version 1 (08-02-2019)
hal-02011895, version 2 (13-02-2019)
hal-02011895, version 3 (17-06-2019)

Identifiers

HAL Id: hal-02011895

Cite

Ulysse Marteau-Ferey, Dmitrii M. Ostrovskii, Francis Bach, Alessandro Rudi. Beyond Least-Squares: Fast Rates for Regularized Empirical Risk Minimization through Self-Concordance. 2019. ⟨hal-02011895v3⟩