Implicit Bias of Gradient Descent for Wide Two-layer Neural Networks Trained with the Logistic Loss

Conference paper, 2020

Abstract

Neural networks trained to minimize the logistic (a.k.a. cross-entropy) loss with gradient-based methods are observed to perform well in many supervised classification tasks. Towards understanding this phenomenon, we analyze the training and generalization behavior of infinitely wide two-layer neural networks with homogeneous activations. We show that the limits of the gradient flow on exponentially tailed losses can be fully characterized as a max-margin classifier in a certain non-Hilbertian space of functions. In the presence of hidden low-dimensional structures, the resulting margin is independent of the ambient dimension, which leads to strong generalization bounds. In contrast, training only the output layer implicitly solves a kernel support vector machine, which a priori does not enjoy such adaptivity. Our analysis of training is non-quantitative in terms of running time, but we prove computational guarantees in simplified settings by showing equivalences with online mirror descent. Finally, numerical experiments suggest that our analysis describes well the practical behavior of two-layer neural networks with ReLU activation and confirm the statistical benefits of this implicit bias.
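For orientation, the max-margin characterization described in the abstract can be stated as follows. This is a simplified reconstruction: the symbols f_t, F_1, and phi below are our notation, not necessarily the paper's.

```latex
% Hedged sketch of the implicit-bias statement (reconstructed notation).
% f_t: network function along the gradient flow at time t.
% F_1: non-Hilbertian "variation-norm" space of infinite-width networks,
%      i.e. functions of the form f(x) = \int \phi(\theta, x) \, d\mu(\theta).
\[
  \frac{f_t}{\|f_t\|_{\mathcal{F}_1}}
  \;\longrightarrow\;
  f^\star \in \operatorname*{arg\,max}_{\|f\|_{\mathcal{F}_1} \le 1}
  \; \min_{1 \le i \le n} y_i \, f(x_i)
  \qquad \text{as } t \to \infty .
\]
```

The kind of numerical experiment mentioned in the abstract can be mimicked with a minimal sketch: full-batch gradient descent on the logistic loss for a wide two-layer ReLU network, on synthetic data whose labels depend on a single coordinate (a hidden low-dimensional structure). All dimensions, step sizes, and scalings below are illustrative assumptions, not the paper's exact experimental setup.

```python
import numpy as np
from scipy.special import expit  # numerically stable sigmoid

rng = np.random.default_rng(0)

# Synthetic binary task with hidden low-dimensional structure: labels
# depend only on the first coordinate, while the ambient dimension is d = 10.
n, d, m = 200, 10, 1000
X = rng.standard_normal((n, d))
y = np.sign(X[:, 0])

# Wide two-layer ReLU network, both layers trained, small initialization.
W = rng.standard_normal((m, d)) / np.sqrt(d)   # hidden-layer weights
a = 1e-2 * rng.standard_normal(m)              # output-layer weights

lr = 2.0 / m  # step size scaled down with the width for stability
for step in range(20001):
    Z = X @ W.T                    # pre-activations, shape (n, m)
    H = np.maximum(Z, 0.0)         # ReLU activations
    margins = y * (H @ a)          # y_i * f(x_i)
    # dL/df for the logistic loss L = mean_i log(1 + exp(-y_i f(x_i)))
    g = -y * expit(-margins) / n
    grad_a = H.T @ g
    grad_W = ((g[:, None] * a[None, :]) * (Z > 0)).T @ X
    a -= lr * grad_a
    W -= lr * grad_W
    if step % 5000 == 0:
        # Path norm sum_j |a_j| * ||w_j||, a proxy for the F1-type norm
        # under which the limiting classifier maximizes the margin.
        path_norm = np.sum(np.abs(a) * np.linalg.norm(W, axis=1))
        loss = np.logaddexp(0.0, -margins).mean()
        print(f"step {step:6d}  loss {loss:.4f}  "
              f"normalized margin {margins.min() / path_norm:+.4f}")
```

As training proceeds, the loss decreases toward zero and the margin normalized by the path norm should slowly increase, consistent with the max-margin implicit bias stated above.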
Main file: chizat20Main.pdf (1.18 MB). Origin: files produced by the author(s).

Dates and versions

hal-02473847 , version 1 (11-02-2020)
hal-02473847 , version 2 (20-02-2020)
hal-02473847 , version 3 (03-03-2020)
hal-02473847 , version 4 (19-06-2020)

Identifiers

HAL Id: hal-02473847

Cite

Lenaic Chizat, Francis Bach. Implicit Bias of Gradient Descent for Wide Two-layer Neural Networks Trained with the Logistic Loss. COLT 2020 - 33rd Annual Conference on Learning Theory, Jul 2020, Graz / Virtual, Austria. pp.1305-1338. ⟨hal-02473847v4⟩