Conference paper, 2019

On the Inductive Bias of Neural Tangent Kernels

Abstract

State-of-the-art neural networks are heavily over-parameterized, making the optimization algorithm a crucial ingredient for learning predictive models with good generalization properties. A recent line of work has shown that in a certain over-parameterized regime, the learning dynamics of gradient descent are governed by a kernel obtained at initialization, called the neural tangent kernel. We study the inductive bias of learning in such a regime by analyzing this kernel and the corresponding function space (RKHS). In particular, we study smoothness, approximation, and stability properties of functions with finite norm, including stability to image deformations in the case of convolutional networks, and compare to other known kernels for similar architectures.
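For context, the neural tangent kernel mentioned in the abstract is standardly defined (following Jacot et al., 2018) as the inner product of the network's parameter gradients taken at initialization; the sketch below uses this standard formulation and notation that is not taken from the paper page itself.

```latex
% Standard NTK definition (Jacot et al., 2018), stated here only for context.
% f(x; \theta) denotes the scalar network output and \theta_0 the parameters at initialization.
\[
K_{\mathrm{NTK}}(x, x') \;=\; \big\langle \nabla_\theta f(x; \theta_0),\; \nabla_\theta f(x'; \theta_0) \big\rangle
\]
```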
Main file: main.pdf (363.7 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-02144221 , version 1 (29-05-2019)
hal-02144221 , version 2 (25-10-2019)

Identifiers

hal-02144221

Cite

Alberto Bietti, Julien Mairal. On the Inductive Bias of Neural Tangent Kernels. NeurIPS 2019 - Thirty-third Conference on Neural Information Processing Systems, Dec 2019, Vancouver, Canada. pp.12873-12884. ⟨hal-02144221v2⟩
410 Views
1077 Downloads
