Preprint / Working Paper, Year: 2022

Non-Vacuous Generalisation Bounds for Shallow Neural Networks

Abstract

We focus on a specific class of shallow neural networks with a single hidden layer, namely those with $L_2$-normalised data and either a sigmoid-shaped Gaussian error function ("erf") activation or a Gaussian Error Linear Unit (GELU) activation. For these networks, we derive new generalisation bounds through the PAC-Bayesian theory; unlike most existing such bounds, they apply to neural networks with deterministic rather than randomised parameters. Our bounds are empirically non-vacuous when the network is trained with vanilla stochastic gradient descent on MNIST and Fashion-MNIST.
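For orientation, here is a minimal NumPy sketch of the network class the abstract describes: a single hidden layer acting on $L_2$-normalised inputs, with either an erf or a GELU activation. The parameter names (U, v), shapes, and initialisation scales are illustrative assumptions, not the paper's exact parameterisation, and the sketch does not implement the PAC-Bayesian bounds themselves.

```python
import numpy as np
from scipy.special import erf

def l2_normalise(x):
    # Project each input onto the unit L2 sphere, as assumed in the abstract.
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

def gelu(z):
    # Exact GELU: z * Phi(z), where Phi is the standard normal CDF.
    return 0.5 * z * (1.0 + erf(z / np.sqrt(2.0)))

def shallow_net(x, U, v, activation=erf):
    # Single-hidden-layer network: L2-normalise the inputs, apply the
    # hidden layer U with an erf (or GELU) activation, then a linear
    # readout v. U and v are illustrative names, not the paper's notation.
    h = activation(l2_normalise(x) @ U.T)   # (n, d) -> (n, k)
    return h @ v                            # (n, k) -> (n,) scores

# Hypothetical usage on MNIST-sized inputs (d = 784):
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 784))              # four fake flattened images
U = 0.1 * rng.normal(size=(128, 784))      # hidden-layer weights
v = 0.1 * rng.normal(size=(128,))          # output weights
print(shallow_net(x, U, v))                # erf activation
print(shallow_net(x, U, v, activation=gelu))
```

Normalising inside the forward pass is just one way to make the abstract's $L_2$-normalised-data assumption explicit; the choice of activation (erf or GELU) selects between the two network variants the paper covers.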
Main file: 2202.01627.pdf (447.46 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03557415, version 1 (14-02-2022)

Identifiers

Cite

Felix Biggs, Benjamin Guedj. Non-Vacuous Generalisation Bounds for Shallow Neural Networks. 2022. ⟨hal-03557415⟩