Journal article in Journal of Machine Learning for Modeling and Computing, 2022

Leveraging Local Variation in Data: Sampling and Weighting Schemes for Supervised Deep Learning

Abstract

In the context of supervised learning of a function by a neural network, we claim and empirically verify that the neural network yields better results when the distribution of the training data set focuses on regions where the function to learn is steep. We first translate this assumption into a mathematically workable form using a Taylor expansion and derive a new training distribution based on the derivatives of the function to learn. Theoretical derivations then allow us to construct a methodology that we call variance based samples weighting (VBSW). VBSW uses the local variance of the labels to weight the training points. This methodology is general, scalable, cost-effective, and significantly improves the performance of a large class of neural networks on various classification and regression tasks involving image, text, and multivariate data. We highlight its benefits with experiments on neural networks ranging from linear models to ResNet and BERT.
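The abstract only sketches the method; the following is a minimal illustrative sketch of the weighting idea, assuming the local variance of the labels is estimated over the k nearest neighbours of each training point. The helper name, the choice of k, the normalization, and the use of scikit-learn are assumptions made for illustration, not the paper's exact implementation.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def local_variance_weights(x, y, k=10, eps=1e-12):
    """Illustrative sketch: weight each training point by the variance of the
    labels among its k nearest neighbours in input space (hypothetical helper,
    not the authors' reference code)."""
    nn = NearestNeighbors(n_neighbors=k).fit(x)
    _, idx = nn.kneighbors(x)           # (n, k) indices of neighbours
    local_var = y[idx].var(axis=1)      # label variance in each neighbourhood
    if local_var.ndim > 1:              # multi-dimensional labels: sum over components
        local_var = local_var.sum(axis=1)
    return local_var / (local_var.sum() + eps)  # normalised sample weights

# Example usage with a weighted loss, e.g. in Keras:
#   weights = len(x_train) * local_variance_weights(x_train, y_train)
#   model.fit(x_train, y_train, sample_weight=weights)
```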
Main file
VBSW_JMLMC.pdf (910.45 KB)
Origin: files produced by the author(s)

Dates and versions

hal-02885827 , version 1 (01-07-2020)
hal-02885827 , version 2 (19-01-2021)
hal-02885827 , version 3 (28-01-2021)
hal-02885827 , version 4 (27-09-2022)

Identifiers

Cite

Paul Novello, Gaël Poëtte, David Lugato, Pietro Marco Congedo. Leveraging Local Variation in Data: Sampling and Weighting Schemes for Supervised Deep Learning. Journal of Machine Learning for Modeling and Computing, 2022, 3 (1), ⟨10.1615/JMachLearnModelComput.2022041819⟩. ⟨hal-02885827v4⟩