Preprints, Working Papers. Year: 2022

On the symmetries in the dynamics of wide two-layer neural networks

Abstract

We consider the idealized setting of gradient flow on the population risk for infinitely wide two-layer ReLU neural networks (without bias), and study the effect of symmetries on the learned parameters and predictors. We first describe a general class of symmetries which, when satisfied by the target function $f^*$ and the input distribution, are preserved by the dynamics. We then study more specific cases. When $f^*$ is odd, we show that the dynamics of the predictor reduces to that of a (non-linearly parameterized) linear predictor, and its exponential convergence can be guaranteed. When $f^*$ has a low-dimensional structure, we prove that the gradient flow PDE reduces to a lower-dimensional PDE. Furthermore, we present informal and numerical arguments that suggest that the input neurons align with the lower-dimensional structure of the problem.
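
For orientation, here is a minimal sketch of the setting in standard mean-field notation (chosen for illustration; the paper's own conventions may differ). The infinitely wide network is described by a probability measure $\mu$ over neuron parameters $(a, w)$, with predictor $f_\mu(x) = \int a\,\sigma(\langle w, x \rangle)\,\mathrm{d}\mu(a, w)$, where $\sigma(u) = \max(u, 0)$ is the ReLU. Gradient flow on the population risk $R(\mu) = \frac{1}{2}\,\mathbb{E}_{x \sim \rho}\big[(f_\mu(x) - f^*(x))^2\big]$, with $\rho$ the input distribution, corresponds in the Wasserstein geometry to the PDE $\partial_t \mu_t = \mathrm{div}\big(\mu_t \nabla \Phi_{\mu_t}\big)$, where $\Phi_\mu(a, w) = \mathbb{E}_{x \sim \rho}\big[(f_\mu(x) - f^*(x))\,a\,\sigma(\langle w, x \rangle)\big]$ is the first variation of $R$ at $\mu$. The reduction for odd $f^*$ can be anticipated from the ReLU identity $\sigma(u) - \sigma(-u) = u$: the odd part of any such predictor is $\frac{1}{2}\big[f_\mu(x) - f_\mu(-x)\big] = \big\langle \frac{1}{2}\int a\,w\,\mathrm{d}\mu(a, w),\, x \big\rangle$, which is linear in $x$ even though its parameterization through $\mu$ is non-linear.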
Main file: learning_features_HAL_v2.pdf (890.38 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03829400 , version 1 (15-11-2022)
hal-03829400 , version 2 (24-11-2022)
hal-03829400 , version 3 (06-02-2023)
hal-03829400 , version 4 (08-02-2023)

Identifiers

HAL Id: hal-03829400

Cite

Karl Hajjar, Lénaïc Chizat. On the symmetries in the dynamics of wide two-layer neural networks. 2022. ⟨hal-03829400v3⟩
188 views
58 downloads
