Universality of the mean-field equations of networks of Hopfield-like neurons
Abstract
We revisit the problem of characterising the mean-field limit of a network of Hopfield-like neurons. Building on the previous work of [13, 1, 14] and [9], we establish, for a large class of networks of Hopfield-like (i.e. rate) neurons, the mean-field equations on a time interval [0, T], T > 0, of the thermodynamic limit of these networks, i.e. the limit as the number of neurons goes to infinity. Unlike all previous work, except [9], we do not assume that the synaptic weights describing the connections between the neurons are i.i.d. zero-mean Gaussians. The limit equations are stochastic and are described very simply in terms of two functions, a "correlation" function denoted $K_Q(t,s)$ and a "mean" function denoted $m_Q(t)$. The "noise" part of the equations is a linear function of the Brownian motion, obtained by solving a Volterra equation of the second kind whose resolvent kernel is expressed as a function of $K_Q$. We give a constructive proof of the uniqueness of the limit equations, and we use the corresponding algorithm for an effective computation of the functions $K_Q$ and $m_Q$, given the distribution of the weights. Several numerical experiments are reported.
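As an illustration of the kind of computation involved, here is a minimal numerical sketch of solving a generic Volterra equation of the second kind, x(t) = f(t) + \int_0^t K(t,s) x(s) ds, by trapezoidal time stepping. The kernel and forcing term below are hypothetical placeholders for illustration only; they are not the paper's resolvent kernel or its algorithm for $K_Q$ and $m_Q$.

```python
import numpy as np

def solve_volterra_second_kind(f, kernel, t_grid):
    """Solve x(t) = f(t) + int_0^t K(t, s) x(s) ds on a uniform grid
    using the trapezoidal rule (simple direct time-stepping scheme)."""
    n = len(t_grid)
    h = t_grid[1] - t_grid[0]
    x = np.zeros(n)
    x[0] = f(t_grid[0])
    for i in range(1, n):
        # trapezoidal weights: h/2 at the endpoints, h in between
        acc = 0.5 * h * kernel(t_grid[i], t_grid[0]) * x[0]
        for j in range(1, i):
            acc += h * kernel(t_grid[i], t_grid[j]) * x[j]
        # implicit endpoint term: x_i = f(t_i) + acc + (h/2) K(t_i, t_i) x_i
        x[i] = (f(t_grid[i]) + acc) / (1.0 - 0.5 * h * kernel(t_grid[i], t_grid[i]))
    return x

# Hypothetical example: exponential kernel and constant forcing, used
# only to exercise the solver, not taken from the paper.
t = np.linspace(0.0, 1.0, 201)
x = solve_volterra_second_kind(lambda t: 1.0, lambda t, s: np.exp(-(t - s)), t)
```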