Other publication Neural Computation Year: 2007

A mathematical analysis of the effects of Hebbian learning rules on the dynamics and structure of discrete-time random recurrent neural networks

Abstract

The analysis of learning recurrent neural networks is challenging because neuron activity and learning dynamics are mutually coupled: neuron activity depends on the synaptic weight network, which itself varies non-trivially under the influence of neuron activity. Understanding this interwoven evolution demands suitable theoretical tools. In this article, we present a mathematical analysis of the effects of Hebbian learning in random recurrent neural networks. Using theoretical tools from dynamical systems and graph theory, we study a generic "Hebb-like" learning rule that can include passive forgetting and different time scales for neuron activity and learning dynamics. We first show that the classical structural statistics from the so-called "complex networks" field (degree distribution, mean shortest path, clustering index, modularity) do not provide useful insights for the characterization of the coupling between neuron dynamics and network evolution. Instead, this coupling can be analyzed more efficiently through the study of Jacobian matrices, which introduce both a structural and a dynamical point of view on the neural network evolution. In this way, we show that "Hebb-like" learning leads to a reduction of the complexity of the dynamics, manifested by a systematic decay of the largest Lyapunov exponent. This effect is caused by a contraction of the spectral radius of the Jacobian matrices, induced either by passive forgetting or by saturation of the neurons. As a consequence, learning drives the system from chaos to a steady state through a sequence of bifurcations. We show that the network's sensitivity to the input pattern is maximal at the "edge of chaos". We also emphasize the role of feedback circuits in the Jacobian matrices and the link to cooperative systems.
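To make the setting concrete, below is a minimal, self-contained Python sketch of the kind of system the abstract describes: a discrete-time random recurrent network x(t+1) = f(W x(t)) coupled to a generic Hebb-like rule with passive forgetting, here taken as W ← λW + (α/N) x xᵀ. The exact form of the rule, the tanh transfer function, and all parameter values are illustrative assumptions, not the paper's specification. The script tracks the spectral radius of the Jacobian J_ij = f'(u_i) W_ij and a finite-time estimate of the largest Lyapunov exponent (via tangent-vector renormalization), the two quantities whose decay the abstract highlights.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

N = 100          # network size (illustrative)
g = 5.0          # initial synaptic gain, large enough for chaotic dynamics
alpha = 0.01     # Hebbian learning rate (illustrative)
lam = 0.99       # passive-forgetting factor, 0 < lam <= 1
T = 2000         # number of discrete time steps

def f(u):
    # sigmoidal transfer function (assumed; the paper uses a generic sigmoid)
    return np.tanh(u)

def df(u):
    # derivative of tanh, used in the Jacobian
    return 1.0 - np.tanh(u) ** 2

# random recurrent weights, scaled so the initial spectral radius is about g
W = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
x = rng.uniform(-1.0, 1.0, size=N)

# tangent vector for a finite-time estimate of the largest Lyapunov exponent
v = rng.normal(size=N)
v /= np.linalg.norm(v)
log_growth = 0.0

radii = []
for t in range(T):
    u = W @ x                    # local fields at time t
    J = df(u)[:, None] * W       # Jacobian of x -> f(W x): J_ij = f'(u_i) W_ij
    x = f(u)                     # neuron update

    # generic "Hebb-like" rule with passive forgetting (illustrative form):
    # W <- lam * W + (alpha / N) * x x^T
    W = lam * W + (alpha / N) * np.outer(x, x)

    # propagate and renormalize the tangent vector to accumulate log-growth
    v = J @ v
    norm = np.linalg.norm(v)
    log_growth += np.log(norm)
    v /= norm

    if t % 50 == 0:
        radii.append(np.max(np.abs(np.linalg.eigvals(J))))

print(f"Jacobian spectral radius: start {radii[0]:.3f}, end {radii[-1]:.3f}")
print(f"finite-time largest Lyapunov exponent: {log_growth / T:.3f}")
```

With λ < 1, passive forgetting contracts W geometrically, so the spectral radius of the Jacobian and the Lyapunov estimate shrink over time and the dynamics settle from chaos toward a steady state, consistent with the qualitative picture given in the abstract.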
Main file
Sirietal_2007.pdf (445.49 KB)
Origin: Files produced by the author(s)

Dates and versions

inria-00149181 , version 1 (24-05-2007)
inria-00149181 , version 2 (07-04-2008)

Identifiers

  • HAL Id: inria-00149181, version 1
  • arXiv: 0705.3690

Cite

Benoit Siri, Hugues Berry, Bruno Cessac, Bruno Delord, Mathias Quoy. A mathematical analysis of the effects of Hebbian learning rules on the dynamics and structure of discrete-time random recurrent neural networks. 2007. ⟨inria-00149181v1⟩
526 views
347 downloads
