Topological and Dynamical Complexity of Random Neural Networks
Abstract
Random neural networks are dynamical models of randomly interconnected neural units. These networks exhibit a phase transition to chaos as a disorder parameter is increased. The microscopic mechanisms underlying this phase transition are unknown and, as in spin glasses, are expected to be fundamentally related to the behavior of the system. In this Letter we investigate the explosion of complexity arising near that phase transition. We show that the mean number of equilibria undergoes a sharp transition from a single equilibrium to a very large number scaling exponentially with the dimension of the system. Near criticality, we compute the exponential rate of divergence, called topological complexity. Strikingly, we show that it behaves exactly as the maximal Lyapunov exponent, a classical measure of dynamical complexity. This relationship uncovers a microscopic mechanism leading to chaos, which we further demonstrate in a simpler class of disordered systems, suggesting a deep and underexplored link between topological and dynamical complexity.
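For concreteness, a minimal sketch of the class of systems referred to, assuming the standard formulation with i.i.d. Gaussian couplings and a sigmoidal transfer function $S$ (this specific form is an assumption, as the abstract does not spell out the model):
\[
\dot{x}_i(t) = -x_i(t) + \sum_{j=1}^{n} J_{ij}\, S\!\big(x_j(t)\big),
\qquad J_{ij} \sim \mathcal{N}\!\big(0,\sigma^2/n\big)\ \text{i.i.d.},
\]
where $\sigma$ plays the role of the disorder parameter. In this notation, the topological complexity is the exponential rate $C(\sigma)$ at which the mean number of equilibria grows, i.e. $\mathbb{E}[\#\,\text{equilibria}] \sim e^{\,n\,C(\sigma)}$, while the dynamical complexity is quantified by the maximal Lyapunov exponent $\lambda_{\max} = \lim_{t\to\infty} t^{-1}\log\big(\|\delta x(t)\|/\|\delta x(0)\|\big)$ for an infinitesimal perturbation $\delta x$.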