Evolving reservoir weights in the frequency domain
Abstract
Reservoir Computing models are a class of recurrent neural networks that have received considerable recent attention, most notably their main family, Echo State Networks (ESNs). These models have a large number of hidden-to-hidden weights (in the so-called reservoir) forming a recurrent topology. The reservoir is randomly connected and its weights are kept fixed during learning: only the readout parameters (from the reservoir to the output neurons) are trained, while the reservoir weights remain frozen after initialization. Since the reservoir structure does not change during learning, only its initialization affects the model's performance. In this work, we introduce an evolutionary method for adjusting the non-null reservoir weights. The evolutionary search operates in the frequency domain given by a Fourier transformation of the weights, and it is combined with supervised learning of the readout weights. The resulting algorithm, called EvoESN (Evolutionary ESN), obtains promising results on two well-known benchmark problems from the chaotic time-series domain.
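To make the described pipeline concrete, the following is a minimal sketch of the idea (not the authors' reference implementation): the reservoir's non-null weights are encoded as frequency-domain coefficients, a simple evolutionary loop searches that coefficient space, and for each candidate the readout is trained by ridge regression. The reservoir size, connectivity, the DCT-based encoding, the (1+4) evolution strategy, and names such as `n_coeffs` and `sigma` are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from scipy.fft import idct

rng = np.random.default_rng(0)

# --- fixed reservoir topology: random sparse mask; only the weight values evolve ---
N = 100                                   # reservoir size (assumption)
mask = rng.random((N, N)) < 0.1           # ~10% connectivity (assumption)
nz = np.argwhere(mask)                    # positions of the non-null weights
n_weights = len(nz)
n_coeffs = 64                             # evolve only low-frequency coefficients

W_in = rng.uniform(-0.5, 0.5, size=(N, 1))

def coeffs_to_reservoir(c, spectral_radius=0.9):
    """Inverse-transform the coefficients to weight values, place them on the
    fixed sparsity mask, and rescale to a target spectral radius (common ESN practice)."""
    full = np.zeros(n_weights)
    full[:n_coeffs] = c
    w = idct(full, norm="ortho")
    W = np.zeros((N, N))
    W[nz[:, 0], nz[:, 1]] = w
    rho = max(abs(np.linalg.eigvals(W)))
    return W * (spectral_radius / rho) if rho > 0 else W

def esn_states(W, u):
    """Run the reservoir over an input sequence u of shape [T, 1]."""
    x = np.zeros(N)
    X = np.zeros((len(u), N))
    for t, ut in enumerate(u):
        x = np.tanh(W @ x + W_in @ ut)
        X[t] = x
    return X

def fitness(c, u, y, washout=100, ridge=1e-6):
    """Train the readout by ridge regression and return the prediction MSE."""
    X = esn_states(coeffs_to_reservoir(c), u)[washout:]
    Y = y[washout:]
    W_out = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ Y)
    return np.mean((X @ W_out - Y) ** 2)

# --- toy task: one-step-ahead prediction of a sine wave (placeholder data) ---
t = np.arange(1000)
series = np.sin(0.2 * t)
u, y = series[:-1, None], series[1:, None]

# --- simple (1+4) evolution strategy over the frequency-domain coefficients ---
parent = rng.normal(0, 1, n_coeffs)
parent_fit = fitness(parent, u, y)
sigma = 0.3
for gen in range(20):
    children = [parent + sigma * rng.normal(0, 1, n_coeffs) for _ in range(4)]
    fits = [fitness(c, u, y) for c in children]
    best = int(np.argmin(fits))
    if fits[best] < parent_fit:
        parent, parent_fit = children[best], fits[best]
    print(f"gen {gen:2d}  best MSE {parent_fit:.3e}")
```

Searching over a reduced set of frequency coefficients rather than over the individual weights keeps the evolutionary search space small while still reshaping every non-null connection through the inverse transform; this is the design choice the sketch is meant to illustrate, with all hyperparameters chosen arbitrarily.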