Latency-based probabilistic information processing in a learning feedback hierarchy
Abstract
In this article, we study a three-layer neural hierarchy composed of bi-directionally connected recurrent layers, trained to perform a synthetic object recognition task. The main feature of this network is its ability to represent, transmit, and fuse probabilistic information, and thus to make near-optimal decisions when inputs are contradictory, noisy, or missing. This is achieved by a neural space-latency code that arises naturally from the simple recurrent dynamics in each layer. Furthermore, the network possesses a feedback mechanism that is compatible with the space-latency code, exploiting the attractor properties of the neural layers. We show that this feedback mechanism can resolve or correct ambiguities at lower levels. Because the fusion of feedback information in each layer is performed in a probabilistically coherent fashion, feedback only has an effect when low-level inputs are ambiguous.
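The abstract does not give implementation details, but the two central ideas it names, a latency code in which stronger evidence produces earlier responses, and probabilistically coherent fusion of bottom-up and feedback information, can be illustrated with a minimal, hypothetical Python sketch. All function names, parameters, and the specific fusion rule (an elementwise product of distributions followed by renormalization, i.e. a Bayesian-style combination) are illustrative assumptions, not the authors' code.

```python
import numpy as np

def latency_code(evidence, threshold=1.0, dt=0.01, t_max=5.0):
    """Integrate each unit's evidence toward a shared threshold.

    Stronger evidence -> earlier threshold crossing, so the first-crossing
    latency of a unit encodes how probable its preferred hypothesis is.
    Returns one latency per unit (np.inf if never crossed within t_max).
    """
    n = len(evidence)
    activation = np.zeros(n)
    latencies = np.full(n, np.inf)
    t = 0.0
    while t < t_max and np.isinf(latencies).any():
        activation += evidence * dt                    # simple linear integration
        newly = (activation >= threshold) & np.isinf(latencies)
        latencies[newly] = t
        t += dt
    return latencies

def fuse(bottom_up, feedback):
    """Probabilistically coherent fusion: multiply the two distributions
    and renormalize. If bottom_up is sharply peaked (unambiguous), the
    feedback barely moves the result; if bottom_up is near-uniform
    (ambiguous), feedback resolves it."""
    p = bottom_up * feedback
    return p / p.sum()

# Unambiguous bottom-up input: feedback has almost no effect.
bu_clear = np.array([0.9, 0.05, 0.05])
fb = np.array([0.2, 0.6, 0.2])
print(fuse(bu_clear, fb))                  # still dominated by class 0

# Ambiguous bottom-up input: feedback resolves the ambiguity.
bu_ambiguous = np.array([0.4, 0.4, 0.2])
fused = fuse(bu_ambiguous, fb)
print(fused)                               # class 1 now wins
print(latency_code(fused))                 # the winner crosses threshold first
```

In this toy reading, the product-and-renormalize rule captures why feedback only matters for ambiguous inputs: a sharply peaked bottom-up distribution dominates the product, while a flat one lets the feedback term decide, and the resulting certainty is then re-expressed as response latency.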
Domains
Machine Learning [cs.LG]
Origin: Files produced by the author(s)