Conference Paper, 2016

Reservoir Computing for Robot Language Acquisition

Xavier Hinaut

Abstract

How do children learn language? Could we use robots to model children's language acquisition? This question is linked to a more general issue: how does the brain associate sequences of symbols with internal symbolic or sub-symbolic representations? I will present a Recurrent Neural Network (RNN), namely an Echo State Network (ESN) or Reservoir, that performs sentence comprehension and can be used for Human-Robot Interaction (HRI). The RNN is trained to map sentence structures to meanings (i.e., predicates). The model has interesting capabilities; for instance, it can learn to "understand" French and English at the same time. Moreover, it is flexible and can be trained on different kinds of output predicate representations. The objective of this model is twofold: to improve HRI and to provide neural models of language acquisition. From the HRI point of view, the model (1) gains adaptability, because the system is trained on corpus examples and no parser needs to be predefined for each language; (2) processes natural language sentences rather than stereotyped commands (e.g., "put cup left"); and (3) generalizes to sentence structures that are absent from the training data set. From the computational neuroscience and developmental robotics point of view, the aim of this architecture is to model and test hypotheses about how children learn language (Tomasello 2003).
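The abstract rests on the standard reservoir-computing recipe: a fixed random recurrent network provides a rich temporal expansion of the input word sequence, and only a linear readout is trained to map reservoir states to the target meaning. The sketch below illustrates that general setup in plain NumPy; it is not the author's actual model, and the input coding, dimensions, spectral radius, leak rate, and toy sentence/predicate data are all illustrative assumptions.

```python
# Minimal Echo State Network (ESN) sketch in NumPy. This is NOT the model
# from the talk: the input coding, all dimensions and hyper-parameters, and
# the toy sentence/target data below are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

n_inputs = 50          # assumed size of the word-code input vectors
n_reservoir = 300      # assumed reservoir size
n_outputs = 20         # assumed size of the predicate (meaning) vector
spectral_radius = 1.2  # assumed scaling of the recurrent weights
leak_rate = 0.1        # assumed leaky-integration rate
ridge = 1e-6           # assumed ridge-regression regularization

# Input and recurrent weights are random and stay fixed; only the readout
# (W_out) is learned. This is the defining property of reservoir computing.
W_in = rng.uniform(-1.0, 1.0, (n_reservoir, n_inputs))
W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))

def run_reservoir(inputs):
    """Feed a sequence of input vectors through the leaky-integrator
    reservoir and return the state at every time step."""
    x = np.zeros(n_reservoir)
    states = []
    for u in inputs:
        x = (1.0 - leak_rate) * x + leak_rate * np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy corpus: random binary "sentences" paired with a constant target
# (meaning) vector per sentence, repeated at every time step. Fabricated
# purely so the example runs end to end.
sentences = [rng.integers(0, 2, (int(T), n_inputs)).astype(float)
             for T in rng.integers(5, 12, size=30)]
targets = [np.tile(rng.uniform(-1.0, 1.0, n_outputs), (len(s), 1))
           for s in sentences]

X = np.vstack([run_reservoir(s) for s in sentences])  # all reservoir states
Y = np.vstack(targets)                                # matching targets

# Closed-form ridge regression for the readout weights.
W_out = Y.T @ X @ np.linalg.inv(X.T @ X + ridge * np.eye(n_reservoir))

# At test time, the readout applied to the final reservoir state of a new
# "sentence" yields the predicted meaning vector.
test_sentence = rng.integers(0, 2, (8, n_inputs)).astype(float)
meaning = W_out @ run_reservoir(test_sentence)[-1]
print(meaning.shape)  # -> (20,)
```

In the architecture described above, the inputs would encode the words of a sentence and the readout would deliver predicate representations of its meaning; the sketch keeps only the core mechanics, namely driving a fixed reservoir and training a linear readout by ridge regression.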
No file deposited

Dates and versions

hal-01417683, version 1 (15-12-2016)

Identifiers

  • HAL Id: hal-01417683, version 1

Cite

Xavier Hinaut. Reservoir Computing for Robot Language Acquisition. IROS Workshop on Machine Learning Methods for High-Level Cognitive Capabilities in Robotics, Oct 2016, Daejeon, South Korea. ⟨hal-01417683⟩