Conference paper, Year: 2022

Long Short-Term Memory of Language Models for Predicting Brain Activation During Listening to Stories

Subba Reddy Oota
Frederic Alexandre
Xavier Hinaut

Abstract

Several popular sequence-based and pretrained language models have been found to be successful for text-driven prediction of brain activations. However, these models still lack long-term memory plausibility (i.e., how they deal with long-term dependencies and contextual information) as well as insights into the underlying neural substrate mechanisms. This paper studies the influence of context representations of different language models: sequence-based models (Long Short-Term Memory networks (LSTMs) and ELMo) and a pretrained Transformer language model (Longformer). In particular, we study how the internal hidden representations align with the brain activity observed via fMRI when subjects listen to several narrative stories. We use these brain imaging recordings of subjects listening to narrative stories to interpret word and sequence embeddings, and we further investigate how the representations of different language model layers capture semantic context during listening. Experiments across all language model representations provide the following cognitive insights: (i) the representations of LSTM cell states are better aligned with brain recordings than LSTM hidden states, and cell state activity can represent more long-term information; (ii) the representations of ELMo and Longformer display good predictive performance across brain regions for listening stimuli; (iii) the Posterior Medial Cortex (PMC), Temporo-Parieto-Occipital junction (TPOJ), and Dorsal Frontal Lobe (DFL) show higher correlations than the Early Auditory Cortex (EAC) and Auditory Association Cortex (AAC).
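The abstract describes aligning language-model representations with fMRI responses recorded during story listening. A minimal sketch of one common way to do this, a voxel-wise ridge-regression encoding model evaluated by per-voxel Pearson correlation, is given below. The feature and voxel dimensions, the synthetic data, and the regularization strength are illustrative assumptions, not the authors' exact pipeline.

```python
# Sketch of a voxel-wise encoding model: ridge regression from language-model
# features (e.g., LSTM cell states) to fMRI voxel responses, scored by
# cross-validated Pearson correlation. All data and sizes are placeholders.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
n_trs, n_features, n_voxels = 500, 1024, 2000   # assumed sizes
X = rng.standard_normal((n_trs, n_features))    # stand-in for hidden/cell states
Y = rng.standard_normal((n_trs, n_voxels))      # stand-in for fMRI responses

def voxelwise_correlation(y_true, y_pred):
    """Pearson correlation computed independently for each voxel (column)."""
    yt = y_true - y_true.mean(axis=0)
    yp = y_pred - y_pred.mean(axis=0)
    denom = np.linalg.norm(yt, axis=0) * np.linalg.norm(yp, axis=0)
    return (yt * yp).sum(axis=0) / np.maximum(denom, 1e-12)

scores = []
for train_idx, test_idx in KFold(n_splits=5).split(X):
    model = Ridge(alpha=1.0)                    # alpha chosen for illustration
    model.fit(X[train_idx], Y[train_idx])
    scores.append(voxelwise_correlation(Y[test_idx], model.predict(X[test_idx])))

print("mean voxel correlation:", float(np.mean(scores)))
```

With real data, the per-voxel correlations would then be compared across model layers and brain regions (e.g., PMC, TPOJ, DFL vs. EAC, AAC), which is the kind of comparison the abstract reports.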
Main file
OotaAlexandreHinaut2022_CogSci_HAL-v1.pdf (1023.25 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03697443 , version 1 (16-06-2022)

Identifiers

  • HAL Id: hal-03697443, version 1

Cite

Subba Reddy Oota, Frederic Alexandre, Xavier Hinaut. Long Short-Term Memory of Language Models for Predicting Brain Activation During Listening to Stories. CogSci 2022 - Cognitive Science Society, Jul 2022, Toronto, Canada. ⟨hal-03697443⟩
142 views
203 downloads