Conference Papers, Year: 2018

Pervasive Attention: 2D Convolutional Neural Networks for Sequence-to-Sequence Prediction

Abstract

Current state-of-the-art machine translation systems are based on encoder-decoder architectures that first encode the input sequence and then generate an output sequence based on the input encoding. Both are interfaced with an attention mechanism that recombines a fixed encoding of the source tokens based on the decoder state. We propose an alternative approach that instead relies on a single 2D convolutional neural network across both sequences. Each layer of our network re-codes source tokens on the basis of the output sequence produced so far. Attention-like properties are therefore pervasive throughout the network. Our model yields excellent results, outperforming state-of-the-art encoder-decoder systems, while being conceptually simpler and having fewer parameters.
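
To make the abstract's description concrete, the PyTorch sketch below illustrates the general idea: source and target embeddings are combined into a 2D grid over all (target position, source position) pairs, a stack of convolutions masked along the target axis re-codes this grid layer by layer, and max-pooling over the source dimension yields one feature vector per target position for next-token prediction. The class name PervasiveAttention2D, the plain ReLU convolution stack (the paper itself uses DenseNet-style blocks), and all layer sizes are illustrative assumptions, not the authors' exact implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class PervasiveAttention2D(nn.Module):
    # Illustrative sketch only; hyperparameters and layer choices are assumptions.
    def __init__(self, src_vocab, tgt_vocab, emb=128, hidden=128, n_layers=4, k=3):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb)
        self.convs = nn.ModuleList(
            [nn.Conv2d(2 * emb if i == 0 else hidden, hidden, k) for i in range(n_layers)]
        )
        self.out = nn.Linear(hidden, tgt_vocab)
        self.k = k

    def forward(self, src, tgt):
        # src: (B, S) source token ids; tgt: (B, T) target tokens produced so far.
        B, S = src.shape
        T = tgt.shape[1]
        es = self.src_emb(src)                       # (B, S, E)
        et = self.tgt_emb(tgt)                       # (B, T, E)
        # 2D grid over (target position i, source position j): concat e_t[i] and e_s[j].
        grid = torch.cat([et.unsqueeze(2).expand(-1, -1, S, -1),
                          es.unsqueeze(1).expand(-1, T, -1, -1)], dim=-1)  # (B, T, S, 2E)
        h = grid.permute(0, 3, 1, 2)                 # (B, 2E, T, S) for Conv2d
        for conv in self.convs:
            # Pad asymmetrically along the target dim so row i never sees rows > i
            # (masked/causal convolution), and symmetrically along the source dim.
            h = F.pad(h, (self.k // 2, self.k // 2, self.k - 1, 0))
            h = F.relu(conv(h))                      # (B, H, T, S)
        # Aggregate over the source dimension (max-pooling is one of the paper's options)
        # to obtain one feature vector per target position.
        pooled = h.max(dim=3).values                 # (B, H, T)
        return self.out(pooled.transpose(1, 2))      # (B, T, tgt_vocab) next-token logits

For example, model(torch.randint(0, 1000, (2, 12)), torch.randint(0, 1000, (2, 9))) returns a (2, 9, tgt_vocab) tensor of logits, one distribution per target position conditioned on the full source and only on earlier target tokens, which is how each layer "re-codes source tokens on the basis of the output sequence produced so far."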
Origin: Files produced by the author(s)

Dates and versions

hal-01851612, version 1 (10-08-2018)
hal-01851612, version 2 (03-09-2018)
hal-01851612, version 3 (04-09-2018)

Identifiers

  • HAL Id: hal-01851612, version 3

Cite

Maha Elbayad, Laurent Besacier, Jakob Verbeek. Pervasive Attention: 2D Convolutional Neural Networks for Sequence-to-Sequence Prediction. CoNLL 2018 - Conference on Computational Natural Language Learning, Oct 2018, Brussels, Belgium. pp.97-107. ⟨hal-01851612v3⟩
