Conference paper, 2019

What does BERT learn about the structure of language?

Ganesh Jawahar, Benoît Sagot, Djamé Seddah
BERT is a recent language representation model that has performed surprisingly well on diverse language understanding benchmarks. This result suggests that BERT networks capture structural information about language. In this work, we provide novel support for this claim by performing a series of experiments to unpack the elements of English language structure learned by BERT. We first show that BERT's phrasal representation captures phrase-level information in the lower layers. We also show that BERT's intermediate layers encode a rich hierarchy of linguistic information, with surface features at the bottom, syntactic features in the middle, and semantic features at the top. BERT turns out to need deeper layers when long-distance dependency information is involved, e.g., to track subject-verb agreement. Finally, we show that BERT representations capture linguistic information in a compositional way that mimics classical, tree-like structures.
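As a rough illustration of the starting point for such layer-wise analyses, the sketch below (not the authors' code) extracts BERT's per-layer hidden states, the raw material on which probing classifiers for surface, syntactic, and semantic features would be trained. It assumes the Hugging Face transformers library and the bert-base-uncased checkpoint; the example sentence is illustrative only.

```python
# Minimal sketch (assumed setup, not the paper's implementation):
# extract per-layer hidden states from BERT for probing experiments.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased",
                                  output_hidden_states=True)
model.eval()

# Illustrative sentence with a long-distance subject-verb dependency
# ("keys ... are"), the kind of agreement the abstract mentions.
sentence = "The keys to the cabinet are on the table."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# outputs.hidden_states is a tuple of 13 tensors (embedding layer plus
# 12 transformer layers), each of shape (batch, seq_len, 768). A probing
# classifier trained on each layer's representations can test which
# layers encode which kind of linguistic information.
for layer, h in enumerate(outputs.hidden_states):
    print(f"layer {layer}: {tuple(h.shape)}")
```

In such a setup, the per-layer accuracies of the probing classifiers are what reveal the surface-to-semantic hierarchy across depth.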
Main file: intbert_acl19paper-3.pdf (502.6 KB). Origin: files produced by the author(s).

Dates and versions

HAL Id: hal-02131630, version 1 (04-06-2019)

Ganesh Jawahar, Benoît Sagot, Djamé Seddah. What does BERT learn about the structure of language? ACL 2019 - 57th Annual Meeting of the Association for Computational Linguistics, Jul 2019, Florence, Italy. ⟨hal-02131630⟩
5517 views, 14799 downloads

