Conference Paper, 2016

Learning Connective-based Word Representations for Implicit Discourse Relation Identification

Chloé Braud
Pascal Denis

Abstract

We introduce a simple semi-supervised approach to improve implicit discourse relation identification. This approach harnesses large amounts of automatically extracted discourse connectives along with their arguments to construct new distributional word representations. Specifically, we represent words in the space of discourse connectives as a way to directly encode their rhetorical function. Experiments on the Penn Discourse Treebank demonstrate the effectiveness of these task-tailored representations in predicting implicit discourse relations. Our results indeed show that, despite their simplicity, these connective-based representations outperform various off-the-shelf word embeddings, and achieve state-of-the-art performance on this problem.
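As a rough illustration of the idea described in the abstract, the sketch below builds count-based word vectors whose dimensions are explicit discourse connectives, assuming (argument, connective) pairs have already been extracted from raw text. This is not the authors' code: the toy data, the helper names, and the PPMI weighting are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (not the authors' code) of connective-based word
# representations: each word is represented by how often it co-occurs,
# inside an argument span, with each explicit discourse connective.
from collections import Counter, defaultdict
import numpy as np

# Hypothetical extracted data: (argument tokens, connective) pairs.
# In the paper, such pairs are extracted automatically at large scale.
extracted = [
    (["the", "market", "fell"], "because"),
    (["profits", "rose"], "but"),
    (["the", "market", "fell"], "although"),
    (["profits", "rose"], "because"),
]

connectives = sorted({c for _, c in extracted})
conn_index = {c: i for i, c in enumerate(connectives)}

# Count word/connective co-occurrences.
counts = defaultdict(Counter)
for tokens, conn in extracted:
    for w in tokens:
        counts[w][conn] += 1

vocab = sorted(counts)
M = np.zeros((len(vocab), len(connectives)))
for i, w in enumerate(vocab):
    for conn, c in counts[w].items():
        M[i, conn_index[conn]] = c

# Optional PPMI weighting, a common choice for count-based representations
# (an assumption here, not necessarily the weighting used in the paper).
total = M.sum()
pw = M.sum(axis=1, keepdims=True) / total
pc = M.sum(axis=0, keepdims=True) / total
with np.errstate(divide="ignore", invalid="ignore"):
    pmi = np.log((M / total) / (pw * pc))
ppmi = np.where(np.isfinite(pmi) & (pmi > 0), pmi, 0.0)

# Each row of `ppmi` is a word vector in the space of connectives.
print(dict(zip(vocab, ppmi.round(2).tolist())))
```

Such vectors can then be plugged into a standard classifier over the two arguments of an implicit relation, in place of generic off-the-shelf embeddings.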
Main file: emnlp16.pdf (214.31 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-01397318, version 1 (15-12-2016)

Identifiers

  • HAL Id: hal-01397318, version 1

Cite

Chloé Braud, Pascal Denis. Learning Connective-based Word Representations for Implicit Discourse Relation Identification. Conference on Empirical Methods in Natural Language Processing (EMNLP), Nov 2016, Austin, United States. ⟨hal-01397318⟩