Augmenting Context Representation with Triggers Knowledge for Relation Extraction
Abstract
Relation Extraction (RE) requires a model to classify the correct relation from a set of candidate relations given a sentence and two entities. Recent work mainly studies how to exploit more data or incorporate extra contextual information, especially with Pre-trained Language Models (PLMs). However, these models still struggle to avoid being misled by irrelevant or distracting words. In this paper, we propose a novel model to alleviate this deficiency. Specifically, our model automatically mines the triggers of the sentence in an iterative fashion, using the sentence together with the output of the previous iteration, and augments the semantics of the context representation from BERT with both the entity pair and the triggers. We conduct extensive experiments to evaluate the proposed model and obtain clear empirical improvements on TACRED.
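To make the augmentation step concrete, the sketch below shows one plausible way to fuse a BERT context vector with entity-pair and trigger representations for relation scoring. This is a minimal illustration under our own assumptions, not the authors' released implementation; all module and variable names (e.g. TriggerAugmentedRE, trigger_vecs) are hypothetical.

```python
# A minimal sketch (not the paper's official code) of fusing a BERT context
# vector with entity-pair and trigger representations for relation scoring.
# All names and the fusion scheme (concatenation + linear layer) are assumptions.
import torch
import torch.nn as nn

class TriggerAugmentedRE(nn.Module):
    def __init__(self, hidden_size: int = 768, num_relations: int = 42):
        super().__init__()
        # Projects the concatenated [context; head; tail; trigger] vector
        # into the relation label space (TACRED has 42 labels incl. no_relation).
        self.classifier = nn.Linear(hidden_size * 4, num_relations)

    def forward(self, context_vec, head_vec, tail_vec, trigger_vecs):
        # trigger_vecs: (batch, num_triggers, hidden) -- mean-pool the mined
        # trigger token representations into one trigger summary vector.
        trigger_vec = trigger_vecs.mean(dim=1)
        fused = torch.cat([context_vec, head_vec, tail_vec, trigger_vec], dim=-1)
        return self.classifier(fused)

# Usage with dummy tensors standing in for BERT outputs:
model = TriggerAugmentedRE()
batch, hidden = 2, 768
logits = model(torch.randn(batch, hidden), torch.randn(batch, hidden),
               torch.randn(batch, hidden), torch.randn(batch, 3, hidden))
print(logits.shape)  # torch.Size([2, 42])
```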