Entity Structure Within and Throughout: Modeling Mention Dependencies for Document-Level Relation Extraction

20 Feb 2021  ·  Benfeng Xu, Quan Wang, Yajuan Lyu, Yong Zhu, Zhendong Mao ·

Entities, as the essential elements of relation extraction tasks, exhibit a certain structure. In this work, we formulate this structure as distinctive dependencies between mention pairs. We then propose SSAN, which incorporates these structural dependencies within the standard self-attention mechanism and throughout the overall encoding stage. Specifically, we design two alternative transformation modules inside each self-attention building block that produce attentive biases to adaptively regularize the attention flow. Our experiments demonstrate the usefulness of the proposed entity structure and the effectiveness of SSAN. It significantly outperforms competitive baselines, achieving new state-of-the-art results on three popular document-level relation extraction datasets. We further provide ablations and visualizations to show how the entity structure guides the model toward better relation extraction. Our code is publicly available.
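The core idea, structural dependencies injected as additive biases on self-attention scores, can be sketched in a few lines. This is a simplified illustration, not the paper's implementation: it uses a single head and a scalar bias looked up per dependency type (`struct_ids` and `bias_table` are hypothetical names), whereas SSAN computes the bias from the query/key vectors via its Biaffine or Decomposed transformation modules in every layer.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def structured_self_attention(Q, K, V, struct_ids, bias_table):
    """Single-head self-attention with additive structural biases.

    struct_ids[i, j] indexes the dependency type between tokens i and j
    (e.g. intra-sentence coreference vs. inter-sentence relation);
    bias_table maps each type id to a scalar added to the raw score,
    standing in for SSAN's learned transformation modules.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)             # (n, n) raw attention scores
    scores = scores + bias_table[struct_ids]  # regularize the attention flow
    attn = softmax(scores, axis=-1)           # rows sum to 1
    return attn @ V, attn
```

A positive bias for a dependency type raises attention between mention pairs of that type; in SSAN the biases are produced layer-by-layer rather than from a fixed table, so the regularization adapts to context.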


Results from the Paper


| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Relation Extraction | CDR | SSANBiaffine | F1 | 68.7 | #7 |
| Relation Extraction | DocRED | SSAN-RoBERTa-large+Adaptation | F1 | 65.92 | #3 |
| Relation Extraction | DocRED | SSAN-RoBERTa-large+Adaptation | Ign F1 | 63.78 | #3 |
| Relation Extraction | DocRED | SSAN-RoBERTa-large | F1 | 61.42 | #22 |
| Relation Extraction | DocRED | SSAN-RoBERTa-large | Ign F1 | 59.47 | #23 |
| Relation Extraction | DocRED | SSAN-RoBERTa-base | F1 | 59.94 | #33 |
| Relation Extraction | DocRED | SSAN-RoBERTa-base | Ign F1 | 57.71 | #34 |
| Relation Extraction | DocRED | SSAN-BERT-base | F1 | 58.16 | #44 |
| Relation Extraction | DocRED | SSAN-BERT-base | Ign F1 | 55.84 | #44 |
| Relation Extraction | GDA | SSANBiaffine | F1 | 83.9 | #7 |
