Interactively-Propagative Attention Learning for Implicit Discourse Relation Recognition
We tackle implicit discourse relation recognition. Both self-attention and interactive-attention mechanisms have been applied to attention-aware representation learning, improving current discourse analysis models. To take advantage of the two attention mechanisms simultaneously, we develop a propagative attention learning model using a cross-coupled two-channel network. We experiment on the Penn Discourse Treebank. The test results demonstrate that our model yields substantial improvements over the baselines (BiLSTM and BERT).
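The abstract does not spell out the layer itself, but the described idea, each of two channels (one per discourse argument) combining self-attention over its own states with interactive attention over the other channel's states, then propagating the fused result, can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: the class name `CrossCoupledAttentionLayer`, the tanh fusion, and the assumption of BiLSTM/BERT hidden states as inputs are all hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def attention(query, key, value):
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V
    d = query.size(-1)
    scores = torch.matmul(query, key.transpose(-2, -1)) / d ** 0.5
    return torch.matmul(F.softmax(scores, dim=-1), value)


class CrossCoupledAttentionLayer(nn.Module):
    """One propagation step of a cross-coupled two-channel network (sketch).

    Channel 1 carries hidden states of Arg1, channel 2 those of Arg2
    (e.g., from a BiLSTM or BERT encoder). Each channel applies
    self-attention to itself and interactive attention to the other
    channel, then fuses both views before propagating. Hypothetical
    sketch of the idea, not the paper's exact layer.
    """

    def __init__(self, dim):
        super().__init__()
        self.fuse1 = nn.Linear(2 * dim, dim)
        self.fuse2 = nn.Linear(2 * dim, dim)

    def forward(self, h1, h2):
        # h1: (batch, len1, dim) Arg1 states; h2: (batch, len2, dim) Arg2 states
        self1 = attention(h1, h1, h1)    # self-attention within Arg1
        inter1 = attention(h1, h2, h2)   # interactive attention: Arg1 attends to Arg2
        self2 = attention(h2, h2, h2)    # self-attention within Arg2
        inter2 = attention(h2, h1, h1)   # interactive attention: Arg2 attends to Arg1
        # Cross-coupling: each channel propagates a fusion of both views,
        # so stacking layers lets the two attention signals interact.
        out1 = torch.tanh(self.fuse1(torch.cat([self1, inter1], dim=-1)))
        out2 = torch.tanh(self.fuse2(torch.cat([self2, inter2], dim=-1)))
        return out1, out2
```

Stacking several such layers would propagate attention information back and forth between the two argument channels, which is one plausible reading of "propagative attention learning".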