Multi-Task Attentive Residual Networks for Argument Mining

24 Feb 2021 · Andrea Galassi, Marco Lippi, Paolo Torroni

We explore the use of residual networks and neural attention for argument mining, and in particular for link prediction. The method we propose makes no assumptions about document or argument structure. We propose a residual architecture that exploits attention, multi-task learning, and ensemble learning. We evaluate it on a challenging dataset of user-generated comments, as well as on two other datasets consisting of scientific publications. On the user-generated content dataset, our model outperforms state-of-the-art methods that rely on domain knowledge. On the scientific literature datasets, it achieves results comparable to those yielded by BERT-based approaches, but with a much smaller model size.
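The ingredients named in the abstract can be sketched as follows: attention pooling to summarize a component's tokens, a residual block as the shared encoder, and two task heads trained jointly (multi-task learning). This is a minimal illustrative sketch, not the paper's actual ResAttArg architecture; all dimensions, parameter names, and the number of classes are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 16  # hypothetical embedding size

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(tokens, w):
    """Collapse a component's token embeddings (T, D) into one (D,)
    vector, weighting tokens by a learned scoring vector w (D,)."""
    weights = softmax(tokens @ w)  # (T,) attention weights
    return weights @ tokens        # (D,) weighted sum

def residual_block(x, W1, W2):
    """x + f(x): the skip connection lets the block learn a refinement
    of its input instead of a full transformation."""
    return x + relu(x @ W1) @ W2

# Hypothetical parameters (randomly initialized, untrained).
w_att = rng.normal(size=D)
W1 = rng.normal(size=(D, D)) * 0.1
W2 = rng.normal(size=(D, D)) * 0.1
W_link = rng.normal(size=(2 * D, 2)) * 0.1  # link / no-link head
W_type = rng.normal(size=(D, 5)) * 0.1      # component-type head

def forward(src_tokens, tgt_tokens):
    # Shared encoder: attention pooling, then a residual block.
    src = residual_block(attention_pool(src_tokens, w_att), W1, W2)
    tgt = residual_block(attention_pool(tgt_tokens, w_att), W1, W2)
    # Task 1: does a link src -> tgt exist?
    p_link = softmax(np.concatenate([src, tgt]) @ W_link)
    # Task 2: classify the source component; training both heads
    # jointly gives the shared encoder two learning signals.
    p_type = softmax(src @ W_type)
    return p_link, p_type

p_link, p_type = forward(rng.normal(size=(7, D)), rng.normal(size=(4, D)))
print(p_link.shape, p_type.shape)  # (2,) (5,)
```

An ensemble, as mentioned in the abstract, would average the predictions of several such models trained independently (e.g. with different random seeds).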


Results from the Paper

| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|---|---|---|---|---|---|
| Relation Classification | AbstRCT - Neoplasm | ResAttArg | Macro F1 | 70.92 | 1 |
| Link Prediction | AbstRCT - Neoplasm | ResAttArg | F1 | 54.43 | 1 |
| Link Prediction | CDCP | ResAttArg | F1 | 29.73 | 1 |
| Relation Classification | CDCP | ResAttArg | Macro F1 | 42.95 | 1 |
| Component Classification | CDCP | ResAttArg | Macro F1 | 78.71 | 1 |
| Link Prediction | DRI Corpus | ResAttArg | F1 | 43.66 | 1 |
| Relation Classification | DRI Corpus | ResAttArg | Macro F1 | 37.72 | 1 |

