Contextualized Non-local Neural Networks for Sequence Learning

21 Nov 2018 · Pengfei Liu, Shuaichen Chang, Xuanjing Huang, Jian Tang, Jackie Chi Kit Cheung

Recently, a large number of neural mechanisms and models have been proposed for sequence learning, among which self-attention, as exemplified by the Transformer model, and graph neural networks (GNNs) have attracted much attention. In this paper, we propose an approach that combines these two methods and draws on their complementary strengths. Specifically, we propose contextualized non-local neural networks (CN$^{\textbf{3}}$), which can both dynamically construct a task-specific structure for a sentence and leverage rich local dependencies within a particular neighborhood. Experimental results on ten NLP tasks spanning text classification, semantic matching, and sequence labeling show that the proposed model outperforms competitive baselines and discovers task-specific dependency structures, thus providing better interpretability to users.
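
The abstract only sketches the idea of pairing global, self-attention-style aggregation with local neighborhood dependencies. Below is a minimal PyTorch sketch of that general combination; the class name `ContextualizedNonLocalBlock`, the depthwise convolution for the local neighborhood, the gating scheme, and the window size are illustrative assumptions and not the paper's exact formulation.

```python
# Minimal sketch: mix non-local (attention over all positions) and local
# (windowed convolution) features with a learned gate. Illustrative only.
import torch
import torch.nn as nn


class ContextualizedNonLocalBlock(nn.Module):
    """Combines global self-attention with a local windowed convolution."""

    def __init__(self, d_model: int, window: int = 3):
        super().__init__()
        self.query = nn.Linear(d_model, d_model)
        self.key = nn.Linear(d_model, d_model)
        self.value = nn.Linear(d_model, d_model)
        # Depthwise conv captures dependencies inside a small neighborhood.
        self.local = nn.Conv1d(d_model, d_model, kernel_size=window,
                               padding=window // 2, groups=d_model)
        self.gate = nn.Linear(2 * d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        q, k, v = self.query(x), self.key(x), self.value(x)
        attn = torch.softmax(q @ k.transpose(1, 2) / q.size(-1) ** 0.5, dim=-1)
        non_local = attn @ v                                    # global structure
        local = self.local(x.transpose(1, 2)).transpose(1, 2)   # neighborhood features
        g = torch.sigmoid(self.gate(torch.cat([non_local, local], dim=-1)))
        return g * non_local + (1 - g) * local                  # gated combination


if __name__ == "__main__":
    block = ContextualizedNonLocalBlock(d_model=64)
    out = block(torch.randn(2, 10, 64))
    print(out.shape)  # torch.Size([2, 10, 64])
```

The gate lets each position decide how much to rely on globally attended context versus its immediate neighborhood, which is one plausible way to realize the "complementary strengths" the abstract describes.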
