Information Aggregation via Dynamic Routing for Sequence Encoding

While much progress has been made in how to encode a text sequence into a sequence of vectors, less attention has been paid to how to aggregate these preceding vectors (outputs of RNN/CNN) into a fixed-size encoding vector. Usually, a simple max or average pooling is used, which is a bottom-up and passive way of aggregation and lacks guidance from task information. In this paper, we propose an aggregation mechanism to obtain a fixed-size encoding with a dynamic routing policy. The dynamic routing policy dynamically decides what and how much information needs to be transferred from each word to the final encoding of the text sequence. Following the work on Capsule Networks, we design two dynamic routing policies to aggregate the outputs of the RNN/CNN encoding layer into a final encoding vector. Compared to other aggregation methods, dynamic routing can refine the messages according to the state of the final encoding vector. Experimental results on five text classification tasks show that our method outperforms other aggregation models by a significant margin. Related source code is released on our GitHub page.
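To make the aggregation step concrete, below is a minimal NumPy sketch of capsule-style dynamic routing over a sequence of hidden states. It is an illustration of the general routing-by-agreement recipe, not the authors' released DR-AGG implementation: the per-capsule transform `W`, the settings `n_capsules` and `n_iters`, and the `reverse` flag (which only mirrors the axis over which routing weights are normalized, loosely echoing the standard/reverse variants in the results below) are assumptions made for this example.

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    # Capsule non-linearity: shrinks short vectors toward zero,
    # keeps long vectors just below unit length.
    norm_sq = np.sum(s ** 2, axis=axis, keepdims=True)
    return (norm_sq / (1.0 + norm_sq)) * s / np.sqrt(norm_sq + eps)

def softmax(x, axis):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def dynamic_routing_aggregate(h, n_capsules=4, n_iters=3, reverse=False, seed=0):
    """Aggregate hidden states h of shape (seq_len, d) into one fixed-size vector.

    reverse=False normalizes routing logits over capsules (each word splits its
    message across capsules); reverse=True normalizes over words (each capsule
    decides how much to take from every word).
    """
    seq_len, d = h.shape
    rng = np.random.default_rng(seed)
    # Illustrative per-capsule transform of each word's hidden state.
    W = rng.normal(scale=0.1, size=(n_capsules, d, d))
    u_hat = np.einsum("kde,te->tkd", W, h)          # messages, (seq_len, n_capsules, d)

    b = np.zeros((seq_len, n_capsules))             # routing logits
    for _ in range(n_iters):
        c = softmax(b, axis=0 if reverse else 1)    # coupling coefficients
        s = np.einsum("tk,tkd->kd", c, u_hat)       # weighted sum of messages
        v = squash(s)                               # output capsules, (n_capsules, d)
        b = b + np.einsum("tkd,kd->tk", u_hat, v)   # agreement refines the routing
    return v.reshape(-1)                            # fixed-size encoding

# Usage: aggregate 20 BiLSTM/CNN outputs of dimension 64 into one 256-d vector.
encoding = dynamic_routing_aggregate(np.random.randn(20, 64))
print(encoding.shape)  # (256,)
```

The key property motivating the paper is visible in the loop: the routing weights are refined iteratively using the current state of the output capsules, rather than being fixed a priori as in max or average pooling.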


Results from the Paper


| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|------|---------|-------|-------------|--------------|-------------|
| Sentiment Analysis | IMDb | Standard DR-AGG | Accuracy | 45.1 | #44 |
| Sentiment Analysis | IMDb | Reverse DR-AGG | Accuracy | 44.5 | #45 |
| Sentiment Analysis | SST-2 Binary classification | Reverse DR-AGG | Accuracy | 87.2 | #73 |
| Sentiment Analysis | SST-2 Binary classification | Standard DR-AGG | Accuracy | 87.6 | #71 |
