Enhancing Sentence Embedding with Generalized Pooling

COLING 2018 · Qian Chen, Zhen-Hua Ling, Xiaodan Zhu

Pooling is an essential component of a wide variety of sentence representation and embedding models. This paper explores generalized pooling methods to enhance sentence embedding. We propose vector-based multi-head attention that includes the widely used max pooling, mean pooling, and scalar self-attention as special cases. The model benefits from properly designed penalization terms to reduce redundancy in multi-head attention. We evaluate the proposed model on three different tasks: natural language inference (NLI), author profiling, and sentiment classification. The experiments show that the proposed model achieves significant improvement over strong sentence-encoding-based methods, resulting in state-of-the-art performance on four datasets. The proposed approach can be easily applied to problems beyond those discussed in this paper.
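
The abstract above is the only method description on this page, so the following is a minimal sketch of what vector-based multi-head attention pooling could look like, written in PyTorch. The class name VectorMultiHeadPooling, the per-head two-layer MLP, and the hyperparameters (num_heads, attn_dim) are illustrative assumptions, not the authors' released code; the redundancy penalty shown follows the ||AAᵀ − I||_F style of Lin et al. (2017) and may differ from the paper's exact penalization terms.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class VectorMultiHeadPooling(nn.Module):
    """Hedged sketch of vector-based multi-head attention pooling.

    Each head assigns every time step a *vector* of attention weights
    (one weight per hidden dimension), normalized over time with a
    softmax. Uniform weights reduce this to mean pooling; weights that
    concentrate on each dimension's maximal position approach max
    pooling; tying weights across dimensions recovers scalar
    self-attention -- the special cases named in the abstract.
    """

    def __init__(self, hidden_dim: int, num_heads: int = 5, attn_dim: int = 300):
        super().__init__()
        # One two-layer MLP per head, mapping each hidden state to a
        # per-dimension attention score: hidden_dim -> attn_dim -> hidden_dim.
        self.w1 = nn.ModuleList([nn.Linear(hidden_dim, attn_dim) for _ in range(num_heads)])
        self.w2 = nn.ModuleList([nn.Linear(attn_dim, hidden_dim) for _ in range(num_heads)])

    def forward(self, h: torch.Tensor, mask: torch.Tensor = None):
        # h: (batch, seq_len, hidden_dim) BiLSTM hidden states
        # mask: optional (batch, seq_len) bool, True for real tokens
        heads, attentions = [], []
        for w1, w2 in zip(self.w1, self.w2):
            scores = w2(torch.relu(w1(h)))            # (batch, seq_len, hidden_dim)
            if mask is not None:
                scores = scores.masked_fill(~mask.unsqueeze(-1), float("-inf"))
            a = torch.softmax(scores, dim=1)          # normalize over time, per dimension
            heads.append((a * h).sum(dim=1))          # weighted sum: (batch, hidden_dim)
            attentions.append(a)
        # Sentence embedding: concatenation of all head outputs.
        return torch.cat(heads, dim=-1), torch.stack(attentions, dim=1)


def head_disagreement_penalty(attentions: torch.Tensor) -> torch.Tensor:
    """One plausible redundancy penalty: penalize cosine overlap between
    the attention maps of different heads (Lin et al., 2017 style)."""
    b, k, n, d = attentions.shape
    flat = F.normalize(attentions.reshape(b, k, n * d), dim=-1)
    gram = torch.bmm(flat, flat.transpose(1, 2))      # (batch, k, k) similarities
    eye = torch.eye(k, device=attentions.device)
    return ((gram - eye) ** 2).sum(dim=(1, 2)).mean()


if __name__ == "__main__":
    # Usage example: 600D BiLSTM outputs, as in the SNLI result below.
    pool = VectorMultiHeadPooling(hidden_dim=600, num_heads=5)
    h = torch.randn(8, 40, 600)
    emb, attn = pool(h)                               # emb: (8, 3000)
    penalty = head_disagreement_penalty(attn)         # added to the task loss
```

The penalty would typically be added to the task loss with a small coefficient so that different heads are pushed to attend to different parts of the sentence, which is the stated purpose of the paper's penalization terms.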

Results

| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Natural Language Inference | SNLI | 600D BiLSTM with generalized pooling | Test Accuracy (%) | 86.6 | #53 |
| Natural Language Inference | SNLI | 600D BiLSTM with generalized pooling | Train Accuracy (%) | 94.9 | #13 |
| Natural Language Inference | SNLI | 600D BiLSTM with generalized pooling | Parameters | 65M | #4 |
| Sentiment Analysis | Yelp Fine-grained classification | BiLSTM generalized pooling | Error | 33.45 | #10 |

Methods

BiLSTM, vector-based multi-head attention pooling (with max pooling, mean pooling, and scalar self-attention as special cases), penalization terms for reducing redundancy among attention heads