Dynamic Meta-Embeddings for Improved Sentence Representations

EMNLP 2018  ·  Douwe Kiela, Changhan Wang, Kyunghyun Cho

While one of the first steps in many NLP systems is selecting what pre-trained word embeddings to use, we argue that such a step is better left for neural networks to figure out by themselves. To that end, we introduce dynamic meta-embeddings, a simple yet effective method for the supervised learning of embedding ensembles, which leads to state-of-the-art performance within the same model class on a variety of tasks. We subsequently show how the technique can be used to shed new light on the usage of word embeddings in NLP systems.
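To make the mechanism concrete, here is a minimal PyTorch sketch of dynamic meta-embeddings as described in the abstract: each pretrained embedding set is projected into a shared space, and the projected vectors are combined per token with learned softmax attention weights. This is an illustrative reading of the method, not the authors' code; the class name `DynamicMetaEmbedding`, the default `proj_dim`, and the linear scoring function for the attention are assumptions made for the sketch.

```python
import torch
import torch.nn as nn


class DynamicMetaEmbedding(nn.Module):
    """Sketch of dynamic meta-embeddings: project each pretrained
    embedding set to a shared space, then combine the projections
    per token with learned softmax attention weights."""

    def __init__(self, embedding_dims, proj_dim=256):
        super().__init__()
        # One linear projection per pretrained embedding set.
        self.projections = nn.ModuleList(
            [nn.Linear(d, proj_dim) for d in embedding_dims]
        )
        # Scalar attention score per projected vector (shared across sets).
        self.attention = nn.Linear(proj_dim, 1)

    def forward(self, embeddings):
        # embeddings: list of tensors, each of shape (batch, seq_len, dim_i)
        projected = torch.stack(
            [proj(e) for proj, e in zip(self.projections, embeddings)],
            dim=2,
        )  # (batch, seq_len, n_sets, proj_dim)
        scores = self.attention(projected)      # (batch, seq_len, n_sets, 1)
        weights = torch.softmax(scores, dim=2)  # normalize over embedding sets
        return (weights * projected).sum(dim=2)  # (batch, seq_len, proj_dim)
```

A toy usage, with random tensors standing in for two pretrained embedding sets (e.g. GloVe and fastText lookups):

```python
glove = torch.randn(8, 20, 300)
fasttext = torch.randn(8, 20, 300)
dme = DynamicMetaEmbedding([300, 300])
out = dme([glove, fasttext])  # (8, 20, 256)
```

Because the attention weights are learned jointly with the downstream task, the model itself decides how much to rely on each embedding set per token, which is what lets the technique "shed new light" on embedding usage.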


Results from the Paper


| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|------|---------|-------|-------------|--------------|-------------|
| Natural Language Inference | SNLI | 512D Dynamic Meta-Embeddings | % Test Accuracy | 86.7 | # 52 |
| Natural Language Inference | SNLI | 512D Dynamic Meta-Embeddings | % Train Accuracy | 91.6 | # 34 |
| Natural Language Inference | SNLI | 512D Dynamic Meta-Embeddings | Parameters | 9M | # 4 |

Methods


No methods listed for this paper.