Compressing Word Embeddings via Deep Compositional Code Learning

ICLR 2018 · Raphael Shu, Hideki Nakayama

Natural language processing (NLP) models often require a massive number of parameters for word embeddings, resulting in a large storage or memory footprint. Deploying neural NLP models to mobile devices requires compressing the word embeddings without any significant sacrifice in performance. For this purpose, we propose to construct the embeddings from a small set of basis vectors. For each word, the composition of basis vectors is determined by a hash code. To maximize the compression rate, we adopt a multi-codebook quantization approach instead of a binary coding scheme: each code is composed of multiple discrete numbers, such as (3, 2, 1, 8), where the value of each component is limited to a fixed range. We propose to learn the discrete codes directly in an end-to-end neural network by applying the Gumbel-softmax trick. Experiments show that the compression rate reaches 98% in a sentiment analysis task and 94%–99% in machine translation tasks without performance loss. In both tasks, the proposed method can further improve model performance by slightly lowering the compression rate.
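The core idea is to replace a V × D embedding matrix with M small codebooks of K basis vectors each, representing every word by M discrete codes whose selection is learned with the Gumbel-softmax trick. Below is a minimal PyTorch sketch of that composition, not the authors' implementation: the class name and the hyperparameters (num_codebooks=8, codebook_size=16) are illustrative assumptions, and for brevity the relaxation is applied directly in the embedding layer, whereas the paper learns the codes by reconstructing pretrained embeddings with an autoencoder.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CompositionalEmbedding(nn.Module):
    """Composes each word vector from M codebooks of K basis vectors (sketch)."""

    def __init__(self, vocab_size, embed_dim, num_codebooks=8, codebook_size=16):
        super().__init__()
        # Per-word logits over the K entries of each of the M codebooks.
        self.code_logits = nn.Parameter(
            torch.randn(vocab_size, num_codebooks, codebook_size))
        # M shared codebooks, each holding K basis vectors of size embed_dim.
        self.codebooks = nn.Parameter(
            torch.randn(num_codebooks, codebook_size, embed_dim))

    def forward(self, word_ids, tau=1.0):
        logits = self.code_logits[word_ids]  # (batch, M, K)
        if self.training:
            # Straight-through Gumbel-softmax: one-hot samples in the forward
            # pass with a differentiable softmax surrogate in the backward pass,
            # so the discrete code selection can be trained end to end.
            one_hot = F.gumbel_softmax(logits, tau=tau, hard=True)
        else:
            # At inference, use the discrete argmax codes directly.
            one_hot = F.one_hot(logits.argmax(-1), logits.size(-1)).float()
        # Pick one basis vector per codebook and sum them into the embedding.
        return torch.einsum('bmk,mkd->bd', one_hot, self.codebooks)
```

After training, only the M argmax codes per word (M · log2 K bits, e.g. 8 × 4 = 32 bits per word in this sketch) plus the shared codebooks need to be stored, which is where the reported compression comes from.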

TASK                | DATASET                  | MODEL | METRIC     | VALUE | GLOBAL RANK
Machine Translation | IWSLT2015 German-English | DCCL  | BLEU score | 29.56 | #11
