Analogical reasoning is effective in capturing linguistic regularities.
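This regularity is typically probed with vector-offset analogies of the form "king - man + woman ≈ queen". Below is a minimal sketch of that test; the toy vectors are hand-picked so the analogy holds exactly, they are not trained embeddings.

```python
# Vector-offset analogy test over a toy embedding table.
# The vectors are illustrative stand-ins, not learned representations.
import numpy as np

embeddings = {
    "king":  np.array([0.8, 0.9, 0.1]),
    "queen": np.array([0.8, 0.1, 0.9]),
    "man":   np.array([0.2, 0.9, 0.1]),
    "woman": np.array([0.2, 0.1, 0.9]),
}

def analogy(a, b, c, emb):
    """Return the word d maximizing cos(d, b - a + c), excluding a, b, c."""
    target = emb[b] - emb[a] + emb[c]
    def cos(u, v):
        return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))
    candidates = {w: v for w, v in emb.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cos(candidates[w], target))

print(analogy("man", "king", "woman", embeddings))  # -> queen
```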
Chit-chat models are known to have several problems: they lack specificity, do not display a consistent personality and are often not very captivating.
Subword units are an effective way to alleviate the open vocabulary problems in neural machine translation (NMT).
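The standard subword scheme here is byte-pair encoding (BPE): starting from characters, repeatedly merge the most frequent pair of adjacent symbols. The sketch below follows the merge-learning loop given in Sennrich et al.'s paper; the toy vocabulary is illustrative.

```python
# Minimal BPE merge learning. Words are space-separated symbol sequences
# with an end-of-word marker </w>; the toy frequencies are illustrative.
import re
import collections

def get_stats(vocab):
    """Count frequencies of adjacent symbol pairs across the vocabulary."""
    pairs = collections.defaultdict(int)
    for word, freq in vocab.items():
        symbols = word.split()
        for i in range(len(symbols) - 1):
            pairs[symbols[i], symbols[i + 1]] += freq
    return pairs

def merge_vocab(pair, vocab):
    """Merge every occurrence of the given symbol pair into one symbol."""
    bigram = re.escape(' '.join(pair))
    pattern = re.compile(r'(?<!\S)' + bigram + r'(?!\S)')
    return {pattern.sub(''.join(pair), word): freq for word, freq in vocab.items()}

vocab = {'l o w </w>': 5, 'l o w e r </w>': 2,
         'n e w e s t </w>': 6, 'w i d e s t </w>': 3}
for _ in range(10):
    pairs = get_stats(vocab)
    best = max(pairs, key=pairs.get)
    vocab = merge_vocab(best, vocab)
    print(best)
```

At translation time the learned merges segment unseen words into known subwords, so the model never faces a truly out-of-vocabulary token.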
Inspired by how humans summarize long documents, we propose an accurate and fast summarization model that first selects salient sentences and then rewrites them abstractively (i.e., compresses and paraphrases) to generate a concise overall summary.
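A minimal sketch of that select-then-rewrite pipeline shape; `select_salient` and `rewrite` below are crude hypothetical stand-ins for the paper's learned extractor and abstractor networks, kept only to show how the two stages compose.

```python
# Two-stage summarization skeleton: extract salient sentences, then rewrite each.
# Both stages here are illustrative placeholders, not learned models.

def select_salient(sentences, k=2):
    """Stand-in extractor: sentence length as a crude salience proxy."""
    picked = set(sorted(sentences, key=len, reverse=True)[:k])
    return [s for s in sentences if s in picked]  # keep document order

def rewrite(sentence, max_words=12):
    """Stand-in abstractor: truncation as a placeholder for compression/paraphrase."""
    words = sentence.split()
    return ' '.join(words[:max_words]) + ('...' if len(words) > max_words else '')

def summarize(document):
    sentences = [s.strip() for s in document.split('.') if s.strip()]
    return '. '.join(rewrite(s) for s in select_salient(sentences)) + '.'

doc = ("The storm closed roads across the region. Crews worked overnight. "
       "Officials said power would be restored to most homes by Friday evening.")
print(summarize(doc))
```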
Inductive transfer learning has greatly impacted computer vision, but existing approaches in NLP still require task-specific modifications and training from scratch.
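One concrete ingredient of such transfer is discriminative fine-tuning: layer groups closer to the pretrained input get smaller learning rates than the freshly initialized task head. A minimal PyTorch sketch, assuming a small LSTM classifier whose embedding and encoder stand in for pretrained language-model weights:

```python
# Discriminative fine-tuning: per-layer-group learning rates.
# Model size, layer grouping, and base_lr are illustrative assumptions.
import torch
import torch.nn as nn

class Classifier(nn.Module):
    def __init__(self, vocab_size=10000, emb=128, hidden=256, classes=6):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb)             # pretrained in practice
        self.encoder = nn.LSTM(emb, hidden, batch_first=True)  # pretrained in practice
        self.head = nn.Linear(hidden, classes)                 # trained from scratch

    def forward(self, x):
        out, _ = self.encoder(self.embed(x))
        return self.head(out[:, -1])  # classify from the last hidden state

model = Classifier()
base_lr = 1e-3
# Each deeper group's lr is the next group's lr divided by 2.6,
# the factor suggested in the ULMFiT paper.
optimizer = torch.optim.Adam([
    {'params': model.embed.parameters(),   'lr': base_lr / 2.6 ** 2},
    {'params': model.encoder.parameters(), 'lr': base_lr / 2.6},
    {'params': model.head.parameters(),    'lr': base_lr},
])
```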
Recent work has managed to learn cross-lingual word embeddings without parallel data by mapping monolingual embeddings to a shared space through adversarial training.
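A minimal PyTorch sketch of that adversarial setup, with random vectors as stand-ins for the two monolingual embedding tables: a linear map W projects source vectors into the target space, a discriminator learns to tell mapped-source from target vectors, and W is trained to fool it. Dimensions, data, and hyperparameters are illustrative.

```python
# Adversarial alignment of two embedding spaces via a linear map W.
# src/tgt are random stand-ins for pretrained monolingual embeddings.
import torch
import torch.nn as nn

dim = 50
src = torch.randn(1000, dim)
tgt = torch.randn(1000, dim)

W = nn.Linear(dim, dim, bias=False)  # the mapping to learn
disc = nn.Sequential(nn.Linear(dim, 128), nn.ReLU(), nn.Linear(128, 1))
opt_w = torch.optim.SGD(W.parameters(), lr=0.1)
opt_d = torch.optim.SGD(disc.parameters(), lr=0.1)
bce = nn.BCEWithLogitsLoss()

for step in range(100):
    s = src[torch.randint(len(src), (32,))]
    t = tgt[torch.randint(len(tgt), (32,))]
    # Discriminator step: label mapped source 1, real target 0.
    opt_d.zero_grad()
    d_loss = (bce(disc(W(s).detach()), torch.ones(32, 1))
              + bce(disc(t), torch.zeros(32, 1)))
    d_loss.backward()
    opt_d.step()
    # Mapping step: train W so mapped source is classified as target (label 0).
    opt_w.zero_grad()
    w_loss = bce(disc(W(s)), torch.zeros(32, 1))
    w_loss.backward()
    opt_w.step()
```

In the published systems this adversarial stage is typically followed by a refinement step (e.g., Procrustes on high-confidence translation pairs), which the sketch omits.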