Analogical reasoning over word embeddings is effective at capturing linguistic regularities, e.g., vec(king) - vec(man) + vec(woman) ≈ vec(queen).
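A minimal sketch of analogy solving by vector arithmetic, using tiny hand-picked toy embeddings (hypothetical; real systems use trained vectors such as word2vec or GloVe):

```python
import numpy as np

# Hypothetical 3-d embeddings, chosen so the gender offset is consistent.
emb = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.1, 0.8]),
    "man":   np.array([0.5, 0.9, 0.0]),
    "woman": np.array([0.5, 0.2, 0.7]),
}

def analogy(a, b, c, emb):
    """Solve a : b :: c : ?  by nearest cosine neighbor of vec(b) - vec(a) + vec(c)."""
    target = emb[b] - emb[a] + emb[c]
    best, best_sim = None, -1.0
    for word, vec in emb.items():
        if word in (a, b, c):  # exclude the query words themselves
            continue
        sim = vec @ target / (np.linalg.norm(vec) * np.linalg.norm(target))
        if sim > best_sim:
            best, best_sim = word, sim
    return best

print(analogy("man", "woman", "king", emb))  # -> "queen"
```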
Chit-chat models are known to have several problems: they lack specificity, do not display a consistent personality and are often not very captivating.
Subword units are an effective way to alleviate the open-vocabulary problem in neural machine translation (NMT).
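One widely used subword scheme is byte-pair encoding (BPE), which repeatedly merges the most frequent adjacent symbol pair. A minimal sketch with an illustrative toy vocabulary (word frequencies and merge count are made up):

```python
import re, collections

def get_stats(vocab):
    """Count frequencies of adjacent symbol pairs across the vocabulary."""
    pairs = collections.Counter()
    for word, freq in vocab.items():
        symbols = word.split()
        for i in range(len(symbols) - 1):
            pairs[symbols[i], symbols[i + 1]] += freq
    return pairs

def merge_vocab(pair, vocab):
    """Replace every occurrence of the chosen pair with its merged symbol."""
    bigram = re.escape(" ".join(pair))
    pattern = re.compile(r"(?<!\S)" + bigram + r"(?!\S)")
    return {pattern.sub("".join(pair), word): freq for word, freq in vocab.items()}

# Words are stored as space-separated characters with an end-of-word marker.
vocab = {"l o w </w>": 5, "l o w e r </w>": 2,
         "n e w e s t </w>": 6, "w i d e s t </w>": 3}

for _ in range(10):  # the number of merges is a hyperparameter
    pairs = get_stats(vocab)
    if not pairs:
        break
    best = max(pairs, key=pairs.get)  # most frequent pair gets merged
    vocab = merge_vocab(best, vocab)
    print(best)
```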
We investigate a lattice-structured LSTM model for Chinese NER, which encodes a sequence of input characters as well as all potential words that match a lexicon.
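A sketch of the lattice-construction step alone, i.e., finding every character span that matches a lexicon word; the small lexicon below is illustrative, and the full model feeds these spans into word cells alongside the character LSTM:

```python
lexicon = {"南京", "南京市", "市长", "长江", "长江大桥", "大桥"}

def lattice_spans(chars, lexicon, max_len=4):
    """Return (start, end, word) for every lexicon word found in chars."""
    spans = []
    for i in range(len(chars)):
        for j in range(i + 1, min(i + max_len, len(chars)) + 1):
            word = "".join(chars[i:j])
            if word in lexicon:
                spans.append((i, j, word))
    return spans

sentence = list("南京市长江大桥")  # "Nanjing Yangtze River Bridge"
for start, end, word in lattice_spans(sentence, lexicon):
    print(start, end, word)
```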
Inspired by how humans summarize long documents, we propose an accurate and fast summarization model that first selects salient sentences and then rewrites them abstractively (i.e., compresses and paraphrases) to generate a concise overall summary.
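A rough sketch of the two-stage extract-then-rewrite pipeline under strong simplifications: salience here is a word-frequency heuristic and the rewriter is a stub, whereas the model above learns both stages end to end:

```python
import re
from collections import Counter

def tokens(text):
    return re.findall(r"\w+", text.lower())

def salience(sentence, doc_counts):
    """Average document-level frequency of the sentence's words."""
    words = tokens(sentence)
    return sum(doc_counts[w] for w in words) / max(len(words), 1)

def rewrite(sentence):
    # Stub: the full model compresses and paraphrases each selected
    # sentence with a trained seq2seq abstractor; we keep it verbatim.
    return sentence

def summarize(document, k=2):
    sentences = [s.strip() for s in document.split(".") if s.strip()]
    doc_counts = Counter(tokens(document))
    selected = sorted(sentences, key=lambda s: salience(s, doc_counts),
                      reverse=True)[:k]
    return ". ".join(rewrite(s) for s in selected) + "."
```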
Recent work has managed to learn cross-lingual word embeddings without parallel data by mapping monolingual embeddings to a shared space through adversarial training.
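A minimal sketch of the adversarial alignment idea: a linear map W sends source-language vectors into the target space while a discriminator tries to tell mapped-source from real target vectors. The random stand-in embeddings, dimensions, and hyperparameters are illustrative; real systems (e.g., MUSE) add an orthogonality constraint on W and a Procrustes refinement step:

```python
import torch
import torch.nn as nn

d = 50
src = torch.randn(1000, d)  # stand-in monolingual embeddings
tgt = torch.randn(1000, d)

W = nn.Linear(d, d, bias=False)  # the mapping to learn
D = nn.Sequential(nn.Linear(d, 128), nn.ReLU(), nn.Linear(128, 1))

opt_w = torch.optim.SGD(W.parameters(), lr=0.1)
opt_d = torch.optim.SGD(D.parameters(), lr=0.1)
bce = nn.BCEWithLogitsLoss()

for step in range(200):
    xs = src[torch.randint(0, len(src), (64,))]
    xt = tgt[torch.randint(0, len(tgt), (64,))]

    # Discriminator step: mapped source -> label 0, real target -> label 1.
    d_loss = bce(D(W(xs).detach()), torch.zeros(64, 1)) + \
             bce(D(xt), torch.ones(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Mapping step: update W so mapped source fools the discriminator.
    w_loss = bce(D(W(xs)), torch.ones(64, 1))
    opt_w.zero_grad(); w_loss.backward(); opt_w.step()
```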
Inductive transfer learning has greatly impacted computer vision, but existing approaches in NLP still require task-specific modifications and training from scratch.