Fully Character-Level Neural Machine Translation without Explicit Segmentation

TACL 2017 nyu-dl/dl4mt-c2c

We observe that on CS-EN, FI-EN, and RU-EN, the quality of multilingual character-level translation even surpasses that of models trained specifically on each language pair alone, in terms of both BLEU score and human judgment.

MACHINE TRANSLATION

Context Gates for Neural Machine Translation

TACL 2017 tuzhaopeng/nmt

In neural machine translation (NMT), generation of a target word depends on both source and target contexts.

MACHINE TRANSLATION
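
A minimal numpy sketch of the idea: a sigmoid gate, computed from the previous target word embedding, the previous decoder state, and the source context vector, dynamically balances how much source versus target context drives the next word. The weight names (W_z, U_z, C_z), dimensions, and the "both contexts" gating variant below are illustrative assumptions, not the exact parameterization in tuzhaopeng/nmt.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

d = 8  # hidden/context size (assumed)
rng = np.random.default_rng(0)
W_z, U_z, C_z = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))

def context_gate(y_prev_emb, s_prev, c_t):
    """Gate z decides how much source vs. target context feeds the decoder."""
    z = sigmoid(W_z @ y_prev_emb + U_z @ s_prev + C_z @ c_t)
    gated_source = z * c_t                             # source context scaled by z
    gated_target = (1.0 - z) * (y_prev_emb + s_prev)   # target context scaled by 1-z
    return gated_source, gated_target

src_part, tgt_part = context_gate(
    rng.standard_normal(d), rng.standard_normal(d), rng.standard_normal(d))
print(src_part.shape, tgt_part.shape)  # (8,) (8,)
```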

Learning Distributed Representations of Texts and Entities from Knowledge Base

TACL 2017 studio-ousia/ntee

Given a text in the KB, we train our proposed model to predict entities that are relevant to the text.

ENTITY LINKING QUESTION ANSWERING
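
A toy sketch of that training signal: texts and entities are embedded in the same vector space, a text is represented by its averaged word vectors, and relevant entities are scored against the text by dot product. The logistic negative-sampling loss and all dimensions here are assumptions for illustration, not the authors' exact implementation in studio-ousia/ntee.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
V, E, D = 100, 50, 16                          # vocab, entity count, dim (all assumed)
word_emb = rng.standard_normal((V, D)) * 0.1   # word embeddings
ent_emb = rng.standard_normal((E, D)) * 0.1    # entity embeddings, same space

def text_vector(word_ids):
    """Represent a KB text as the average of its word vectors."""
    return word_emb[word_ids].mean(axis=0)

def ranking_loss(word_ids, pos_ent, neg_ents):
    """Score the relevant entity above sampled negatives (logistic loss)."""
    v = text_vector(word_ids)
    pos_score = ent_emb[pos_ent] @ v
    neg_scores = ent_emb[neg_ents] @ v
    return -np.log(sigmoid(pos_score)) - np.log(sigmoid(-neg_scores)).sum()

print(ranking_loss(np.array([1, 5, 7]), pos_ent=3, neg_ents=np.array([10, 20])))
```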

Replicability Analysis for Natural Language Processing: Testing Significance with Multiple Datasets

TACL 2017 rtmdrr/replicability-analysis-NLP

With the ever-growing amounts of textual data from a large variety of languages, domains, and genres, it has become standard to evaluate NLP algorithms on multiple datasets in order to ensure consistent performance across heterogeneous setups.

DEPENDENCY PARSING SENTIMENT ANALYSIS
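
A short sketch of the kind of replicability count the paper advocates: from per-dataset p-values, estimate on how many datasets an algorithm genuinely outperforms, via the Bonferroni-style partial conjunction test (Benjamini and Heller) that the paper builds on. The explicit monotonization step is a simplifying assumption of this sketch.

```python
import numpy as np

def replicability_count(pvalues, alpha=0.05):
    """Estimate k, the number of datasets where the algorithm truly outperforms."""
    p = np.sort(np.asarray(pvalues))
    n = len(p)
    # Bonferroni partial-conjunction p-value for "at least u of n nulls are false"
    p_pc = np.array([(n - u + 1) * p[u - 1] for u in range(1, n + 1)])
    p_pc = np.maximum.accumulate(p_pc)   # enforce monotonicity in u (assumed)
    return int((p_pc <= alpha).sum())    # largest u still rejected

# e.g. significance tests of one algorithm against a baseline on five datasets
print(replicability_count([0.001, 0.004, 0.02, 0.30, 0.45]))  # -> 2
```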

Enriching Word Vectors with Subword Information

TACL 2017 luckyPT/jvm-ml

A vector representation is associated with each character $n$-gram, and words are represented as the sum of these representations.
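
A minimal sketch of that mechanism: extract the character n-grams of a word (wrapped in boundary symbols, and including the word itself), look up a vector for each via the hashing trick, and sum them. The bucket count, dimensionality, and use of Python's built-in hash are illustrative assumptions, not fastText's actual choices.

```python
import numpy as np

BUCKETS, DIM = 100_000, 16   # assumed hash bucket count and vector size
rng = np.random.default_rng(0)
ngram_emb = rng.standard_normal((BUCKETS, DIM)) * 0.01

def char_ngrams(word, n_min=3, n_max=6):
    """Character n-grams of a word wrapped in boundary symbols."""
    w = f"<{word}>"
    return [w[i:i + n] for n in range(n_min, n_max + 1)
            for i in range(len(w) - n + 1)] + [w]

def word_vector(word):
    """Word representation as the sum of its n-gram vectors."""
    idxs = [hash(g) % BUCKETS for g in char_ngrams(word)]
    return ngram_emb[idxs].sum(axis=0)

print(char_ngrams("where", 3, 3))  # ['<wh', 'whe', 'her', 'ere', 're>', '<where>']
print(word_vector("where").shape)  # (16,)
```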

Google's Multilingual Neural Machine Translation System: Enabling Zero-Shot Translation

TACL 2017 Helsinki-NLP/shared-info

In addition to improving the translation quality of language pairs that the model was trained on, our models can also learn to perform implicit bridging between language pairs never seen explicitly during training, showing that transfer learning and zero-shot translation are possible for neural translation.

MACHINE TRANSLATION TRANSFER LEARNING
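
A tiny sketch of the mechanism behind that zero-shot behavior: a single standard NMT model serves all pairs, and the only change to the data is an artificial token such as <2es> prepended to the source sentence to request the target language. At test time the same prefix lets the model bridge a pair it never saw in training.

```python
def prepare_source(sentence: str, target_lang: str) -> str:
    """Prefix the source sentence with the paper's target-language token."""
    return f"<2{target_lang}> {sentence}"

# Training data drawn from, e.g., EN->ES and PT->EN corpora:
print(prepare_source("Hello, how are you?", "es"))   # <2es> Hello, how are you?
# Zero-shot PT->ES at test time, a direction never seen during training:
print(prepare_source("Olá, como você está?", "es"))  # <2es> Olá, como você está?
```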