Search Results for author: Philip Arthur

Found 11 papers, 4 papers with code

XNMT: The eXtensible Neural Machine Translation Toolkit

1 code implementation · WS 2018 · Graham Neubig, Matthias Sperber, Xinyi Wang, Matthieu Felix, Austin Matthews, Sarguna Padmanabhan, Ye Qi, Devendra Singh Sachan, Philip Arthur, Pierre Godard, John Hewitt, Rachid Riad, Liming Wang

In this paper we describe the design of XNMT and its experiment configuration system, and demonstrate its utility on the tasks of machine translation, speech recognition, and multi-tasked machine translation/parsing.

Tasks: Machine Translation, NMT, +3

Incorporating Discrete Translation Lexicons into Neural Machine Translation

2 code implementations · EMNLP 2016 · Philip Arthur, Graham Neubig, Satoshi Nakamura

Neural machine translation (NMT) often makes mistakes in translating low-frequency content words that are essential to understanding the meaning of the sentence.

Tasks: Machine Translation, NMT, +2

Multilingual Neural Machine Translation With Soft Decoupled Encoding

1 code implementation · ICLR 2019 · Xinyi Wang, Hieu Pham, Philip Arthur, Graham Neubig

Multilingual training of neural machine translation (NMT) systems has led to impressive accuracy improvements on low-resource languages.

Tasks: Machine Translation, NMT, +1

ParaCotta: Synthetic Multilingual Paraphrase Corpora from the Most Diverse Translation Sample Pair

No code implementations · PACLIC 2021 · Alham Fikri Aji, Tirana Noor Fatyanosa, Radityo Eko Prasojo, Philip Arthur, Suci Fitriany, Salma Qonitah, Nadhifa Zulfa, Tomi Santoso, Mahendra Data

We release our synthetic parallel paraphrase corpus across 17 languages: Arabic, Catalan, Czech, German, English, Spanish, Estonian, French, Hindi, Indonesian, Italian, Dutch, Romanian, Russian, Swedish, Vietnamese, and Chinese.

Tasks: Machine Translation, Sentence, +1
