no code implementations • 2 Sep 2019 • Hayahide Yamagishi, Mamoru Komachi
We propose a weight sharing method wherein NMT saves decoder states and calculates an attention vector using the saved states when translating a current sentence.
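The mechanism described above — reusing decoder states saved from the previous sentence as an attention memory — can be sketched roughly as follows. This is a hypothetical simplification for illustration, not the paper's actual implementation; the function names and the plain dot-product scoring are assumptions.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def context_attention(query, saved_states):
    """Dot-product attention over decoder states saved from the
    previous sentence (hypothetical sketch, not the paper's exact model).

    query: (d,) current decoder state
    saved_states: (T_prev, d) decoder states from the previous sentence
    returns: (d,) context vector summarizing the previous sentence
    """
    scores = saved_states @ query      # one score per saved state
    weights = softmax(scores)          # attention distribution over saved states
    return weights @ saved_states      # weighted sum = context vector

# toy usage: 3 saved states of dimension 4, one current query
rng = np.random.default_rng(0)
saved = rng.random((3, 4))
q = rng.random(4)
ctx = context_attention(q, saved)
```

Because the weights sum to one, the context vector is a convex combination of the saved states; in the proposed method it would then condition translation of the current sentence.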
no code implementations • WS 2019 • Ryoma Yoshimura, Hiroki Shimanaka, Yukio Matsumura, Hayahide Yamagishi, Mamoru Komachi
We use the outputs of off-the-shelf MT systems as pseudo-references filtered by paraphrasing in addition to a single human reference (gold reference).
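Evaluating against a gold reference plus MT-derived pseudo-references amounts to multi-reference matching. As a rough illustration (not the paper's metric), here is modified unigram precision computed against several references at once, where each hypothesis word count is clipped by its maximum count across all references; the function name and example sentences are invented for this sketch.

```python
from collections import Counter

def multi_ref_unigram_precision(hypothesis, references):
    """Modified unigram precision against several references
    (gold + pseudo) -- a simplified stand-in for multi-reference BLEU."""
    hyp = Counter(hypothesis.split())
    # for each word, take the maximum count observed in any reference
    max_ref = Counter()
    for ref in references:
        for w, n in Counter(ref.split()).items():
            max_ref[w] = max(max_ref[w], n)
    # clip hypothesis counts by those maxima, then normalize
    clipped = sum(min(n, max_ref[w]) for w, n in hyp.items())
    return clipped / sum(hyp.values())

gold = "the cat sat on the mat"
pseudo = "a cat was sitting on the mat"  # imagined pseudo-reference
score = multi_ref_unigram_precision("the cat sat on a mat", [gold, pseudo])
```

Adding pseudo-references can only raise the clipped counts, which is why filtering them (here, by paraphrase quality) matters: unfiltered MT outputs would reward their own errors.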
1 code implementation • NAACL 2019 • Tosho Hirasawa, Hayahide Yamagishi, Yukio Matsumura, Mamoru Komachi
Multimodal machine translation is an attractive application of neural machine translation (NMT).
1 code implementation • ACL 2018 • Satoru Katsumata, Yukio Matsumura, Hayahide Yamagishi, Mamoru Komachi
For Japanese-to-English translation, this method achieves a BLEU score 0.56 points higher than that of the baseline.
no code implementations • NAACL 2018 • Michiki Kurosawa, Yukio Matsumura, Hayahide Yamagishi, Mamoru Komachi
Neural machine translation (NMT) has a drawback in that it can generate only high-frequency words, owing to the computational cost of the softmax function in the output layer.
no code implementations • IJCNLP 2017 • Hayahide Yamagishi, Shin Kanouchi, Takayuki Sato, Mamoru Komachi
This study reports an attempt to predict the voice of the reference using information from the input sentence or from previous input/output sentences.
no code implementations • WS 2016 • Hayahide Yamagishi, Shin Kanouchi, Takayuki Sato, Mamoru Komachi
The results showed that we could control the voice of the generated sentence with 85.0% accuracy on average.