Search Results for author: Akiko Eriguchi

Found 17 papers, 5 papers with code

Tree-to-Sequence Attentional Neural Machine Translation

1 code implementation • ACL 2016 • Akiko Eriguchi, Kazuma Hashimoto, Yoshimasa Tsuruoka

Most existing Neural Machine Translation (NMT) models focus on converting sequential data and do not directly use syntactic information.
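The tree-to-sequence idea is to encode the source sentence along its parse tree rather than as a flat token sequence. Below is a minimal sketch of one standard way to realize such a tree encoder, a child-sum Tree-LSTM composition step in PyTorch; this is illustrative only, not the authors' released code, and all class and variable names are ours:

```python
import torch
import torch.nn as nn

class ChildSumTreeLSTMCell(nn.Module):
    """One bottom-up composition step over a parse-tree node."""
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.W_iou = nn.Linear(input_size, 3 * hidden_size)
        self.U_iou = nn.Linear(hidden_size, 3 * hidden_size, bias=False)
        self.W_f = nn.Linear(input_size, hidden_size)
        self.U_f = nn.Linear(hidden_size, hidden_size, bias=False)

    def forward(self, x, child_h, child_c):
        # x: (input_size,) node input (e.g. zeros for internal phrase nodes)
        # child_h, child_c: (num_children, hidden_size); empty tensors for leaves
        h_sum = child_h.sum(dim=0)
        i, o, u = (self.W_iou(x) + self.U_iou(h_sum)).chunk(3)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
        # A separate forget gate per child lets the node keep or drop
        # information from each subtree independently.
        f = torch.sigmoid(self.W_f(x).unsqueeze(0) + self.U_f(child_h))
        c = i * u + (f * child_c).sum(dim=0)
        h = o * torch.tanh(c)
        return h, c
```

Applying this cell bottom-up over a parse tree yields one hidden state per phrase node, which an attentional decoder can attend to alongside the word-level states.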

Machine Translation NMT +2

Character-based Decoding in Tree-to-Sequence Attention-based Neural Machine Translation

no code implementations • WS 2016 • Akiko Eriguchi, Kazuma Hashimoto, Yoshimasa Tsuruoka

This paper reports our systems (UT-AKY) submitted to the 3rd Workshop on Asian Translation (WAT'16) and their results on the English-to-Japanese translation task.

Machine Translation NMT +1

Multilingual Extractive Reading Comprehension by Runtime Machine Translation

1 code implementation • 10 Sep 2018 • Akari Asai, Akiko Eriguchi, Kazuma Hashimoto, Yoshimasa Tsuruoka

Given a target language without RC training data and a pivot language with RC training data (e.g., English), our method leverages existing RC resources in the pivot language by combining a competitive RC model in the pivot language with an attentive Neural Machine Translation (NMT) model.
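The overall pipeline can be sketched in a few lines: translate the target-language input into the pivot language, answer there, then map the answer span back through the NMT attention weights. The function interfaces below are hypothetical placeholders, not the authors' API:

```python
# `translate` stands in for an attentive NMT model (target lang -> pivot lang)
# that also returns its attention matrix; `rc_model` for a pivot-language
# extractive reader returning a token span.
def answer_cross_lingual(context_tgt, question_tgt, translate, rc_model):
    context_piv, attn = translate(context_tgt)        # attn[i][j]: attention of output token i on source token j
    question_piv, _ = translate(question_tgt)
    start, end = rc_model(context_piv, question_piv)  # answer span in the pivot-language context
    # Project the span back: for each pivot-language answer token, take its
    # most-attended source token (assumes whitespace tokenization).
    src_idx = sorted({max(range(len(attn[i])), key=lambda j: attn[i][j])
                      for i in range(start, end + 1)})
    return " ".join(context_tgt.split()[j] for j in src_idx)
```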

Machine Translation NMT +2

Zero-Shot Cross-lingual Classification Using Multilingual Neural Machine Translation

no code implementations • 12 Sep 2018 • Akiko Eriguchi, Melvin Johnson, Orhan Firat, Hideto Kazawa, Wolfgang Macherey

However, little attention has been paid to leveraging representations learned by a multilingual NMT system to enable zero-shot multilinguality in other NLP tasks.
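The general recipe behind such zero-shot transfer is compact: reuse the shared multilingual NMT encoder as a frozen sentence encoder, train only a lightweight classification head on pivot-language labels, and apply the result to other languages unchanged. A minimal sketch of that idea (not the paper's exact setup; `nmt_encoder` is an assumed interface) follows:

```python
import torch.nn as nn

class ZeroShotClassifier(nn.Module):
    """Train the head on pivot-language labels; apply to other languages as-is."""
    def __init__(self, nmt_encoder, hidden_size, num_classes):
        super().__init__()
        self.encoder = nmt_encoder               # shared multilingual NMT encoder
        for p in self.encoder.parameters():      # keep the encoder frozen
            p.requires_grad = False
        self.head = nn.Linear(hidden_size, num_classes)

    def forward(self, tokens):
        states = self.encoder(tokens)            # (batch, seq_len, hidden_size)
        return self.head(states.mean(dim=1))     # mean-pool over time, then classify
```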

Classification Cross-Lingual Transfer +6

Lingvo: a Modular and Scalable Framework for Sequence-to-Sequence Modeling

2 code implementations • 21 Feb 2019 • Jonathan Shen, Patrick Nguyen, Yonghui Wu, Zhifeng Chen, Mia X. Chen, Ye Jia, Anjuli Kannan, Tara Sainath, Yuan Cao, Chung-Cheng Chiu, Yanzhang He, Jan Chorowski, Smit Hinsu, Stella Laurenzo, James Qin, Orhan Firat, Wolfgang Macherey, Suyog Gupta, Ankur Bapna, Shuyuan Zhang, Ruoming Pang, Ron J. Weiss, Rohit Prabhavalkar, Qiao Liang, Benoit Jacob, Bowen Liang, HyoukJoong Lee, Ciprian Chelba, Sébastien Jean, Bo Li, Melvin Johnson, Rohan Anil, Rajat Tibrewal, Xiaobing Liu, Akiko Eriguchi, Navdeep Jaitly, Naveen Ari, Colin Cherry, Parisa Haghani, Otavio Good, Youlong Cheng, Raziel Alvarez, Isaac Caswell, Wei-Ning Hsu, Zongheng Yang, Kuan-Chieh Wang, Ekaterina Gonina, Katrin Tomanek, Ben Vanik, Zelin Wu, Llion Jones, Mike Schuster, Yanping Huang, Dehao Chen, Kazuki Irie, George Foster, John Richardson, Klaus Macherey, Antoine Bruguier, Heiga Zen, Colin Raffel, Shankar Kumar, Kanishka Rao, David Rybach, Matthew Murray, Vijayaditya Peddinti, Maxim Krikun, Michiel A. U. Bacchiani, Thomas B. Jablin, Rob Suderman, Ian Williams, Benjamin Lee, Deepti Bhatia, Justin Carlson, Semih Yavuz, Yu Zhang, Ian McGraw, Max Galkin, Qi Ge, Golan Pundak, Chad Whipkey, Todd Wang, Uri Alon, Dmitry Lepikhin, Ye Tian, Sara Sabour, William Chan, Shubham Toshniwal, Baohua Liao, Michael Nirschl, Pat Rondon

Lingvo is a TensorFlow framework offering a complete solution for collaborative deep learning research, with a particular focus on sequence-to-sequence models.

Sequence-To-Sequence Speech Recognition

Combining Translation Memory with Neural Machine Translation

no code implementations • WS 2019 • Akiko Eriguchi, Spencer Rarrick, Hitokazu Matsushita

In this paper, we report our submission systems (geoduck) to the Timely Disclosure task at the 6th Workshop on Asian Translation (WAT) (Nakazawa et al., 2019).

Machine Translation NMT +2

Neural Text Generation with Artificial Negative Examples

no code implementations • 28 Dec 2020 • Keisuke Shirai, Kazuma Hashimoto, Akiko Eriguchi, Takashi Ninomiya, Shinsuke Mori

In this paper, we propose to suppress an arbitrary type of error by training the text generation model in a reinforcement learning framework, using a trainable reward function that discriminates between references and sentences containing the targeted type of error.
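A minimal REINFORCE-style sketch of this idea follows; it is illustrative, not the paper's exact training loop, and `generator.sample` and `discriminator` are assumed interfaces. The generator samples a sentence, the trainable discriminator scores how reference-like it is, and that score serves as the reward:

```python
import torch

def reinforce_step(generator, discriminator, src_batch, gen_opt):
    # Sample outputs and keep the per-sequence log-probabilities.
    samples, log_probs = generator.sample(src_batch)   # log_probs: (batch,)
    with torch.no_grad():
        reward = discriminator(samples)                # high if reference-like
        baseline = reward.mean()                       # simple variance reduction
    loss = -((reward - baseline) * log_probs).mean()   # policy-gradient objective
    gen_opt.zero_grad()
    loss.backward()
    gen_opt.step()
    return loss.item()
```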

Image Captioning Machine Translation +2

Building Multilingual Machine Translation Systems That Serve Arbitrary X-Y Translations

no code implementations • 30 Jun 2022 • Akiko Eriguchi, Shufang Xie, Tao Qin, Hany Hassan Awadalla

Multilingual Neural Machine Translation (MNMT) enables one system to translate sentences from multiple source languages to multiple target languages, greatly reducing deployment costs compared with conventional bilingual systems.
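One standard ingredient of such many-to-many systems, a target-language tag prepended to the source sentence in the style of Johnson et al.'s multilingual NMT (not necessarily this paper's exact recipe), is small enough to sketch:

```python
def tag_for_target(src_sentence, tgt_lang):
    # Prepend a target-language token so one shared model can serve any
    # X-Y direction, e.g. tag_for_target("How are you?", "ja") -> "<2ja> How are you?"
    return f"<2{tgt_lang}> {src_sentence}"
```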

Machine Translation Translation

