no code implementations • WMT (EMNLP) 2020 • Shun Kiyono, Takumi Ito, Ryuto Konno, Makoto Morishita, Jun Suzuki
In this paper, we describe the submission of Tohoku-AIP-NTT to the WMT’20 news translation task.
no code implementations • WMT (EMNLP) 2021 • Farhad Akhbardeh, Arkady Arkhangorodsky, Magdalena Biesialska, Ondřej Bojar, Rajen Chatterjee, Vishrav Chaudhary, Marta R. Costa-Jussa, Cristina España-Bonet, Angela Fan, Christian Federmann, Markus Freitag, Yvette Graham, Roman Grundkiewicz, Barry Haddow, Leonie Harter, Kenneth Heafield, Christopher Homan, Matthias Huck, Kwabena Amponsah-Kaakyire, Jungo Kasai, Daniel Khashabi, Kevin Knight, Tom Kocmi, Philipp Koehn, Nicholas Lourie, Christof Monz, Makoto Morishita, Masaaki Nagata, Ajay Nagesh, Toshiaki Nakazawa, Matteo Negri, Santanu Pal, Allahsera Auguste Tapo, Marco Turchi, Valentin Vydrin, Marcos Zampieri
This paper presents the results of the news translation task, the multilingual low-resource translation task for Indo-European languages, the triangular translation task, and the automatic post-editing task organised as part of the Conference on Machine Translation (WMT) 2021. In the news task, participants were asked to build machine translation systems for any of 10 language pairs, to be evaluated on test sets consisting mainly of news stories.
no code implementations • WAT 2022 • Toshiaki Nakazawa, Hideya Mino, Isao Goto, Raj Dabre, Shohei Higashiyama, Shantipriya Parida, Anoop Kunchukuttan, Makoto Morishita, Ondřej Bojar, Chenhui Chu, Akiko Eriguchi, Kaori Abe, Yusuke Oda, Sadao Kurohashi
This paper presents the results of the shared tasks from the 9th workshop on Asian translation (WAT2022).
no code implementations • 29 Aug 2024 • Yunmeng Li, Jun Suzuki, Makoto Morishita, Kaori Abe, Kentaro Inui
The complexities of chats pose significant challenges for machine translation models.
no code implementations • 28 Aug 2024 • Yunmeng Li, Jun Suzuki, Makoto Morishita, Kaori Abe, Kentaro Inui
Machine translation models are still ill-suited to translating chats, despite the popularity of translation software and plug-in applications.
1 code implementation • 8 Aug 2024 • Masashi Oshika, Makoto Morishita, Tsutomu Hirao, Ryohei Sasano, Koichi Takeda
In this study, we propose a method that replaces words with a high Age of Acquisition (AoA) in translations with simpler words, to match the translations to the user's level.
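A minimal sketch of the word-replacement idea, assuming a toy AoA lexicon and synonym table; the actual resources and selection criteria in the paper will differ:

```python
# Sketch: replace words whose Age of Acquisition (AoA) exceeds a threshold
# with simpler synonyms. The lexicon and synonym table below are toy
# placeholders, not the resources used in the paper.
AOA = {"purchase": 12.3, "buy": 4.1, "utilize": 11.8, "use": 3.9}
SIMPLER = {"purchase": "buy", "utilize": "use"}

def simplify(tokens, max_aoa=10.0):
    out = []
    for tok in tokens:
        if AOA.get(tok.lower(), 0.0) > max_aoa and tok.lower() in SIMPLER:
            out.append(SIMPLER[tok.lower()])
        else:
            out.append(tok)
    return out

print(simplify("You can purchase and utilize the tool".split()))
# ['You', 'can', 'buy', 'and', 'use', 'the', 'tool']
```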
no code implementations • 15 May 2024 • Masaaki Nagata, Makoto Morishita, Katsuki Chousa, Norihito Yasuda
Using crowdsourcing, we collected more than 10,000 URL pairs (parallel top page pairs) of bilingual websites that contain parallel documents and created a Japanese-Chinese parallel corpus of 4.6M sentence pairs from these websites.
1 code implementation • 13 Apr 2024 • Hayato Tsukagoshi, Tsutomu Hirao, Makoto Morishita, Katsuki Chousa, Ryohei Sasano, Koichi Takeda
The task of Split and Rephrase, which splits a complex sentence into multiple simple sentences with the same meaning, improves readability and enhances the performance of downstream tasks in natural language processing (NLP).
no code implementations • 14 Feb 2024 • Yuto Nishida, Makoto Morishita, Hidetaka Kamigaito, Taro Watanabe
Generating multiple translation candidates would enable users to choose the one that satisfies their needs.
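For context, one common way to obtain multiple candidates is beam search with several returned hypotheses. The sketch below uses Hugging Face Transformers with an assumed Marian model; it illustrates the general setting, not the paper's proposed method:

```python
# Sketch: obtaining multiple translation candidates via beam search.
# The model name is an assumed example, not the system from the paper.
from transformers import MarianMTModel, MarianTokenizer

name = "Helsinki-NLP/opus-mt-en-de"  # assumed example model
tok = MarianTokenizer.from_pretrained(name)
model = MarianMTModel.from_pretrained(name)

inputs = tok("The meeting was postponed.", return_tensors="pt")
# Return 4 of the 8 beam hypotheses so the user can pick one.
outputs = model.generate(**inputs, num_beams=8, num_return_sequences=4)
for o in outputs:
    print(tok.decode(o, skip_special_tokens=True))
```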
no code implementations • 20 Nov 2023 • Atsushi Shirafuji, Yusuke Oda, Jun Suzuki, Makoto Morishita, Yutaka Watanobe
Less complex and more straightforward programs are easier to maintain and make it easier to write secure, bug-free code.
1 code implementation • 2 Aug 2023 • Yunmeng Li, Jun Suzuki, Makoto Morishita, Kaori Abe, Ryoko Tokuhisa, Ana Brassard, Kentaro Inui
In this paper, we describe the development of a communication support system that detects erroneous translations to facilitate cross-lingual communication, given the limitations of current machine chat translation methods.
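As a rough illustration of error detection, not the system described in the paper, a round-trip heuristic can flag translations that drift from the original message; `translate()` here is a hypothetical stand-in for any MT API:

```python
# Sketch: flag a chat translation as possibly erroneous when its round-trip
# back-translation diverges from the original message. This is a simple
# heuristic, not the paper's detection method; translate() is assumed.
from difflib import SequenceMatcher

def flag_erroneous(src, translate, threshold=0.5):
    round_trip = translate(translate(src, "en->ja"), "ja->en")
    similarity = SequenceMatcher(None, src.lower(), round_trip.lower()).ratio()
    return similarity < threshold  # True -> warn the chat participants
```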
no code implementations • 26 Jun 2023 • Atsushi Shirafuji, Yutaka Watanobe, Takumi Ito, Makoto Morishita, Yuki Nakamura, Yusuke Oda, Jun Suzuki
Our experimental results show that CodeGen and Codex are sensitive to superficial modifications of problem descriptions, which significantly impact code generation performance.
no code implementations • 28 Oct 2022 • Makoto Morishita, Jun Suzuki, Masaaki Nagata
With the collected parallel data, we can quickly adapt a machine translation model to the target domain.
no code implementations • LREC 2022 • Makoto Morishita, Katsuki Chousa, Jun Suzuki, Masaaki Nagata
Most current machine translation models are mainly trained with parallel corpora, and their translation accuracy largely depends on the quality and quantity of the corpora.
no code implementations • ACL (WAT) 2021 • Katsuki Chousa, Makoto Morishita
This paper describes our systems that were submitted to the restricted translation task at WAT 2021.
1 code implementation • EACL 2021 • Makoto Morishita, Jun Suzuki, Tomoharu Iwata, Masaaki Nagata
It is crucial to provide inter-sentence context to Neural Machine Translation (NMT) models for higher-quality translation.
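A widely used baseline for supplying such context, shown here only as an illustration rather than the paper's proposed method, is to concatenate the previous source sentence with a separator token; the `<sep>` symbol is an assumption:

```python
# Sketch: concatenate the previous source sentence as context, a common
# context-aware NMT baseline. "<sep>" is an assumed separator symbol.
def with_context(sentences, sep="<sep>"):
    inputs = []
    prev = ""
    for sent in sentences:
        inputs.append(f"{prev} {sep} {sent}" if prev else sent)
        prev = sent
    return inputs

doc = ["He saw Mary.", "She waved back."]
print(with_context(doc))
# ['He saw Mary.', 'He saw Mary. <sep> She waved back.']
```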
no code implementations • EMNLP 2020 • Loïc Barrault, Magdalena Biesialska, Ondřej Bojar, Marta R. Costa-jussà, Christian Federmann, Yvette Graham, Roman Grundkiewicz, Barry Haddow, Matthias Huck, Eric Joanis, Tom Kocmi, Philipp Koehn, Chi-kiu Lo, Nikola Ljubešić, Christof Monz, Makoto Morishita, Masaaki Nagata, Toshiaki Nakazawa, Santanu Pal, Matt Post, Marcos Zampieri
In the news task, participants were asked to build machine translation systems for any of 11 language pairs, to be evaluated on test sets consisting mainly of news stories.
1 code implementation • COLING 2020 • Ryo Fujii, Masato Mita, Kaori Abe, Kazuaki Hanawa, Makoto Morishita, Jun Suzuki, Kentaro Inui
Neural Machine Translation (NMT) has shown drastic improvement in its quality when translating clean input, such as text from the news domain.
no code implementations • LREC 2020 • Masaaki Nagata, Makoto Morishita
We improved translation accuracy using context-aware neural machine translation; the gains mainly reflect better translation of zero pronouns.
no code implementations • 24 Mar 2020 • Hiroki Ikeuchi, Akio Watanabe, Tsutomu Hirao, Makoto Morishita, Masaaki Nishino, Yoichi Matsuo, Keishiro Watanabe
With the increase in scale and complexity of ICT systems, their operation increasingly requires automatic recovery from failures.
no code implementations • LREC 2020 • Makoto Morishita, Jun Suzuki, Masaaki Nagata
We constructed a parallel corpus for English-Japanese, for which the amount of publicly available parallel corpora is still limited.
no code implementations • WS 2019 • Makoto Morishita, Jun Suzuki, Masaaki Nagata
In this paper, we describe our systems that were submitted to the translation shared tasks at WAT 2019.
no code implementations • WS 2019 • Soichiro Murakami, Makoto Morishita, Tsutomu Hirao, Masaaki Nagata
This paper describes NTT's submission to the WMT19 robustness task.
no code implementations • WS 2018 • Makoto Morishita, Jun Suzuki, Masaaki Nagata
This paper describes NTT's neural machine translation systems submitted to the WMT 2018 English-German and German-English news translation tasks.
no code implementations • COLING 2018 • Makoto Morishita, Jun Suzuki, Masaaki Nagata
We hypothesize that in the NMT model, the appropriate subword units for the following three modules (layers) can differ: (1) the encoder embedding layer, (2) the decoder embedding layer, and (3) the decoder output layer.
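As a rough sketch of what per-module subword units could look like in practice, the snippet below trains separate SentencePiece models with different vocabulary sizes for the source and target sides. File paths and sizes are placeholders, and the paper's per-layer variants (decoder embedding vs. decoder output) go beyond this simple source/target split:

```python
# Sketch: separate subword models so different modules can use different
# subword granularities. Paths and vocabulary sizes are placeholders.
import sentencepiece as spm

spm.SentencePieceTrainer.train(
    input="train.en", model_prefix="enc", vocab_size=16000)  # encoder side
spm.SentencePieceTrainer.train(
    input="train.de", model_prefix="dec", vocab_size=32000)  # decoder side

enc_sp = spm.SentencePieceProcessor(model_file="enc.model")
dec_sp = spm.SentencePieceProcessor(model_file="dec.model")
src_ids = enc_sp.encode("a source sentence", out_type=int)
tgt_ids = dec_sp.encode("ein Zielsatz", out_type=int)
```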
1 code implementation • ACL 2018 • Jun Suzuki, Sho Takase, Hidetaka Kamigaito, Makoto Morishita, Masaaki Nagata
This paper investigates the construction of a strong baseline based on general purpose sequence-to-sequence models for constituency parsing.
Ranked #18 on Constituency Parsing on Penn Treebank
no code implementations • WS 2017 • Makoto Morishita, Jun Suzuki, Masaaki Nagata
This year, we participated in four translation subtasks at WAT 2017.
no code implementations • WS 2017 • Makoto Morishita, Yusuke Oda, Graham Neubig, Koichiro Yoshino, Katsuhito Sudoh, Satoshi Nakamura
Training of neural machine translation (NMT) models usually uses mini-batches for efficiency purposes.
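One such strategy, sorting sentence pairs by length so each batch needs little padding, can be sketched as follows; this illustrates the kind of strategy such a study compares, not its exact experimental setup:

```python
# Sketch: length-sorted mini-batch creation, a common strategy that reduces
# padding within each batch. Data below is a toy placeholder.
def length_sorted_batches(pairs, batch_size):
    pairs = sorted(pairs, key=lambda p: len(p[0].split()))
    for i in range(0, len(pairs), batch_size):
        yield pairs[i:i + batch_size]

data = [("a b c", "x y"), ("a", "x"), ("a b", "x y z")]
for batch in length_sorted_batches(data, batch_size=2):
    print(batch)
```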
no code implementations • WS 2015 • Graham Neubig, Makoto Morishita, Satoshi Nakamura
We further perform a detailed analysis of reasons for this increase, finding that the main contributions of the neural models lie in improvement of the grammatical correctness of the output, as opposed to improvements in lexical choice of content words.