no code implementations • 11 Oct 2022 • Masaaki Nishino, Kengo Nakamura, Norihito Yasuda
In practical situations, it is natural to expect the input-output pairs of a machine learning model to satisfy some requirements.
1 code implementation • 2 Mar 2021 • Hikaru Shindo, Masaaki Nishino, Akihiro Yamamoto
Our framework can be scaled to deal with complex programs that consist of several clauses with function symbols.
1 code implementation • COLING 2020 • Katsuki Chousa, Masaaki Nagata, Masaaki Nishino
In particular, our method improved the F1 score by 53.9 points for extracting non-parallel sentences.
no code implementations • EMNLP 2020 • Masaaki Nagata, Katsuki Chousa, Masaaki Nishino
For example, we achieved an F1 score of 86.7 for the Chinese-English data, which is 13.3 points higher than the previous state-of-the-art supervised methods.
no code implementations • 29 Apr 2020 • Katsuki Chousa, Masaaki Nagata, Masaaki Nishino
We also conduct a sentence alignment experiment using En-Ja newspaper articles and find that the proposed method using multilingual BERT achieves significantly better accuracy than a baseline method using a bilingual dictionary and dynamic programming.
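The dynamic-programming side of such an alignment can be sketched as follows. This is a minimal illustration, not the paper's method: it finds a monotonic 1-to-1 sentence alignment maximizing total pairwise similarity, where the similarity matrix is a toy stand-in for the multilingual BERT scores the paper uses.

```python
# Hedged sketch: monotonic 1-to-1 sentence alignment by dynamic programming.
# sim[i][j] stands in for a multilingual-BERT similarity between source
# sentence i and target sentence j; the penalty for skipping is illustrative.

def align(sim, skip_penalty=-0.5):
    """Return the (i, j) pairs of a monotonic alignment maximizing total score."""
    n, m = len(sim), len(sim[0])
    NEG = float("-inf")
    # dp[i][j]: best score after consuming i source / j target sentences
    dp = [[NEG] * (m + 1) for _ in range(n + 1)]
    back = [[None] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(n + 1):
        for j in range(m + 1):
            if dp[i][j] == NEG:
                continue
            # pair source i with target j
            if i < n and j < m and dp[i][j] + sim[i][j] > dp[i + 1][j + 1]:
                dp[i + 1][j + 1] = dp[i][j] + sim[i][j]
                back[i + 1][j + 1] = (i, j, True)
            # skip a source sentence
            if i < n and dp[i][j] + skip_penalty > dp[i + 1][j]:
                dp[i + 1][j] = dp[i][j] + skip_penalty
                back[i + 1][j] = (i, j, False)
            # skip a target sentence
            if j < m and dp[i][j] + skip_penalty > dp[i][j + 1]:
                dp[i][j + 1] = dp[i][j] + skip_penalty
                back[i][j + 1] = (i, j, False)
    # trace back the aligned pairs
    pairs, i, j = [], n, m
    while (i, j) != (0, 0):
        pi, pj, matched = back[i][j]
        if matched:
            pairs.append((pi, pj))
        i, j = pi, pj
    return pairs[::-1]
```

With a similarity matrix that is strong on the diagonal, the traceback recovers the diagonal pairing.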
no code implementations • 6 Apr 2020 • Kengo Nakamura, Shuhei Denzumi, Masaaki Nishino
We show that VS-SDDs are never larger than SDDs and there are cases in which the size of a VS-SDD is exponentially smaller than that of an SDD.
no code implementations • 24 Mar 2020 • Hiroki Ikeuchi, Akio Watanabe, Tsutomu Hirao, Makoto Morishita, Masaaki Nishino, Yoichi Matsuo, Keishiro Watanabe
With the increase in scale and complexity of ICT systems, their operation increasingly requires automatic recovery from failures.
1 code implementation • 9 Mar 2020 • Hikaru Shindo, Masaaki Nishino, Yasuaki Kobayashi, Akihiro Yamamoto
In order to perform metric learning based on pq-grams, we propose a new differentiable parameterized distance, the weighted pq-gram distance.
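For background, the unweighted pq-gram distance of Augsten et al. — the tree metric this work builds on — can be sketched as below. The paper's actual contribution, learned per-gram weights that make the distance differentiable, is not reproduced here; trees are encoded as `(label, [children])` tuples for illustration.

```python
from collections import Counter

# Hedged sketch of the (unweighted) pq-gram distance: each pq-gram is a
# stem of p ancestor labels plus a base of q consecutive sibling labels,
# with "*" padding at the tree's borders.

def pqgram_profile(tree, p=2, q=3):
    profile = Counter()

    def rec(node, stem):
        label, children = node
        stem = stem[1:] + (label,)
        base = ("*",) * q
        if not children:
            profile[stem + base] += 1
            return
        for child in children:
            base = base[1:] + (child[0],)
            profile[stem + base] += 1
            rec(child, stem)
        for _ in range(q - 1):  # slide the window past the last child
            base = base[1:] + ("*",)
            profile[stem + base] += 1

    rec(tree, ("*",) * p)
    return profile

def pqgram_distance(t1, t2, p=2, q=3):
    p1, p2 = pqgram_profile(t1, p, q), pqgram_profile(t2, p, q)
    inter = sum((p1 & p2).values())  # bag (multiset) intersection
    return 1.0 - 2.0 * inter / (sum(p1.values()) + sum(p2.values()))
```

Identical trees have distance 0; the weighted variant replaces the uniform gram counts with learnable weights so the distance can be trained by gradient descent.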
no code implementations • IJCNLP 2019 • Masaaki Nishino, Sho Takase, Tsutomu Hirao, Masaaki Nagata
An anagram is a sentence or phrase made by permuting the characters of an input sentence or phrase.
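The defining constraint — that the output uses exactly the characters of the input — reduces to comparing character multisets, as this small illustrative check shows (case and spaces are ignored here as a simplifying assumption):

```python
from collections import Counter

def is_anagram(a, b):
    """True when b uses exactly the characters of a (ignoring case and spaces)."""
    norm = lambda s: Counter(s.lower().replace(" ", ""))
    return norm(a) == norm(b)
```

Any valid anagram generator must preserve this multiset equality while producing grammatical output.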
no code implementations • NAACL 2018 • Shinsaku Sakaue, Tsutomu Hirao, Masaaki Nishino, Masaaki Nagata
This approach is known to have three advantages: its applicability to many useful submodular objective functions, the efficiency of the greedy algorithm, and the provable performance guarantee.
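The greedy algorithm referred to above can be sketched on a toy coverage objective — a common monotone submodular surrogate in summarization. This is an illustration of the general technique, not the paper's objective function; for monotone submodular functions under a cardinality constraint, greedy selection carries the classic (1 - 1/e) approximation guarantee.

```python
# Hedged sketch: greedy maximization of a coverage function (a monotone
# submodular objective) under a cardinality constraint k.

def greedy_max_coverage(sets, k):
    """Pick at most k named sets whose union covers the most elements."""
    chosen, covered = [], set()
    for _ in range(k):
        best, gain = None, 0
        for name, s in sets.items():
            if name in chosen:
                continue
            g = len(s - covered)  # marginal gain of adding s
            if g > gain:
                best, gain = name, g
        if best is None:  # no remaining set adds coverage
            break
        chosen.append(best)
        covered |= sets[best]
    return chosen, covered
```

At each step the set with the largest marginal gain is added, which is exactly the property submodularity makes cheap to exploit.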
no code implementations • ACL 2017 • Tsutomu Hirao, Masaaki Nishino, Masaaki Nagata
This paper derives an Integer Linear Programming (ILP) formulation to obtain an oracle summary of the compressive summarization paradigm in terms of ROUGE.
no code implementations • EACL 2017 • Tsutomu Hirao, Masaaki Nishino, Jun Suzuki, Masaaki Nagata
To analyze the limitations and future directions of the extractive summarization paradigm, this paper proposes an Integer Linear Programming (ILP) formulation to obtain extractive oracle summaries in terms of ROUGE-N. We also propose an algorithm that enumerates all oracle summaries for a set of reference summaries, which makes it possible to compute F-measures evaluating how many oracle sentences each system summary contains.
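The objective can be illustrated by brute force on toy input. This sketch is not the paper's ILP formulation or its enumeration algorithm: it exhaustively scores every size-limited subset of sentences by ROUGE-1 recall (unigrams, a simplifying assumption) and keeps all maximizers, i.e. all oracle summaries.

```python
from itertools import combinations
from collections import Counter

# Hedged sketch: exhaustive search for extractive oracle summaries under a
# ROUGE-1 recall objective. The paper solves this with an ILP and enumerates
# oracles efficiently; brute force only illustrates the objective.

def rouge1_recall(summary_sents, reference):
    ref = Counter(reference.lower().split())
    cand = Counter(" ".join(summary_sents).lower().split())
    overlap = sum((ref & cand).values())  # clipped unigram matches
    return overlap / sum(ref.values())

def oracle_summaries(sentences, reference, max_sents):
    """Return the best ROUGE-1 recall and every subset achieving it."""
    best, oracles = -1.0, []
    for r in range(1, max_sents + 1):
        for combo in combinations(sentences, r):
            score = rouge1_recall(combo, reference)
            if score > best:
                best, oracles = score, [combo]
            elif score == best:
                oracles.append(combo)
    return best, oracles
```

Enumerating all maximizers, rather than one, is what enables the F-measure analysis of which oracle sentences system summaries actually pick.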
no code implementations • COLING 2016 • Xun Wang, Masaaki Nishino, Tsutomu Hirao, Katsuhito Sudoh, Masaaki Nagata
Existing methods focus on the extraction of key information, but often neglect coherence.