Search Results for author: Atsushi Otsuka

Found 5 papers, 0 papers with code

Length-controllable Abstractive Summarization by Guiding with Summary Prototype

no code implementations · 21 Jan 2020 · Itsumi Saito, Kyosuke Nishida, Kosuke Nishida, Atsushi Otsuka, Hisako Asano, Junji Tomita, Hiroyuki Shindo, Yuji Matsumoto

Unlike previous models, our length-controllable abstractive summarization model incorporates a word-level extractive module into the encoder-decoder model instead of length embeddings.
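The idea of guiding generation with an extracted word set can be illustrated with a toy sketch. This is not the paper's model: salience here is just stop-word-filtered token frequency, standing in for the learned word-level extractive module that would guide the decoder.

```python
from collections import Counter

STOPWORDS = {"the", "a", "an", "of", "to", "in", "and", "is", "that", "by"}

def extract_prototype(source_tokens, desired_length):
    """Pick the `desired_length` most salient source words as a summary
    prototype. In the paper a learned extractor guides the decoder;
    here, salience is plain token frequency (an illustrative stand-in)."""
    counts = Counter(t for t in source_tokens if t not in STOPWORDS)
    ranked = [tok for tok, _ in counts.most_common()]
    return ranked[:desired_length]

tokens = ("the model controls the length of the summary by extracting "
          "summary words before decoding the summary").split()
prototype = extract_prototype(tokens, 3)
print(prototype)
```

Because the prototype has exactly `desired_length` words, the output length is controlled directly by the extraction step rather than by a length embedding.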

Abstractive Text Summarization

Multi-style Generative Reading Comprehension

no code implementations · ACL 2019 · Kyosuke Nishida, Itsumi Saito, Kosuke Nishida, Kazutoshi Shinoda, Atsushi Otsuka, Hisako Asano, Junji Tomita

Second, whereas previous studies built a separate model for each answer style because of the difficulty of acquiring one general model, our approach learns multi-style answers within a single model to improve the NLG capability for all the styles involved.
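A common way to let one model serve several answer styles is to condition the input on an artificial style token; a minimal sketch, assuming hypothetical token names and input format (not the paper's actual ones):

```python
def format_input(question, passage, style):
    """Prepend an artificial style token so a single model can be asked
    for different answer styles (e.g. an extractive span vs. a
    well-formed sentence). Token names here are illustrative."""
    allowed = {"<extractive>", "<abstractive>"}
    if style not in allowed:
        raise ValueError(f"unknown style token: {style}")
    return f"{style} question: {question} passage: {passage}"

print(format_input("Who wrote it?", "Alice wrote the note.", "<extractive>"))
```

At training time, each example is tagged with its style token, so one set of parameters learns all styles jointly instead of one model per style.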

Abstractive Text Summarization · Question Answering

Retrieve-and-Read: Multi-task Learning of Information Retrieval and Reading Comprehension

no code implementations · 31 Aug 2018 · Kyosuke Nishida, Itsumi Saito, Atsushi Otsuka, Hisako Asano, Junji Tomita

Previous MRS studies, in which the IR component was trained without considering answer spans, struggled to accurately find a small number of relevant passages from a large set of passages.
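Training the IR component jointly with the reader typically means optimizing a weighted combination of a passage-ranking loss and an answer-span loss; a minimal sketch, with an illustrative mixing weight that is not taken from the paper:

```python
def joint_loss(retrieval_loss, span_loss, alpha=0.5):
    """Multi-task objective: a weighted sum of the passage-ranking loss
    and the answer-span loss, so the retriever also receives signal
    from answer-span supervision. `alpha` is an illustrative weight."""
    return alpha * retrieval_loss + (1.0 - alpha) * span_loss

print(joint_loss(2.0, 4.0))          # equal weighting of both tasks
print(joint_loss(2.0, 4.0, alpha=1.0))  # retrieval-only objective
```

With `alpha` between 0 and 1, gradients from the span loss flow into the shared encoder, which is what lets the retriever learn to prefer passages that actually contain an answer span.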

Information Retrieval · Multi-Task Learning
