Search Results for author: Tatsuya Ishigaki

Found 11 papers, 4 papers with code

Automating Horizon Scanning in Future Studies

no code implementations LREC 2022 Tatsuya Ishigaki, Suzuko Nishino, Sohei Washino, Hiroki Igarashi, Yukari Nagai, Yuichi Washida, Akihiko Murai

The contributions are: 1) we propose document retrieval and comment generation tasks for horizon scanning, 2) we create and analyze a new dataset, and 3) we report the performance of several models and show that the comment generation task is challenging.

Comment Generation · Retrieval
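
The entry above frames horizon scanning as a document retrieval task followed by comment generation. As an illustration only (not the authors' system), a minimal embedding-based retrieval step could look like the sketch below; the sentence-transformers model name, query, and corpus are placeholder assumptions.

```python
# Minimal sketch of the document-retrieval half of a horizon-scanning pipeline.
# NOT the authors' implementation; the model name, query, and corpus are placeholders.
from sentence_transformers import SentenceTransformer, util

corpus = [
    "New solid-state battery chemistry doubles energy density.",
    "City pilots drone-based medical deliveries.",
    "Researchers report progress on room-temperature superconductors.",
]
query = "future of energy storage"

model = SentenceTransformer("all-MiniLM-L6-v2")  # placeholder model choice
corpus_emb = model.encode(corpus, convert_to_tensor=True)
query_emb = model.encode(query, convert_to_tensor=True)

# Rank candidate documents by cosine similarity to the scanning query.
hits = util.semantic_search(query_emb, corpus_emb, top_k=3)[0]
for hit in hits:
    print(f"{hit['score']:.3f}  {corpus[hit['corpus_id']]}")
```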

Generating Racing Game Commentary from Vision, Language, and Structured Data

no code implementations INLG (ACL) 2021 Tatsuya Ishigaki, Goran Topic, Yumi Hamazono, Hiroshi Noji, Ichiro Kobayashi, Yusuke Miyao, Hiroya Takamura

In this study, we introduce a new large-scale dataset that contains aligned video data, structured numerical data, and transcribed commentaries that consist of 129,226 utterances in 1,389 races in a game.

Prompting for Numerical Sequences: A Case Study on Market Comment Generation

1 code implementation 3 Apr 2024 Masayuki Kawarada, Tatsuya Ishigaki, Hiroya Takamura

Large language models (LLMs) have been applied to a wide range of data-to-text generation tasks, including tables, graphs, and time-series numerical data-to-text settings.

Comment Generation · Data-to-Text Generation +1
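
The abstract above concerns prompting LLMs with time-series numerical data for market comment generation. As a rough illustration of the general setup, not the paper's prompts, a numerical sequence can be serialized into a textual prompt as sketched below; the template and prices are invented.

```python
# Minimal sketch of serializing a numerical sequence into an LLM prompt
# for market-comment generation. Illustrative only; the template and
# prices are invented, not taken from the paper.
closing_prices = [38250.5, 38410.2, 38390.8, 38655.0, 38720.4]

prompt = (
    "You are a financial writer. Given the last five closing prices of the "
    "Nikkei 225, write a one-sentence market comment.\n"
    "Closing prices (oldest to newest): "
    + ", ".join(f"{p:.1f}" for p in closing_prices)
    + "\nComment:"
)
print(prompt)
# The prompt would then be sent to an LLM of choice, e.g. via an API client.
```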

Learning with Contrastive Examples for Data-to-Text Generation

1 code implementation COLING 2020 Yui Uehara, Tatsuya Ishigaki, Kasumi Aoki, Hiroshi Noji, Keiichi Goshima, Ichiro Kobayashi, Hiroya Takamura, Yusuke Miyao

Existing models for data-to-text tasks generate fluent but sometimes incorrect sentences, e.g., "Nikkei gains" is generated when "Nikkei drops" is expected.

Comment Generation · Data-to-Text Generation
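
The abstract above motivates training with contrastive examples so the model prefers the correct verbalization ("Nikkei drops") over an erroneous one ("Nikkei gains"). A margin-based objective over sequence log-probabilities is one way to sketch this idea; it is illustrative only and not the paper's exact loss.

```python
# Rough sketch of a margin-based contrastive objective for data-to-text:
# push the likelihood of the correct comment above that of a contrastive
# (erroneous) comment. Illustrative only; not the paper's exact loss.
import torch

def contrastive_margin_loss(logp_correct: torch.Tensor,
                            logp_contrastive: torch.Tensor,
                            margin: float = 1.0) -> torch.Tensor:
    """Hinge loss on sequence log-probabilities of the two comments."""
    return torch.clamp(margin - (logp_correct - logp_contrastive), min=0.0).mean()

# Toy example: log-probabilities a decoder might assign to each comment.
logp_correct = torch.tensor([-4.2, -3.8])       # e.g. "Nikkei drops ..."
logp_contrastive = torch.tensor([-4.0, -5.1])   # e.g. "Nikkei gains ..."
print(contrastive_margin_loss(logp_correct, logp_contrastive))
```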

Discourse-Aware Hierarchical Attention Network for Extractive Single-Document Summarization

no code implementations RANLP 2019 Tatsuya Ishigaki, Hidetaka Kamigaito, Hiroya Takamura, Manabu Okumura

To incorporate the information of a discourse tree structure into the neural network-based summarizers, we propose a discourse-aware neural extractive summarizer which can explicitly take into account the discourse dependency tree structure of the source document.

Document Summarization · Sentence
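
The abstract above describes an extractive summarizer that exploits the discourse dependency tree of the source document. As a toy illustration of how such structure can inform sentence selection, not the paper's hierarchical attention network, plain sentence scores can be biased by discourse-tree depth before extraction; the scores, tree, and decay factor below are invented.

```python
# Toy sketch of using a discourse dependency tree in extractive summarization:
# bias plain sentence scores toward nodes close to the tree root, then pick
# the top sentences. Illustrative only; not the paper's model.
from typing import Dict, List

def discourse_aware_select(scores: List[float],
                           parents: Dict[int, int],
                           k: int = 2,
                           decay: float = 0.8) -> List[int]:
    """Down-weight each sentence by its depth in the discourse dependency tree."""
    def depth(i: int) -> int:
        d = 0
        while parents.get(i, -1) != -1:  # -1 marks the root sentence
            i, d = parents[i], d + 1
        return d

    adjusted = [s * (decay ** depth(i)) for i, s in enumerate(scores)]
    return sorted(range(len(scores)), key=lambda i: adjusted[i], reverse=True)[:k]

# Sentence 0 is the root; sentences 1 and 2 elaborate on it, sentence 3 on 2.
parents = {0: -1, 1: 0, 2: 0, 3: 2}
print(discourse_aware_select([0.6, 0.9, 0.7, 0.8], parents))
```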

Generating Market Comments Referring to External Resources

1 code implementation WS 2018 Tatsuya Aoki, Akira Miyazawa, Tatsuya Ishigaki, Keiichi Goshima, Kasumi Aoki, Ichiro Kobayashi, Hiroya Takamura, Yusuke Miyao

Comments on a stock market often include the reason or cause of changes in stock prices, such as "Nikkei turns lower as yen's rise hits exporters."

Text Generation
