Search Results for author: Kotaro Funakoshi

Found 13 papers, 3 papers with code

A-TIP: Attribute-aware Text Infilling via Pre-trained Language Model

no code implementations COLING 2022 Dongyuan Li, Jingyi You, Kotaro Funakoshi, Manabu Okumura

Text infilling aims to restore incomplete texts by filling in blanks, a task that has recently attracted increasing attention because of its wide applications in ancient text restoration and text rewriting.

Ancient Text Restoration, Language Modelling +1
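For readers unfamiliar with the task, the snippet below shows plain masked-LM infilling of a single blank with an off-the-shelf model. It is only a minimal sketch of text infilling, not the attribute-aware A-TIP approach; the model name and example sentence are arbitrary choices.

```python
# Minimal sketch of the text-infilling task using a generic masked LM
# (illustrative only; this is NOT the A-TIP attribute-aware method).
from transformers import pipeline

# "bert-base-uncased" is an arbitrary off-the-shelf choice for illustration.
fill = pipeline("fill-mask", model="bert-base-uncased")

# Restore a single blank in an incomplete sentence.
for candidate in fill("The ancient scroll was [MASK] beyond recognition."):
    print(candidate["token_str"], round(candidate["score"], 3))
```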

Joint Learning-based Heterogeneous Graph Attention Network for Timeline Summarization

no code implementations NAACL 2022 Jingyi You, Dongyuan Li, Hidetaka Kamigaito, Kotaro Funakoshi, Manabu Okumura

Previous studies on the timeline summarization (TLS) task ignored the information interaction between sentences and dates and adopted pre-defined, non-learnable representations for them.

Event Detection, Graph Attention +1

Non-Axiomatic Term Logic: A Computational Theory of Cognitive Symbolic Reasoning

no code implementations 12 Oct 2022 Kotaro Funakoshi

This paper presents Non-Axiomatic Term Logic (NATL) as a theoretical computational framework for human-like symbolic reasoning in artificial intelligence.

Towards Table-to-Text Generation with Numerical Reasoning

1 code implementation ACL 2021 Lya Hulliyyatus Suadaa, Hidetaka Kamigaito, Kotaro Funakoshi, Manabu Okumura, Hiroya Takamura

In summary, our contributions are (1) a new dataset for numerical table-to-text generation, built from pairs of a table and a paragraph describing that table with richer inference, drawn from scientific papers, and (2) a table-to-text generation framework enriched with numerical reasoning.

Table-to-Text Generation
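As a rough illustration of the task only (not the framework or dataset from this paper), the sketch below linearizes a tiny invented results table and lets a generic pretrained seq2seq model verbalize it; the model choice and the linearization format are assumptions.

```python
# Table-to-text sketch: flatten a table into text and have a generic
# seq2seq model verbalize it (model and format are illustrative choices,
# not the paper's numerical-reasoning framework).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "t5-small"  # arbitrary pretrained seq2seq model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Hypothetical table of experimental results, flattened row by row.
table = [("Model A", "84.2"), ("Model B", "86.9")]
linearized = "summarize: " + " ; ".join(
    f"{name} accuracy {acc}" for name, acc in table
)

inputs = tokenizer(linearized, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```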

Generating Weather Comments from Meteorological Simulations

1 code implementation EACL 2021 Soichiro Murakami, Sora Tanaka, Masatsugu Hangyo, Hidetaka Kamigaito, Kotaro Funakoshi, Hiroya Takamura, Manabu Okumura

The task of generating weather-forecast comments from meteorological simulations has the following requirements: (i) changes in the numerical values of various physical quantities must be considered, (ii) the comments must depend on the delivery time and target area, and (iii) the comments must provide useful information to users.

Informativeness
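The sketch below is a hypothetical illustration of requirements (i)–(iii): it packages simulated physical quantities together with delivery-time and area metadata into a single text input for a comment generator. All field names and values are invented; this is not the paper's data format.

```python
# Hypothetical packaging of meteorological simulation outputs plus
# delivery time and area metadata for a comment generator
# (field names and values are invented for illustration).
from dataclasses import dataclass

@dataclass
class ForecastInput:
    area: str
    delivery_time: str
    temperature_c: list[float]     # hourly values from the simulation
    precipitation_mm: list[float]  # hourly values from the simulation

def to_prompt(x: ForecastInput) -> str:
    """Flatten the numerical series and metadata into one text input."""
    return (
        f"area: {x.area} | time: {x.delivery_time} | "
        f"temp: {' '.join(map(str, x.temperature_c))} | "
        f"rain: {' '.join(map(str, x.precipitation_mm))}"
    )

print(to_prompt(ForecastInput("Tokyo", "2021-02-01 06:00",
                              [3.1, 4.0, 5.2], [0.0, 0.5, 1.2])))
```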

A POS Tagging Model Adapted to Learner English

no code implementations WS 2018 Ryo Nagata, Tomoya Mizumoto, Yuta Kikuchi, Yoshifumi Kawasaki, Kotaro Funakoshi

Based on the discussion of possible causes of POS tagging errors in learner English, we show that deep neural models are particularly suitable for this task.

Grammatical Error Correction, Part-Of-Speech Tagging +1
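To make the problem concrete, the sketch below runs an off-the-shelf tagger (NLTK's perceptron tagger, an arbitrary choice; resource names can vary across NLTK versions) on a typical learner-English sentence. Taggers trained on native-speaker text tend to mislabel such sentences, which is the gap the paper's adapted model addresses.

```python
# Off-the-shelf POS tagging of a learner-English sentence (illustrative
# only; this is NOT the adapted tagger proposed in the paper).
import nltk

nltk.download("punkt", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)

sentence = "He go to school by foot yesterday ."
print(nltk.pos_tag(nltk.word_tokenize(sentence)))
```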
