Search Results for author: Kosuke Akimoto

Found 5 papers, 1 paper with code

Low-resource Taxonomy Enrichment with Pretrained Language Models

no code implementations • EMNLP 2021 • Kunihiro Takeoka, Kosuke Akimoto, Masafumi Oyamada

Conventional supervised methods for this enrichment task fail to find optimal parents of new terms in low-resource settings, where only small taxonomies are available, because they overfit to the hierarchical relationships in those taxonomies.
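The enrichment task the abstract describes amounts to attaching a new term under the best-scoring parent node in an existing taxonomy. A minimal sketch of that framing, with a toy token-overlap scorer standing in for the paper's (unspecified) model, all names hypothetical:

```python
def attach_term(new_term, taxonomy_nodes, score):
    # Taxonomy enrichment, abstractly: pick the parent node that the
    # scoring function ranks highest for the new term.
    return max(taxonomy_nodes, key=lambda parent: score(new_term, parent))

def overlap(term, parent):
    # Purely illustrative scorer: number of shared whitespace tokens.
    return len(set(term.split()) & set(parent.split()))

nodes = ["animal", "domestic animal", "plant"]
print(attach_term("domestic cat", nodes, overlap))  # prints "domestic animal"
```

The real method replaces `overlap` with a learned scorer; the point is only that enrichment reduces to parent selection.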

Context Quality Matters in Training Fusion-in-Decoder for Extractive Open-Domain Question Answering

no code implementations • 21 Mar 2024 • Kosuke Akimoto, Kunihiro Takeoka, Masafumi Oyamada

Finally, based on these observations, we propose a method to mitigate overfitting to a specific context quality by introducing a bias into the cross-attention distribution, which we demonstrate to be effective in improving the performance of FiD models across different context qualities.

Language Modelling • Open-Domain Question Answering +1
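The abstract above mentions biasing the cross-attention distribution. A minimal numerical sketch of that general idea, not the paper's actual formulation: an additive per-passage bias is applied to raw attention scores before the softmax, shifting how much the decoder attends to each retrieved context. Function names and values are hypothetical.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def biased_cross_attention(scores, bias):
    # Add a per-passage bias to raw cross-attention scores before
    # normalizing, reweighting attention across retrieved contexts.
    return softmax(scores + bias)

scores = np.array([2.0, 0.5, -1.0])  # toy decoder-to-passage scores
bias = np.array([-1.0, 0.0, 0.0])    # toy bias penalizing the first passage
print(biased_cross_attention(scores, bias))
```

Relative to the unbiased softmax, the first passage receives less attention mass while the distribution still sums to one.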

Learning Directly from Grammar Compressed Text

1 code implementation • 28 Feb 2020 • Yoichi Sasaki, Kosuke Akimoto, Takanori Maehara

Neural networks using numerous text data have been successfully applied to a variety of tasks.

Computational Efficiency
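For readers unfamiliar with the title's premise: grammar compression represents a string as rewrite rules in which repeated substrings become nonterminals, so a model can operate on the compact rules rather than the expanded text. A minimal sketch of rule expansion (illustrative only, not the paper's architecture):

```python
def expand(symbol, rules):
    # Recursively expand a grammar symbol: nonterminals map to sequences
    # of symbols, terminals are emitted as-is.
    if symbol in rules:
        return "".join(expand(s, rules) for s in rules[symbol])
    return symbol

# B -> A A, A -> a b, so B expands to the original text "abab".
rules = {"A": ["a", "b"], "B": ["A", "A"]}
print(expand("B", rules))  # prints "abab"
```

Learning "directly" from such a grammar means computing representations over rules like `A` once and reusing them wherever the rule occurs, instead of re-processing every repetition in the expanded string.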

Cross-Sentence N-ary Relation Extraction using Lower-Arity Universal Schemas

no code implementations • IJCNLP 2019 • Kosuke Akimoto, Takuya Hiraoka, Kunihiko Sadamasa, Mathias Niepert

Most existing relation extraction approaches exclusively target binary relations, and n-ary relation extraction is relatively unexplored.

Relation • Relation Extraction +1
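The title's "lower-arity" idea can be illustrated structurally: an n-ary relation instance decomposes into its lower-arity sub-tuples, which binary-relation machinery can then handle. A hedged sketch of that decomposition only, with a hypothetical function name; the paper's actual universal-schema model is not reproduced here:

```python
from itertools import combinations

def lower_arity_projections(entities, arity=2):
    # Project an n-ary relation instance onto all sub-tuples of the
    # given lower arity (default: binary pairs).
    return list(combinations(entities, arity))

triple = ("drugX", "geneY", "mutationZ")  # toy 3-ary instance
print(lower_arity_projections(triple))
# [('drugX', 'geneY'), ('drugX', 'mutationZ'), ('geneY', 'mutationZ')]
```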

Knowledge-Based Distant Regularization in Learning Probabilistic Models

no code implementations • 29 Jun 2018 • Naoya Takeishi, Kosuke Akimoto

Exploiting the appropriate inductive bias based on the knowledge of data is essential for achieving good performance in statistical machine learning.

Inductive Bias • Knowledge Graph Embeddings
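One common way to encode knowledge-based inductive bias, in the spirit of the title, is a regularization term that pulls together embeddings of entities a knowledge source marks as related. A toy sketch of such a penalty (all names and weights hypothetical, not the paper's objective):

```python
import numpy as np

def distance_penalty(emb, related_pairs, weight=0.1):
    # Penalize squared distance between embeddings of entity pairs
    # that external knowledge says are related.
    return weight * sum(np.sum((emb[i] - emb[j]) ** 2) for i, j in related_pairs)

emb = {0: np.array([0.0, 0.0]), 1: np.array([1.0, 0.0])}
print(distance_penalty(emb, [(0, 1)]))  # prints 0.1
```

Added to a model's main loss, a term like this nudges training toward solutions consistent with the knowledge graph.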
