Search Results for author: Katsuhiko Hayashi

Found 41 papers, 9 papers with code

Bayesian Argumentation-Scheme Networks: A Probabilistic Model of Argument Validity Facilitated by Argumentation Schemes

no code implementations EMNLP (ArgMining) 2021 Takahiro Kondo, Koki Washio, Katsuhiko Hayashi, Yusuke Miyao

We propose a methodology for representing the reasoning structure of arguments using Bayesian networks and predicate logic facilitated by argumentation schemes.

A Simple but Effective Closed-form Solution for Extreme Multi-label Learning

1 code implementation 17 Jan 2025 Kazuma Onishi, Katsuhiko Hayashi

Many current high-performance XML models rely on a large number of hyperparameters, which complicates the tuning process.

Multi-Label Learning

Can Impressions of Music be Extracted from Thumbnail Images?

no code implementations 5 Jan 2025 Takashi Harada, Takehiro Motomitsu, Katsuhiko Hayashi, Yusuke Sakai, Hidetaka Kamigaito

In recent years, there has been a notable increase in research on machine learning models for music retrieval and generation systems capable of taking natural language sentences as input.

Retrieval

Understanding the Impact of Confidence in Retrieval Augmented Generation: A Case Study in the Medical Domain

no code implementations 29 Dec 2024 Shintaro Ozaki, Yuta Kato, Siyuan Feng, Masayo Tomita, Kazuki Hayashi, Ryoma Obara, Masafumi Oyamada, Katsuhiko Hayashi, Hidetaka Kamigaito, Taro Watanabe

Retrieval Augmented Generation (RAG) complements the knowledge of Large Language Models (LLMs) by leveraging external information to enhance response accuracy for queries.

RAG

Theoretical Aspects of Bias and Diversity in Minimum Bayes Risk Decoding

no code implementations 19 Oct 2024 Hidetaka Kamigaito, Hiroyuki Deguchi, Yusuke Sakai, Katsuhiko Hayashi, Taro Watanabe

We also introduce a new MBR approach, Metric-augmented MBR (MAMBR), which increases diversity by adjusting the behavior of utility functions without altering the pseudo-references.

Diversity Text Generation

Towards Cross-Lingual Explanation of Artwork in Large-scale Vision Language Models

no code implementations 3 Sep 2024 Shintaro Ozaki, Kazuki Hayashi, Yusuke Sakai, Hidetaka Kamigaito, Katsuhiko Hayashi, Taro Watanabe

As the performance of Large-scale Vision Language Models (LVLMs) improves, they are increasingly capable of responding in multiple languages, and there is an expectation that the demand for explanations generated by LVLMs will grow.

Machine Translation Translation

Multi-label Learning with Random Circular Vectors

1 code implementation 8 Jul 2024 Ken Nishida, Kojiro Machi, Kazuma Onishi, Katsuhiko Hayashi, Hidetaka Kamigaito

The extreme multi-label classification (XMC) task involves learning a classifier that can predict from a large label set the most relevant subset of labels for a data instance.

Extreme Multi-Label Classification Multi-Label Classification +1
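
The XMC setup described above can be shown in miniature: score every label with a linear model and return the k most relevant ones. This is an illustrative sketch of the generic task, not the paper's circular-vector method; all names and sizes here are made up.

```python
import numpy as np

rng = np.random.default_rng(0)
num_features, num_labels, k = 16, 1000, 5

W = rng.normal(size=(num_features, num_labels))  # one weight column per label
x = rng.normal(size=num_features)                # a single data instance

scores = x @ W                                   # one score per label
top_k = np.argsort(scores)[::-1][:k]             # indices of the k best labels
print(top_k.shape)  # (5,)
```

With label sets in the hundreds of thousands, even this single matrix multiply becomes the bottleneck, which is why XMC methods focus on compact label representations.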

Unified Interpretation of Smoothing Methods for Negative Sampling Loss Functions in Knowledge Graph Embedding

1 code implementation 5 Jul 2024 Xincan Feng, Hidetaka Kamigaito, Katsuhiko Hayashi, Taro Watanabe

This paper provides theoretical interpretations of the smoothing methods for the NS loss in KGE and induces a new NS loss, Triplet Adaptive Negative Sampling (TANS), that can cover the characteristics of the conventional smoothing methods.

Knowledge Graph Embedding Knowledge Graphs +1

Artwork Explanation in Large-scale Vision Language Models

no code implementations 29 Feb 2024 Kazuki Hayashi, Yusuke Sakai, Hidetaka Kamigaito, Katsuhiko Hayashi, Taro Watanabe

To address this issue, we propose a new task: the artwork explanation generation task, along with its evaluation dataset and metric for quantitatively assessing the understanding and utilization of knowledge about artworks.

Explanation Generation Text Generation

IRR: Image Review Ranking Framework for Evaluating Vision-Language Models

no code implementations 19 Feb 2024 Kazuki Hayashi, Kazuma Onishi, Toma Suzuki, Yusuke Ide, Seiji Gobara, Shigeki Saito, Yusuke Sakai, Hidetaka Kamigaito, Katsuhiko Hayashi, Taro Watanabe

We validate it using a dataset of images from 15 categories, each with five critic review texts and annotated rankings in both English and Japanese, totaling over 2,000 data instances.

Diversity Image Captioning

Model-based Subsampling for Knowledge Graph Completion

1 code implementation 17 Sep 2023 Xincan Feng, Hidetaka Kamigaito, Katsuhiko Hayashi, Taro Watanabe

Subsampling is effective in Knowledge Graph Embedding (KGE) for reducing overfitting caused by the sparsity in Knowledge Graph (KG) datasets.

Knowledge Graph Completion Knowledge Graph Embedding

Implicit ZCA Whitening Effects of Linear Autoencoders for Recommendation

no code implementations 15 Aug 2023 Katsuhiko Hayashi, Kazuma Onishi

Recently, in the field of recommendation systems, linear regression (autoencoder) models have been investigated as a way to learn item similarity.

Recommendation Systems Regression
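
The family of linear autoencoders the abstract refers to can be sketched with an EASE-style closed-form model (a well-known baseline of this type, not the paper's own contribution): item-item weights are obtained in one shot from the regularized Gram matrix, with the self-similarity diagonal constrained to zero.

```python
import numpy as np

rng = np.random.default_rng(0)
X = (rng.random((50, 8)) < 0.3).astype(float)  # user-item interaction matrix
lam = 10.0                                      # L2 regularization strength

# Closed-form solution: B = I - P * diag(1/diag(P)), P = (X^T X + lam*I)^{-1}
P = np.linalg.inv(X.T @ X + lam * np.eye(8))
B = np.eye(8) - P / np.diag(P)                  # column j scaled by 1/P[j, j]
np.fill_diagonal(B, 0.0)                        # no item predicts itself

scores = X @ B                                  # predicted user-item affinities
```

The learned matrix B acts directly as an item-item similarity matrix, which is the property the paper's whitening analysis is concerned with.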

Using Wikipedia Editor Information to Build High-performance Recommender Systems

no code implementations 14 Jun 2023 Katsuhiko Hayashi

Wikipedia has high-quality articles on a variety of topics and has been used in diverse research areas.

Recommendation Systems

Table and Image Generation for Investigating Knowledge of Entities in Pre-trained Vision and Language Models

1 code implementation 3 Jun 2023 Hidetaka Kamigaito, Katsuhiko Hayashi, Taro Watanabe

This task consists of two parts: the first is to generate a table containing knowledge about an entity and its related image, and the second is to generate an image from an entity with a caption and a table containing related knowledge of the entity.

Image Generation

Subsampling for Knowledge Graph Embedding Explained

no code implementations 13 Sep 2022 Hidetaka Kamigaito, Katsuhiko Hayashi

In this article, we explain the recent advance of subsampling methods in knowledge graph embedding (KGE) starting from the original one used in word2vec.

Knowledge Graph Embedding

Comprehensive Analysis of Negative Sampling in Knowledge Graph Representation Learning

1 code implementation 21 Jun 2022 Hidetaka Kamigaito, Katsuhiko Hayashi

To solve this problem, we theoretically analyzed NS loss to assist hyperparameter tuning and understand the better use of the NS loss in KGE learning.

Knowledge Graph Embedding

Why does Negative Sampling not Work Well? Analysis of Convexity in Negative Sampling

no code implementations 29 Sep 2021 Hidetaka Kamigaito, Katsuhiko Hayashi

On the other hand, properties of the NS loss function that are considered important for learning, such as the relationship between the noise distribution and the number of negative samples, have not been investigated theoretically.

Computational Efficiency Knowledge Graph Embedding
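
The word2vec-style negative sampling (NS) loss that this line of analysis builds on is commonly written as follows, where s is the scoring function, ν the number of negative samples, and p_n the noise distribution (notation assumed here, not quoted from the paper):

```latex
\ell(x, y) = -\log \sigma\bigl(s(x, y)\bigr)
             - \sum_{i=1}^{\nu} \mathbb{E}_{y_i \sim p_n}
               \bigl[\log \sigma\bigl(-s(x, y_i)\bigr)\bigr]
```

The interplay between p_n and ν is exactly the property the abstract says had not been investigated theoretically.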

Analyzing Word Embedding Through Structural Equation Modeling

no code implementations LREC 2020 Namgi Han, Katsuhiko Hayashi, Yusuke Miyao

Many researchers have tried to predict the accuracy of extrinsic evaluations from intrinsic evaluations of word embeddings.

Word Embeddings

Binarized Canonical Polyadic Decomposition for Knowledge Graph Completion

no code implementations 4 Dec 2019 Koki Kishimoto, Katsuhiko Hayashi, Genki Akai, Masashi Shimbo

Methods based on vector embeddings of knowledge graphs have been actively pursued as a promising approach to knowledge graph completion. However, embedding models generate storage-inefficient representations, particularly when the number of entities and relations and the dimensionality of the real-valued embedding vectors are large.

A Non-commutative Bilinear Model for Answering Path Queries in Knowledge Graphs

no code implementations IJCNLP 2019 Katsuhiko Hayashi, Masashi Shimbo

Although these models perform well in predicting atomic relations, they cannot naturally model composite relations (relation paths) as the product of relation matrices, because the product of diagonal matrices is commutative and hence invariant to the order of relations.

Computational Efficiency Knowledge Graph Embedding +2
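
The commutativity problem the abstract points to is easy to verify numerically: diagonal relation matrices (as in DistMult-style bilinear models) commute, so a path r1 → r2 scores the same as r2 → r1, while general matrices preserve the order. A minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Diagonal relation matrices: their product commutes, so the model
# cannot distinguish the order of relations along a path.
D1, D2 = np.diag(rng.normal(size=4)), np.diag(rng.normal(size=4))
print(np.allclose(D1 @ D2, D2 @ D1))  # True

# General (non-commutative) matrices keep the relation order.
M1, M2 = rng.normal(size=(4, 4)), rng.normal(size=(4, 4))
print(np.allclose(M1 @ M2, M2 @ M1))  # False
```

This is why the paper moves to a non-commutative bilinear model for path queries.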

Binarized Knowledge Graph Embeddings

2 code implementations 8 Feb 2019 Koki Kishimoto, Katsuhiko Hayashi, Genki Akai, Masashi Shimbo, Kazunori Komatani

This limitation is expected to become more stringent as existing knowledge graphs, which are already huge, keep steadily growing in scale.

Knowledge Graph Embeddings Quantization +1

Reduction of Parameter Redundancy in Biaffine Classifiers with Symmetric and Circulant Weight Matrices

1 code implementation PACLIC 2018 Tomoki Matsuno, Katsuhiko Hayashi, Takahiro Ishihara, Hitoshi Manabe, Yuji Matsumoto

The biaffine classifier has been attracting attention as a method for introducing an attention mechanism into the modeling of binary relations.

Dependency Parsing

Data-dependent Learning of Symmetric/Antisymmetric Relations for Knowledge Base Completion

no code implementations25 Aug 2018 Hitoshi Manabe, Katsuhiko Hayashi, Masashi Shimbo

Embedding-based methods for knowledge base completion (KBC) learn representations of entities and relations in a vector space, along with the scoring function to estimate the likelihood of relations between entities.

Knowledge Base Completion Relation

Higher-Order Syntactic Attention Network for Longer Sentence Compression

no code implementations NAACL 2018 Hidetaka Kamigaito, Katsuhiko Hayashi, Tsutomu Hirao, Masaaki Nagata

To solve this problem, we propose a higher-order syntactic attention network (HiSAN) that can handle higher-order dependency features as an attention distribution on LSTM hidden states.

Informativeness Machine Translation +2

Neural Tensor Networks with Diagonal Slice Matrices

no code implementations NAACL 2018 Takahiro Ishihara, Katsuhiko Hayashi, Hitoshi Manabe, Masashi Shimbo, Masaaki Nagata

Although neural tensor networks (NTNs) have been successful in many NLP tasks, they require a large number of parameters to be estimated, which often leads to overfitting and a long training time.

Knowledge Graph Completion Logical Reasoning +2

Hierarchical Word Structure-based Parsing: A Feasibility Study on UD-style Dependency Parsing in Japanese

no code implementations WS 2017 Takaaki Tanaka, Katsuhiko Hayashi, Masaaki Nagata

We introduce the following hierarchical word structures to dependency parsing in Japanese: morphological units (a short unit word, SUW) and syntactic units (a long unit word, LUW).

Chunking Dependency Parsing +2

K-best Iterative Viterbi Parsing

no code implementations EACL 2017 Katsuhiko Hayashi, Masaaki Nagata

This paper presents an efficient and optimal parsing algorithm for probabilistic context-free grammars (PCFGs).
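
For context, the standard Viterbi CKY algorithm that k-best iterative Viterbi parsing builds on can be sketched as follows. This is the textbook baseline under a toy grammar, not the paper's algorithm; the grammar and rule encoding are invented for illustration.

```python
import math
from collections import defaultdict

# Rules of a tiny PCFG in Chomsky normal form:
# (lhs, rhs) -> log-probability; rhs is a 1-tuple (terminal) or a 2-tuple.
rules = {
    ("S", ("NP", "VP")): math.log(1.0),
    ("NP", ("she",)): math.log(1.0),
    ("VP", ("runs",)): math.log(1.0),
}

def viterbi_cky(words):
    n = len(words)
    best = defaultdict(lambda: -math.inf)  # (i, j, symbol) -> best log-prob
    for i, w in enumerate(words):          # lexical rules fill width-1 spans
        for (lhs, rhs), lp in rules.items():
            if rhs == (w,):
                best[i, i + 1, lhs] = max(best[i, i + 1, lhs], lp)
    for span in range(2, n + 1):           # combine adjacent spans bottom-up
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for (lhs, rhs), lp in rules.items():
                    if len(rhs) == 2:
                        score = lp + best[i, k, rhs[0]] + best[k, j, rhs[1]]
                        best[i, j, lhs] = max(best[i, j, lhs], score)
    return best[0, n, "S"]

print(viterbi_cky(["she", "runs"]))  # 0.0  (log-prob of the only parse)
```

Exact k-best extensions of this chart are expensive, which is the efficiency problem the paper targets.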

On the Equivalence of Holographic and Complex Embeddings for Link Prediction

no code implementations ACL 2017 Katsuhiko Hayashi, Masashi Shimbo

We show the equivalence of two state-of-the-art link prediction/knowledge graph completion methods: Nickel et al.'s holographic embedding and Trouillon et al.'s complex embedding.

Knowledge Graph Completion Link Prediction
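
The bridge between the two models is the Fourier view of circular correlation, the composition operator in holographic embeddings: under the DFT it becomes an elementwise product with a conjugate, which is the form complex embeddings score directly. A small numerical check of that identity (illustrating the standard correlation theorem, not reproducing the paper's full proof):

```python
import numpy as np

rng = np.random.default_rng(1)
a, b = rng.normal(size=8), rng.normal(size=8)

# Circular correlation by its definition: [a * b]_k = sum_i a_i b_{(i+k) mod d}
direct = np.array(
    [sum(a[i] * b[(i + k) % 8] for i in range(8)) for k in range(8)]
)

# The same operation in the frequency domain.
via_fft = np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b)).real

print(np.allclose(direct, via_fft))  # True
```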

Efficient Stacked Dependency Parsing by Forest Reranking

no code implementations TACL 2013 Katsuhiko Hayashi, Shuhei Kondo, Yuji Matsumoto

This paper proposes a discriminative forest reranking algorithm for dependency parsing that can be seen as a form of efficient stacked parsing.

ARC Dependency Parsing
