Search Results for author: Kenji Yamanishi

Found 19 papers, 5 papers with code

Clustering Change Sign Detection by Fusing Mixture Complexity

1 code implementation • 27 Mar 2024 • Kento Urano, Ryo Yuki, Kenji Yamanishi

In this paper, we propose MC fusion as an extension of mixture complexity (MC) to handle situations in which multiple mixture numbers are possible in a finite mixture model.

Clustering

Balancing Summarization and Change Detection in Graph Streams

1 code implementation • 30 Nov 2023 • Shintaro Fukushima, Kenji Yamanishi

The parameter specifying the summary graph is then optimized so that change detection is guaranteed to suppress the Type I error probability (the probability of raising false alarms) below a given confidence level.

Change Detection
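The confidence-level guarantee above can be illustrated with a standard Hoeffding-style threshold. This is only a minimal sketch of Type I error control on a plain data window; the paper's method operates on summarized graph streams, and `raises_alarm` and the reference mean `mu0` here are hypothetical names for illustration:

```python
import math

def hoeffding_threshold(n, delta, value_range=1.0):
    """Deviation threshold eps with P(|mean - mu0| >= eps) <= delta for n
    i.i.d. observations bounded in an interval of width value_range
    (two-sided Hoeffding inequality)."""
    return value_range * math.sqrt(math.log(2.0 / delta) / (2.0 * n))

def raises_alarm(window, mu0, delta=0.05):
    """Flag a change when the window mean deviates from the reference mean
    mu0 by more than the threshold; under no change, the false-alarm
    (Type I error) probability is then at most delta."""
    mean = sum(window) / len(window)
    return abs(mean - mu0) > hoeffding_threshold(len(window), delta)
```

Shrinking `delta` raises the threshold, trading detection sensitivity for fewer false alarms, which mirrors the confidence-level knob described above.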

Adaptive Topological Feature via Persistent Homology: Filtration Learning for Point Clouds

no code implementations • NeurIPS 2023 • Naoki Nishikawa, Yuichi Ike, Kenji Yamanishi

Machine learning for point clouds has been attracting much attention, with many applications in various fields, such as shape recognition and material science.

Tight and fast generalization error bound of graph embedding in metric space

no code implementations • 13 May 2023 • Atsushi Suzuki, Atsushi Nitanda, Taiji Suzuki, Jing Wang, Feng Tian, Kenji Yamanishi

However, recent theoretical analyses have shown a much higher upper bound on the generalization error of non-Euclidean graph embedding than of its Euclidean counterpart; a high generalization error indicates that incompleteness and noise in the data can significantly damage learning performance.

Graph Embedding

Detecting Signs of Model Change with Continuous Model Selection Based on Descriptive Dimensionality

no code implementations • 23 Feb 2023 • Kenji Yamanishi, So Hirai

Continuous model selection determines the real-valued model dimensionality, in terms of Ddim, from given data.

Descriptive, Model Selection

Generalization Bounds for Graph Embedding Using Negative Sampling: Linear vs Hyperbolic

no code implementations • NeurIPS 2021 • Atsushi Suzuki, Atsushi Nitanda, Jing Wang, Linchuan Xu, Kenji Yamanishi, Marc Cavazza

Graph embedding, which represents real-world entities in a mathematical space, has enabled numerous applications such as analyzing natural languages, social networks, biochemical networks, and knowledge bases. It has been experimentally shown that graph embedding in hyperbolic space can represent hierarchical tree-like data more effectively than embedding in linear space, owing to hyperbolic space's exponential growth property.

Generalization Bounds, Graph Embedding
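The exponential growth property mentioned above stems from the hyperbolic metric. As context, here is a minimal sketch of the standard distance formula in the Poincaré ball model (a well-known formula, not code from the paper):

```python
import math

def poincare_distance(u, v, eps=1e-9):
    """Geodesic distance between points u, v inside the unit ball
    (Poincare ball model):
    d(u, v) = arcosh(1 + 2|u - v|^2 / ((1 - |u|^2)(1 - |v|^2))).
    Distances blow up near the boundary, which is what lets a bounded
    ball host exponentially growing tree-like structures."""
    su = sum(x * x for x in u)          # |u|^2
    sv = sum(x * x for x in v)          # |v|^2
    sd = sum((a - b) ** 2 for a, b in zip(u, v))  # |u - v|^2
    arg = 1.0 + 2.0 * sd / max((1.0 - su) * (1.0 - sv), eps)
    return math.acosh(arg)
```

For a point at radius r from the origin, this reduces to 2 artanh(r), so d(0, (0.5, 0)) = ln 3.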

Generalization Error Bound for Hyperbolic Ordinal Embedding

no code implementations • 21 May 2021 • Atsushi Suzuki, Atsushi Nitanda, Jing Wang, Linchuan Xu, Marc Cavazza, Kenji Yamanishi

Hyperbolic ordinal embedding (HOE) represents entities as points in hyperbolic space so that they agree as well as possible with given constraints of the form "entity i is more similar to entity j than to entity k". It has been experimentally shown that HOE can effectively obtain representations of hierarchical data, such as a knowledge base or a citation network, owing to hyperbolic space's exponential growth property.

Detecting Hierarchical Changes in Latent Variable Models

no code implementations • 18 Nov 2020 • Shintaro Fukushima, Kenji Yamanishi

This paper addresses the issue of detecting hierarchical changes in latent variable models (HCDL) from data streams.

Change Detection

Word2vec Skip-gram Dimensionality Selection via Sequential Normalized Maximum Likelihood

no code implementations • 18 Aug 2020 • Pham Thuc Hung, Kenji Yamanishi

Therefore, we apply information criteria with the aim of selecting the best dimensionality so that the corresponding model can be as close as possible to the true distribution.

Word Similarity
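The information-criterion idea above can be sketched with BIC in place of sequential NML. This is an illustrative stand-in only: `bic_select`, the polynomial-degree setting, and the BIC penalty form are assumptions for the sketch, not the paper's dimensionality-selection procedure:

```python
import numpy as np

def bic_select(x, y, max_degree=8):
    """Pick a polynomial degree (a stand-in for model dimensionality) by
    minimizing BIC = n * log(RSS / n) + k * log(n); the penalty grows with
    the number of parameters k, so the chosen model balances fit against
    complexity, approximating closeness to the true distribution."""
    n = len(x)
    best_deg, best_bic = None, float("inf")
    for deg in range(1, max_degree + 1):
        coeffs = np.polyfit(x, y, deg)
        rss = float(np.sum((np.polyval(coeffs, x) - y) ** 2))
        k = deg + 1  # polynomial coefficients
        bic = n * np.log(rss / n + 1e-12) + k * np.log(n)
        if bic < best_bic:
            best_deg, best_bic = deg, bic
    return best_deg
```

Sequential NML plays the same role as the BIC penalty here but with a tighter, data-dependent code length.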

Online Robust and Adaptive Learning from Data Streams

1 code implementation • 23 Jul 2020 • Shintaro Fukushima, Atsushi Nitanda, Kenji Yamanishi

We address the relation between two parameters: the step size of the stochastic approximation and the threshold on the norm of the stochastic update.

Attribute

Mixture Complexity and Its Application to Gradual Clustering Change Detection

1 code implementation • 15 Jul 2020 • Shunki Kyoya, Kenji Yamanishi

Meanwhile, we consider clustering changes to be gradual in terms of MC; this has the benefits of detecting changes earlier and discerning significant from insignificant changes.

Change Detection, Clustering, +1
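As a loose illustration of a continuous cluster-size measure, one can take the exponential of the entropy of the mixture weights. This is only an assumed proxy for the sketch; the paper's mixture complexity (MC) is a different, finer-grained information-theoretic quantity:

```python
import math

def effective_clusters(weights, eps=1e-12):
    """Continuous 'effective number of clusters' exp(H(pi)) for mixture
    weights pi: it equals k for k equal weights and shrinks smoothly
    toward 1 as components merge, so gradual clustering changes show up
    as gradual drift in this value."""
    h = -sum(p * math.log(max(p, eps)) for p in weights if p > 0)
    return math.exp(h)
```

Tracking such a continuous quantity over a data stream, rather than a hard integer cluster count, is what makes gradual change detection possible.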

Descriptive Dimensionality and Its Characterization of MDL-based Learning and Change Detection

no code implementations • 25 Oct 2019 • Kenji Yamanishi

The paper also derives error probabilities of the MDL-based test for multiple model change detection.

Change Detection, Descriptive

Riemannian TransE: Multi-relational Graph Embedding in Non-Euclidean Space

no code implementations • ICLR 2019 • Atsushi Suzuki, Yosuke Enokida, Kenji Yamanishi

Multi-relational graph embedding, which aims at achieving effective representations with a reduced number of low-dimensional parameters, has been widely used in knowledge base completion.

Graph Embedding, Knowledge Base Completion, +1
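For context, the Euclidean TransE score that Riemannian TransE generalizes is the standard translation distance (a well-known formula, sketched here; the paper replaces the Euclidean translation with operations on non-Euclidean Riemannian manifolds):

```python
import numpy as np

def transe_score(h, r, t):
    """TransE plausibility score for a triple (head, relation, tail):
    the triple is plausible when the relation acts as a translation,
    i.e. h + r is close to t, so a smaller ||h + r - t|| is better."""
    return np.linalg.norm(h + r - t)
```

Training drives this score toward zero for observed triples and keeps it large for corrupted ones.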

Adaptive Minimax Regret against Smooth Logarithmic Losses over High-Dimensional $\ell_1$-Balls via Envelope Complexity

no code implementations • 9 Oct 2018 • Kohei Miyaguchi, Kenji Yamanishi

The resulting regret bound is so simple that it is completely determined, up to logarithmic factors, by the smoothness of the loss function and the radius of the balls, and it generalizes existing regret/risk bounds.

Stable Geodesic Update on Hyperbolic Space and its Application to Poincare Embeddings

no code implementations • 26 May 2018 • Yosuke Enokida, Atsushi Suzuki, Kenji Yamanishi

A hyperbolic space has been shown to be more capable of modeling complex networks than a Euclidean space.
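For contrast with the stable geodesic update, here is a minimal sketch of the common retraction-based Riemannian SGD step on the Poincaré ball, the baseline from the Poincaré-embedding literature. It is not the paper's update, and `poincare_sgd_step` is a hypothetical name; the paper's contribution is a more stable update along exact geodesics:

```python
import numpy as np

def poincare_sgd_step(x, euclid_grad, lr=0.01, eps=1e-5):
    """Retraction-based Riemannian SGD on the Poincare ball: rescale the
    Euclidean gradient by the inverse metric factor (1 - |x|^2)^2 / 4,
    take a Euclidean step, then project back strictly inside the unit
    ball. The projection is the step that geodesic updates avoid."""
    scale = (1.0 - np.dot(x, x)) ** 2 / 4.0
    x_new = x - lr * scale * euclid_grad
    norm = np.linalg.norm(x_new)
    if norm >= 1.0:
        x_new = x_new / norm * (1.0 - eps)  # clip back into the ball
    return x_new
```

Near the boundary the metric factor vanishes and the projection clips aggressively, which is the instability a geodesic (exponential-map) update addresses.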

High-dimensional Penalty Selection via Minimum Description Length Principle

no code implementations • 26 Apr 2018 • Kohei Miyaguchi, Kenji Yamanishi

In this situation, the luckiness normalized maximum likelihood (LNML) minimization approach is favorable, because LNML quantifies the goodness of regularized models with any form of penalty function in view of the minimum description length principle, and guides us to a good penalty function through the high-dimensional space.

Vocal Bursts Intensity Prediction

Grafting for Combinatorial Boolean Model using Frequent Itemset Mining

1 code implementation • 7 Nov 2017 • Taito Lee, Shin Matsushima, Kenji Yamanishi

To overcome this computational difficulty, we propose an algorithm GRAB (GRAfting for Boolean datasets), which efficiently learns CBM within the $L_1$-regularized loss minimization framework.

Computational Efficiency
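The grafting idea in the $L_1$-regularized framework can be sketched as follows. This is a generic grafting test for squared loss, shown only as an illustration; the paper's GRAB couples this test with frequent itemset mining over Boolean feature combinations, which is omitted here:

```python
import numpy as np

def grafting_candidates(X, y, w, lam):
    """Grafting test for L1-regularized squared loss: for a feature j with
    w[j] == 0, keeping it at zero is optimal unless |dL/dw_j| > lambda,
    so only such features are 'grafted' into the active set. This lets
    the optimizer grow the model incrementally instead of touching all
    features at once."""
    residual = X @ w - y
    grad = X.T @ residual / len(y)  # gradient of the mean squared loss
    return [j for j in range(X.shape[1]) if w[j] == 0 and abs(grad[j]) > lam]
```

GRAB's contribution is making this candidate search tractable when the features are exponentially many Boolean conjunctions.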

Predicting Glaucoma Visual Field Loss by Hierarchically Aggregating Clustering-based Predictors

no code implementations • 23 Mar 2016 • Motohide Higaki, Kai Morino, Hiroshi Murata, Ryo Asaoka, Kenji Yamanishi

Thus, we propose a method for aggregating cluster-based predictors to obtain better prediction accuracy than that of a single cluster-based predictor.

Clustering
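A minimal sketch of aggregating cluster-based predictors by soft weighting (an assumed softmax-over-distances scheme for illustration; the paper's hierarchical aggregation is more elaborate):

```python
import numpy as np

def aggregate_predict(x, centers, predictors, temperature=1.0):
    """Combine cluster-wise predictors: weight each cluster's prediction
    by a softmax over negative squared distances from x to the cluster
    centers, instead of committing to the single nearest cluster. Points
    near a cluster boundary then blend the neighboring predictors."""
    d2 = np.array([np.sum((x - c) ** 2) for c in centers])
    logits = -d2 / temperature
    w = np.exp(logits - logits.max())  # stable softmax
    w /= w.sum()
    preds = np.array([f(x) for f in predictors])
    return float(np.dot(w, preds))
```

Soft aggregation of this kind typically beats a hard single-cluster assignment when cluster membership is uncertain, which is the motivation the abstract states.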
