1 code implementation • 27 Mar 2024 • Kento Urano, Ryo Yuki, Kenji Yamanishi
In this paper, we propose MC fusion as an extension of MC to handle situations in which multiple candidate numbers of mixture components are possible in a finite mixture model.
1 code implementation • 30 Nov 2023 • Shintaro Fukushima, Kenji Yamanishi
The parameter specifying the summary graph is then optimized so that change detection is guaranteed to keep the Type I error probability (the probability of raising false alarms) below a given confidence level.
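As an illustrative sketch of keeping the false-alarm probability below a confidence level (this is not the paper's graph-parameter optimization; the function and its empirical-quantile rule are assumptions), a detection threshold can be calibrated from change scores observed under the no-change regime:

```python
import math

def calibrate_threshold(null_scores, delta):
    # Empirical (1 - delta)-quantile of change scores observed under
    # "no change": raising an alarm when a score strictly exceeds the
    # returned threshold keeps the false-alarm (Type I error) rate at
    # roughly delta on the calibration sample.
    s = sorted(null_scores)
    n = len(s)
    k = max(0, min(n - 1, math.ceil((1.0 - delta) * n) - 1))
    return s[k]
```

With 100 distinct calibration scores and delta = 0.05, exactly 5 of the 100 scores fall strictly above the returned threshold.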
no code implementations • NeurIPS 2023 • Naoki Nishikawa, Yuichi Ike, Kenji Yamanishi
Machine learning for point clouds has been attracting much attention, with many applications in various fields, such as shape recognition and material science.
no code implementations • 13 May 2023 • Atsushi Suzuki, Atsushi Nitanda, Taiji Suzuki, Jing Wang, Feng Tian, Kenji Yamanishi
However, recent theoretical analyses have shown a much higher upper bound on the generalization error of non-Euclidean graph embedding than on that of its Euclidean counterpart; a high generalization error indicates that incompleteness and noise in the data can significantly damage learning performance.
no code implementations • 23 Feb 2023 • Kenji Yamanishi, So Hirai
Continuous model selection determines the real-valued model dimensionality, in terms of Ddim, from given data.
no code implementations • NeurIPS 2021 • Atsushi Suzuki, Atsushi Nitanda, Jing Wang, Linchuan Xu, Kenji Yamanishi, Marc Cavazza
Graph embedding, which represents real-world entities in a mathematical space, has enabled numerous applications such as analyzing natural languages, social networks, biochemical networks, and knowledge bases. It has been experimentally shown that graph embedding in hyperbolic space can represent hierarchical tree-like data more effectively than embedding in linear space, owing to hyperbolic space's exponential growth property.
no code implementations • 21 May 2021 • Atsushi Suzuki, Atsushi Nitanda, Jing Wang, Linchuan Xu, Marc Cavazza, Kenji Yamanishi
Hyperbolic ordinal embedding (HOE) represents entities as points in hyperbolic space so that they agree as well as possible with given constraints of the form "entity i is more similar to entity j than to entity k". It has been experimentally shown that HOE can effectively obtain representations of hierarchical data such as knowledge bases and citation networks, owing to hyperbolic space's exponential growth property.
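A minimal sketch of the ordinal-constraint idea in the Poincaré ball model of hyperbolic space (the function names and this formulation are mine, not HOE's actual implementation): an embedding satisfies the constraint "i is more similar to j than to k" when the hyperbolic distance from i to j is smaller than from i to k.

```python
import math

def poincare_distance(u, v):
    # Distance in the Poincaré ball model of hyperbolic space:
    # d(u, v) = arccosh(1 + 2*||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2)))
    sq = lambda x: sum(xi * xi for xi in x)
    diff = [a - b for a, b in zip(u, v)]
    return math.acosh(1.0 + 2.0 * sq(diff) / ((1.0 - sq(u)) * (1.0 - sq(v))))

def satisfies_triplet(e_i, e_j, e_k):
    # Ordinal constraint "i is more similar to j than to k" holds for
    # the embedded points when d(i, j) < d(i, k).
    return poincare_distance(e_i, e_j) < poincare_distance(e_i, e_k)
```

Distances from the origin grow like 2*artanh(r), so points near the ball's boundary are exponentially far apart, which is the growth property the excerpt refers to.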
no code implementations • 18 Nov 2020 • Shintaro Fukushima, Kenji Yamanishi
This paper addresses the issue of detecting hierarchical changes in latent variable models (HCDL) from data streams.
no code implementations • 18 Aug 2020 • Pham Thuc Hung, Kenji Yamanishi
Therefore, we apply information criteria with the aim of selecting the best dimensionality so that the corresponding model can be as close as possible to the true distribution.
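The selection rule itself can be sketched generically (a hypothetical simplification: concrete criteria define the penalty term, e.g. AIC/BIC-style penalties growing with dimensionality):

```python
def select_dimensionality(neg_log_likelihoods, penalties):
    # Generic information-criterion selection: for each candidate
    # dimensionality d, score IC(d) = -log-likelihood(d) + penalty(d),
    # then pick the candidate index with the lowest score.
    scores = [nll + pen for nll, pen in zip(neg_log_likelihoods, penalties)]
    return min(range(len(scores)), key=scores.__getitem__)
```

The penalty counteracts the monotone improvement in fit as dimensionality grows, so the minimizer balances fit against complexity.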
no code implementations • 31 Jul 2020 • Linchuan Xu, Jun Huang, Atsushi Nitanda, Ryo Asaoka, Kenji Yamanishi
In this paper, we thus propose a novel global spatial attention mechanism in CNNs mainly for medical image classification.
1 code implementation • 23 Jul 2020 • Shintaro Fukushima, Atsushi Nitanda, Kenji Yamanishi
We address the relation between the two parameters: one is the step size of the stochastic approximation, and the other is the threshold parameter of the norm of the stochastic update.
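An illustrative toy version of the setup, in one dimension (the alarm rule here is a hypothetical simplification; how the step size and the threshold should relate is precisely what the paper analyzes):

```python
def sgd_with_norm_monitor(grad_stream, eta, tau):
    # Run SGD updates theta <- theta - eta * g on a stream of stochastic
    # gradients, and flag the time steps where the norm of the stochastic
    # update, |eta * g|, exceeds the threshold tau.
    theta = 0.0
    alarms = []
    for t, g in enumerate(grad_stream):
        update = eta * g
        if abs(update) > tau:
            alarms.append(t)
        theta -= update
    return theta, alarms
```

A smaller step size shrinks every update, so the same gradient stream triggers fewer alarms at a fixed threshold; the two parameters cannot be tuned independently.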
1 code implementation • 15 Jul 2020 • Shunki Kyoya, Kenji Yamanishi
Meanwhile, we track clustering changes as gradual in terms of MC; this has the benefits of detecting the changes earlier and discerning significant changes from insignificant ones.
no code implementations • 25 Oct 2019 • Kenji Yamanishi
The paper also derives error probabilities of the MDL-based test for multiple model change detection.
no code implementations • ICLR 2019 • Atsushi Suzuki, Yosuke Enokida, Kenji Yamanishi
Multi-relational graph embedding, which aims at achieving effective representations with a reduced number of low-dimensional parameters, has been widely used in knowledge base completion.
no code implementations • 9 Oct 2018 • Kohei Miyaguchi, Kenji Yamanishi
The resulting regret bound is simple: up to logarithmic factors, it is completely determined by the smoothness of the loss function and the radii of the balls, and it generalizes existing regret/risk bounds.
no code implementations • 26 May 2018 • Yosuke Enokida, Atsushi Suzuki, Kenji Yamanishi
A hyperbolic space has been shown to be more capable of modeling complex networks than a Euclidean space.
no code implementations • 26 Apr 2018 • Kohei Miyaguchi, Kenji Yamanishi
In this situation, the luckiness normalized maximum likelihood (LNML) minimization approach is favorable, because LNML quantifies the goodness of regularized models with any form of penalty function in view of the minimum description length principle, and guides us to a good penalty function through the high-dimensional space.
1 code implementation • 7 Nov 2017 • Taito Lee, Shin Matsushima, Kenji Yamanishi
To overcome this computational difficulty, we propose an algorithm, GRAB (GRAfting for Boolean datasets), which efficiently learns CBM within the $L_1$-regularized loss minimization framework.
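For context, GRAB builds on grafting; shown here is the generic grafting admission rule for $L_1$-regularized minimization (my sketch, not GRAB's CBM-specific algorithm): a currently inactive feature is added to the active set only when its loss-gradient magnitude exceeds the penalty weight.

```python
def grafting_candidates(grad, active, lam):
    # A feature j held at w_j = 0 can reduce the L1-regularized objective
    # only if |dL/dw_j| > lam; otherwise the zero-subgradient condition
    # |dL/dw_j| <= lam already holds and j stays out of the active set.
    return [j for j, g in enumerate(grad) if j not in active and abs(g) > lam]
```

This lets the optimizer work over a small active set while still provably reaching the optimum of the full $L_1$-regularized problem, which is where the efficiency comes from.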
no code implementations • 23 Mar 2016 • Motohide Higaki, Kai Morino, Hiroshi Murata, Ryo Asaoka, Kenji Yamanishi
Thus, we propose a method for aggregating cluster-based predictors to obtain better prediction accuracy than that of a single cluster-based predictor.