Search Results for author: Takafumi Kanamori

Found 17 papers, 4 papers with code

Deep Clustering with a Constraint for Topological Invariance based on Symmetric InfoNCE

no code implementations 6 Mar 2023 Yuhui Zhang, Yuichiro Wada, Hiroki Waida, Kaito Goto, Yusaku Hino, Takafumi Kanamori

To address the problem, we propose a constraint utilizing symmetric InfoNCE, which helps the objective of the deep clustering method train the model so that it is effective not only on datasets with non-complex topology but also on those with complex topology.

Deep Clustering
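As a rough sketch of a symmetric InfoNCE objective (a hedged illustration, not the paper's exact constraint), one can compute the InfoNCE cross-entropy over a batch similarity matrix in both directions and average; all names below are ours:

```python
import numpy as np

def symmetric_infonce(za, zb, tau=0.1):
    """Symmetric InfoNCE between two batches of embeddings.

    za, zb: (n, d) arrays of L2-normalized embeddings, where
    (za[i], zb[i]) are the positive pairs. Illustrative sketch only.
    """
    logits = za @ zb.T / tau  # (n, n) scaled cosine similarities

    def ce(l):
        # Cross-entropy with the diagonal as the positive class.
        l = l - l.max(axis=1, keepdims=True)  # numerical stability
        logp = l - np.log(np.exp(l).sum(axis=1, keepdims=True))
        return -np.mean(np.diag(logp))

    # Average the two directions (rows-as-anchors and columns-as-anchors).
    return 0.5 * (ce(logits) + ce(logits.T))
```

With matched embeddings the loss is near zero; mismatching the positive pairs makes it grow, which is the behavior the constraint exploits.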

Learning Domain Invariant Representations by Joint Wasserstein Distance Minimization

1 code implementation 9 Jun 2021 Léo Andéol, Yusei Kawakami, Yuichiro Wada, Takafumi Kanamori, Klaus-Robert Müller, Grégoire Montavon

Domain shifts in the training data are common in practical applications of machine learning; they occur, for instance, when the data comes from different sources.

Bayesian Neural Networks with Variance Propagation for Uncertainty Evaluation

no code implementations 1 Jan 2021 Yuki Mae, Wataru Kumagai, Takafumi Kanamori

We report the computational efficiency and statistical reliability of our method in numerical experiments on language modeling with RNNs and on out-of-distribution detection with DNNs.

Bayesian Inference Language Modelling +1
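The variance-propagation idea can be illustrated on a single linear layer: if the input coordinates are independent with mean m and variance v, then y = Wx + b has mean Wm + b and variance (W∘W)v. A minimal numpy sketch under those assumptions (the function name is ours, not the paper's):

```python
import numpy as np

def propagate_linear(m, v, W, b):
    """Propagate input mean m and per-coordinate variance v through
    y = W x + b, assuming independent input coordinates.
    Illustrative helper, not the paper's implementation."""
    mean_out = W @ m + b
    var_out = (W ** 2) @ v  # Var(sum_j W_ij x_j) = sum_j W_ij^2 v_j
    return mean_out, var_out
```

Such closed-form moment propagation avoids the Monte Carlo sampling that makes standard Bayesian predictive uncertainty expensive.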

Robust modal regression with direct log-density derivative estimation

no code implementations 18 Oct 2019 Hiroaki Sasaki, Tomoya Sakai, Takafumi Kanamori

In order to apply a gradient method for the maximization, the fundamental challenge is accurate approximation of the gradient of MRR, not MRR itself.

Density Estimation Regression

Fisher Efficient Inference of Intractable Models

1 code implementation NeurIPS 2019 Song Liu, Takafumi Kanamori, Wittawat Jitkrittum, Yu Chen

For example, the asymptotic variance of the MLE attains the asymptotic Cramér-Rao lower bound (efficiency bound), which is the minimum possible variance for an unbiased estimator.

Density Ratio Estimation
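The efficiency claim can be checked numerically in a simple case: for X ~ N(μ, σ²) with known σ, the Fisher information of μ from n samples is n/σ², so the Cramér-Rao bound is σ²/n, and the MLE (the sample mean) attains it. A small Monte Carlo sketch (parameters are illustrative):

```python
import numpy as np

# For X ~ N(mu, sigma^2) with known sigma, the Cramér-Rao lower bound
# for estimating mu from n samples is sigma^2 / n. The MLE is the
# sample mean, and its variance attains that bound (Fisher efficiency).
rng = np.random.default_rng(0)
mu, sigma, n, trials = 2.0, 3.0, 50, 20000
estimates = rng.normal(mu, sigma, size=(trials, n)).mean(axis=1)
crlb = sigma ** 2 / n
print(estimates.var(), crlb)  # the two values should be close
```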

Mode-Seeking Clustering and Density Ridge Estimation via Direct Estimation of Density-Derivative-Ratios

no code implementations 6 Jul 2017 Hiroaki Sasaki, Takafumi Kanamori, Aapo Hyvärinen, Gang Niu, Masashi Sugiyama

Based on the proposed estimator, novel methods both for mode-seeking clustering and density ridge estimation are developed, and the respective convergence rates to the mode and ridge of the underlying density are also established.

Density Estimation

Empirical Localization of Homogeneous Divergences on Discrete Sample Spaces

no code implementations NeurIPS 2015 Takashi Takenouchi, Takafumi Kanamori

In this paper, we propose a novel parameter estimator for probabilistic models on discrete space.

Parallel Distributed Block Coordinate Descent Methods based on Pairwise Comparison Oracle

no code implementations 13 Sep 2014 Kota Matsui, Wataru Kumagai, Takafumi Kanamori

Our algorithm consists of two steps: a direction-estimation step and a search step.
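As a rough illustration of these two steps under a pairwise comparison oracle (a simplified sketch under our own assumptions, not the paper's algorithm): estimate a per-coordinate descent direction by comparing two perturbed points, then accept the move only if the oracle confirms it improves the objective.

```python
import numpy as np

def comparison_oracle(f, x, y):
    """Returns True if f(x) < f(y); the only access to f we allow."""
    return f(x) < f(y)

def cd_with_comparisons(f, x, delta=1e-3, steps=200, lr=0.1):
    """Coordinate descent driven purely by pairwise comparisons."""
    x = np.asarray(x, dtype=float).copy()
    d = len(x)
    for t in range(steps):
        i = t % d                      # cycle through coordinates
        e = np.zeros(d)
        e[i] = delta
        # Direction estimate: which small perturbation lowers f?
        sign = -1.0 if comparison_oracle(f, x - e, x + e) else 1.0
        # Search step: move only if the oracle says it helps.
        cand = x.copy()
        cand[i] += sign * lr
        if comparison_oracle(f, cand, x):
            x = cand
    return x
```

No function values or gradients are ever observed, only binary comparisons, which is the oracle model the paper studies.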

Breakdown Point of Robust Support Vector Machine

no code implementations 3 Sep 2014 Takafumi Kanamori, Shuhei Fujiwara, Akiko Takeda

For learning parameters such as the regularization parameter in our algorithm, we derive a simple formula that guarantees the robustness of the classifier.

Outlier Detection

Affine Invariant Divergences associated with Composite Scores and its Applications

no code implementations 11 May 2013 Takafumi Kanamori, Hironori Fujisawa

By using estimators that are equivariant under affine transformations, one can obtain estimators that do not essentially depend on the choice of the system of units of measurement.

Density-Difference Estimation

no code implementations NeurIPS 2012 Masashi Sugiyama, Takafumi Kanamori, Taiji Suzuki, Marthinus D. Plessis, Song Liu, Ichiro Takeuchi

A naive approach is a two-step procedure of first estimating two densities separately and then computing their difference.

Change Point Detection
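The naive two-step baseline mentioned above can be sketched with two Gaussian kernel density estimates whose difference is taken pointwise (bandwidth, sample sizes, and distributions below are illustrative, not from the paper):

```python
import numpy as np

def kde(samples, x, h=0.3):
    """Gaussian kernel density estimate of the samples, evaluated at x."""
    diffs = (x[:, None] - samples[None, :]) / h
    return np.exp(-0.5 * diffs ** 2).mean(axis=1) / (h * np.sqrt(2 * np.pi))

# Two-step density-difference estimate: estimate p and q separately,
# then subtract. (The paper argues for estimating the difference
# directly instead; this is just the baseline.)
rng = np.random.default_rng(0)
xp = rng.normal(0.0, 1.0, 2000)   # samples from p = N(0, 1)
xq = rng.normal(1.0, 1.0, 2000)   # samples from q = N(1, 1)
grid = np.linspace(-4, 5, 200)
diff = kde(xp, grid) - kde(xq, grid)
```

The drawback of this baseline is that each KDE is tuned to approximate its own density well, not to make the subsequent difference accurate.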

Relative Density-Ratio Estimation for Robust Distribution Comparison

no code implementations NeurIPS 2011 Makoto Yamada, Taiji Suzuki, Takafumi Kanamori, Hirotaka Hachiya, Masashi Sugiyama

Divergence estimators based on direct approximation of density-ratios without going through separate approximation of numerator and denominator densities have been successfully applied to machine learning tasks that involve distribution comparison such as outlier detection, transfer learning, and two-sample homogeneity test.

Density Ratio Estimation Outlier Detection +1
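For intuition on the "relative" part: the α-relative density ratio p/(αp + (1-α)q) is bounded above by 1/α for any α > 0, whereas the plain ratio p/q can diverge, which is what makes the relative version more robust to estimate. A small sketch with two known Gaussians (all names and parameters are illustrative):

```python
import numpy as np

def gauss_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def relative_ratio(x, alpha):
    """alpha-relative density ratio p / (alpha*p + (1-alpha)*q) for
    p = N(0, 1) and q = N(0, 0.5). With alpha = 0 this is the plain
    ratio p/q, which blows up in the tails; any alpha > 0 caps it
    at 1/alpha."""
    p = gauss_pdf(x, 0.0, 1.0)
    q = gauss_pdf(x, 0.0, 0.5)
    return p / (alpha * p + (1 - alpha) * q)

x = np.linspace(-5, 5, 1001)
print(relative_ratio(x, 0.0).max())  # plain ratio: huge in the tails
print(relative_ratio(x, 0.1).max())  # relative ratio: at most 1/0.1 = 10
```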

Condition Number Analysis of Kernel-based Density Ratio Estimation

1 code implementation 15 Dec 2009 Takafumi Kanamori, Taiji Suzuki, Masashi Sugiyama

We show that the kernel least-squares method has a smaller condition number than a version of kernel mean matching and other M-estimators, implying that the kernel least-squares method has preferable numerical properties.

Density Ratio Estimation Outlier Detection
