no code implementations • 6 Jan 2021 • Hiroaki Sasaki, Takashi Takenouchi

We propose a practical method based on outlier-robust density-ratio estimation, which can be seen as performing MI maximization, nonlinear ICA, or nonlinear subspace estimation.

no code implementations • 1 Nov 2019 • Hiroaki Sasaki, Takashi Takenouchi, Ricardo Monti, Aapo Hyvärinen

We develop two robust nonlinear ICA methods based on the $\gamma$-divergence, which is a robust alternative to the KL divergence in logistic regression.
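As a rough illustration of the idea, one common form of the $\gamma$-cross-entropy loss for binary logistic regression can be sketched as follows (a hedged sketch only; the paper's exact estimator may differ, and all names below are ours):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gamma_logistic_loss(w, X, y, gamma=0.5):
    # gamma-cross-entropy for a binary logistic model p(y=1|x) = sigmoid(w.x);
    # as gamma -> 0 this recovers the usual negative log-likelihood.
    p1 = sigmoid(X @ w)
    p_obs = np.where(y == 1, p1, 1.0 - p1)        # probability of observed label
    # per-point normalizer over both classes
    norm = (p1 ** (1 + gamma) + (1 - p1) ** (1 + gamma)) ** (gamma / (1 + gamma))
    # badly-fit points contribute p_obs**gamma, which is bounded, so a few
    # outliers cannot dominate the objective the way they do in the log-loss
    return -np.log(np.mean(p_obs ** gamma / norm)) / gamma

# toy check: corrupting one label raises the loss only moderately
w = np.array([2.0])
X = np.array([[1.0], [2.0], [-1.0], [-2.0]])
y_clean = np.array([1, 1, 0, 0])
y_flipped = np.array([1, 1, 0, 1])                # last label corrupted
loss_clean = gamma_logistic_loss(w, X, y_clean)
loss_flipped = gamma_logistic_loss(w, X, y_flipped)
```

The key design point is that the average sits inside the logarithm, so each sample's influence on the loss is bounded.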

no code implementations • 18 Oct 2019 • Hiroaki Sasaki, Tomoya Sakai, Takafumi Kanamori

In order to apply a gradient method for the maximization, the fundamental challenge is to accurately approximate the gradient of MRR, not MRR itself.

no code implementations • 5 Jun 2018 • Hiroaki Sasaki, Aapo Hyvärinen

Among existing methods, non-parametric and/or kernel-based methods are often difficult to use on large datasets, while methods based on neural networks usually make restrictive parametric assumptions on the probability densities.

1 code implementation • 22 May 2018 • Aapo Hyvärinen, Hiroaki Sasaki, Richard E. Turner

Here, we propose a general framework for nonlinear ICA, which, as a special case, can make use of temporal structure.

no code implementations • 6 Jul 2017 • Hiroaki Sasaki, Takafumi Kanamori, Aapo Hyvärinen, Gang Niu, Masashi Sugiyama

Based on the proposed estimator, novel methods both for mode-seeking clustering and density ridge estimation are developed, and the respective convergence rates to the mode and ridge of the underlying density are also established.

1 code implementation • 3 Mar 2016 • Hiroaki Shiino, Hiroaki Sasaki, Gang Niu, Masashi Sugiyama

Non-Gaussian component analysis (NGCA) is an unsupervised linear dimension reduction method that extracts low-dimensional non-Gaussian "signals" from high-dimensional data contaminated with Gaussian noise.
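In common notation (ours, not necessarily the papers'), the NGCA model underlying this and the next entry can be written as

```latex
x = A s + n, \qquad n \sim \mathcal{N}(0, \Sigma),
```

where $x \in \mathbb{R}^d$ is the observation, $s \in \mathbb{R}^m$ with $m < d$ is the non-Gaussian "signal", and $n$ is Gaussian noise; NGCA seeks the $m$-dimensional subspace carrying $s$.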

no code implementations • 28 Jan 2016 • Hiroaki Sasaki, Gang Niu, Masashi Sugiyama

Non-Gaussian component analysis (NGCA) is aimed at identifying a linear subspace such that the projected data follows a non-Gaussian distribution.

no code implementations • 5 Aug 2015 • Voot Tangkaratt, Hiroaki Sasaki, Masashi Sugiyama

Quadratic MI (QMI) is a variant of MI based on the $L_2$ distance, which is more robust against outliers than the KL divergence; a computationally efficient method for estimating QMI from data, called least-squares QMI (LSQMI), was recently proposed.
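For reference, QMI measures the squared $L_2$ distance between the joint density and the product of the marginals (standard definition; notation ours):

```latex
\mathrm{QMI}(X, Y) = \iint \bigl( p(x, y) - p(x)\, p(y) \bigr)^2 \, \mathrm{d}x \, \mathrm{d}y.
```

Since no log-ratio of densities appears, the integrand stays bounded where densities are small, which is the intuition behind its robustness to outliers compared with KL-based MI.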

no code implementations • 1 Aug 2015 • Ikko Yamane, Hiroaki Sasaki, Masashi Sugiyama

Log-density gradient estimation is a fundamental statistical problem and possesses various practical applications such as clustering and measuring non-Gaussianity.
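As a toy illustration of the quantity being estimated: for a multivariate Gaussian the log-density gradient has the closed form $\nabla \log p(x) = -\Sigma^{-1}(x - \mu)$, which makes a numerical sanity check easy (a generic example, not the paper's estimator):

```python
import numpy as np

# Gaussian parameters for the toy check
mu = np.array([1.0, -0.5])
Sigma = np.array([[2.0, 0.3], [0.3, 1.0]])
Sinv = np.linalg.inv(Sigma)

def log_density(x):
    # log p(x) up to an additive constant, which the gradient ignores
    d = x - mu
    return -0.5 * d @ Sinv @ d

def analytic_grad(x):
    # closed-form log-density gradient of the Gaussian
    return -Sinv @ (x - mu)

# central finite differences should match the closed form
x0 = np.array([0.2, 0.7])
eps = 1e-6
num_grad = np.array([
    (log_density(x0 + eps * e) - log_density(x0 - eps * e)) / (2 * eps)
    for e in np.eye(2)
])
```

For non-Gaussian densities no such closed form exists, which is what makes direct estimation of the gradient from samples useful.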

no code implementations • 18 Jun 2015 • Hiroaki Sasaki, Michael U. Gutmann, Hayaru Shouno, Aapo Hyvärinen

The precision matrix of the linear components is assumed to be randomly generated by a higher-order process and explicitly parametrized by a parameter matrix.

no code implementations • 30 Jun 2014 • Hiroaki Sasaki, Yung-Kyun Noh, Masashi Sugiyama

Estimation of density derivatives is a versatile tool in statistical data analysis.

no code implementations • 20 Apr 2014 • Hiroaki Sasaki, Aapo Hyvärinen, Masashi Sugiyama

We then develop a mean-shift-like fixed-point algorithm to find the modes of the density for clustering.
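The classical fixed point being referred to is the Gaussian-kernel mean-shift update $x \leftarrow \sum_i w_i x_i / \sum_i w_i$. A minimal textbook sketch follows (the paper's variant estimates the update more directly from data; this generic KDE-based version is for orientation only):

```python
import numpy as np

def mean_shift_mode(x0, data, h=0.5, n_iter=100):
    # Gaussian-kernel mean-shift fixed point: at each step, move x to the
    # kernel-weighted mean of the data, with w_i = exp(-||x_i - x||^2 / (2 h^2)).
    # Iterating climbs toward a mode of the kernel density estimate.
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        w = np.exp(-np.sum((data - x) ** 2, axis=1) / (2.0 * h ** 2))
        x = w @ data / w.sum()
    return x

# two well-separated point clouds; starting near the first, the iteration
# converges to roughly that cluster's mode
data = np.array([[0.0, 0.0], [0.1, 0.0], [-0.1, 0.0],
                 [5.0, 5.0], [5.1, 5.0], [4.9, 5.0]])
mode = mean_shift_mode(np.array([0.5, 0.5]), data)
```

Running this from each data point and grouping points whose iterations converge to the same mode yields the clustering.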
