Search Results for author: Akira Nakagawa

Found 4 papers, 0 papers with code

Toward Unlimited Self-Learning MCMC with Parallel Adaptive Annealing

no code implementations25 Nov 2022 Yuma Ichikawa, Akira Nakagawa, Hiromoto Masayuki, Yuhei Umeda

However, SLMC methods are difficult to apply directly to multimodal distributions, for which training data are difficult to obtain.

Self-Learning

Quantitative Understanding of VAE as a Non-linearly Scaled Isometric Embedding

no code implementations30 Jul 2020 Akira Nakagawa, Keizo Kato, Taiji Suzuki

According to rate-distortion theory, optimal transform coding is achieved by using an orthonormal transform with a PCA basis, where the transform space is isometric to the input.
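A minimal sketch (not taken from the paper) of the rate-distortion statement above: projecting centered data onto an orthonormal PCA basis is an isometry, so distances in the transform space match distances in the input space.

```python
# Minimal illustration (assumption, not the authors' code): an orthonormal
# PCA basis preserves Euclidean distances, i.e. the transform space is
# isometric to the input space.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8))        # toy data, 8-dimensional
Xc = X - X.mean(axis=0)               # center before PCA

# PCA basis from the eigendecomposition of the covariance matrix
cov = Xc.T @ Xc / len(Xc)
_, U = np.linalg.eigh(cov)            # columns of U form an orthonormal basis

Z = Xc @ U                            # transform coding: project onto the PCA basis

# Isometry check: pairwise distances are unchanged by the transform
d_x = np.linalg.norm(Xc[0] - Xc[1])
d_z = np.linalg.norm(Z[0] - Z[1])
assert np.isclose(d_x, d_z)
```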

Rate-Distortion Optimization Guided Autoencoder for Isometric Embedding in Euclidean Latent Space

no code implementations ICML 2020 Keizo Kato, Jing Zhou, Tomotake Sasaki, Akira Nakagawa

We show our method has the following properties: (i) the Jacobian matrix between the input space and a Euclidean latent space forms a constantly-scaled orthonormal system and enables isometric data embedding; (ii) the relation of PDFs in both spaces can become a tractable one, such as a proportional relation.
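A small numerical sketch (an assumption-based toy example, not the authors' code) of why property (i) leads to property (ii): if the map from input to latent space has a constantly-scaled orthonormal Jacobian J = c·Q, then |det J| = c^d is constant, so the change-of-variables formula p_x(x) = p_z(z)·|det J| makes the two densities proportional.

```python
# Toy check (assumption, not from the paper): a linear map with a
# constantly-scaled orthonormal Jacobian yields proportional densities
# in input and latent space.
import numpy as np
from scipy.stats import multivariate_normal

d, c = 3, 2.0
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.normal(size=(d, d)))   # random orthonormal matrix
J = c * Q                                      # constantly-scaled orthonormal Jacobian

x = rng.normal(size=(5, d))                    # a few input points, x ~ N(0, I)
z = x @ J.T                                    # latent points z = J x, so z ~ N(0, c^2 I)

p_x = multivariate_normal(mean=np.zeros(d), cov=np.eye(d)).pdf        # input density
p_z = multivariate_normal(mean=np.zeros(d), cov=c**2 * np.eye(d)).pdf # latent density

# Change of variables with |det J| = c**d: p_x(x) = p_z(z) * c**d at every point
assert np.allclose(p_x(x), p_z(z) * c**d)
```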

Relation, Unsupervised Anomaly Detection

Rate-Distortion Optimization Guided Autoencoder for Generative Approach

no code implementations25 Sep 2019 Keizo Kato, Jing Zhou, Akira Nakagawa

In the generative-model approach to machine learning, it is essential to acquire an accurate probabilistic model and to reduce the dimensionality of the data so that it can be handled easily.

Unsupervised Anomaly Detection
