Search Results for author: Shotaro Akaho

Found 10 papers, 0 papers with code

Geometry of EM and related iterative algorithms

no code implementations · 3 Sep 2022 · Hideitsu Hino, Shotaro Akaho, Noboru Murata

The Expectation–Maximization (EM) algorithm is a simple meta-algorithm that has long been used as a methodology for statistical inference when the observed data contain missing measurements, or when the data consist of observables and unobservables.
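As a reminder of the mechanics the paper studies geometrically, here is a minimal EM sketch for a two-component 1D Gaussian mixture (an illustrative example only, not code from the paper):

```python
import numpy as np

def em_gmm_1d(x, n_iter=50):
    """EM for a two-component 1D Gaussian mixture (illustrative sketch)."""
    # Initialize: mixing weight of component 1, means, variances
    pi = 0.5
    mu = np.array([x.min(), x.max()])
    var = np.array([x.var(), x.var()])
    for _ in range(n_iter):
        # E-step: posterior responsibility of component 1 for each point
        p0 = (1 - pi) * np.exp(-(x - mu[0])**2 / (2 * var[0])) / np.sqrt(2 * np.pi * var[0])
        p1 = pi * np.exp(-(x - mu[1])**2 / (2 * var[1])) / np.sqrt(2 * np.pi * var[1])
        r = p1 / (p0 + p1)
        # M-step: re-estimate parameters from the responsibilities
        pi = r.mean()
        mu = np.array([np.sum((1 - r) * x) / np.sum(1 - r),
                       np.sum(r * x) / np.sum(r)])
        var = np.array([np.sum((1 - r) * (x - mu[0])**2) / np.sum(1 - r),
                        np.sum(r * (x - mu[1])**2) / np.sum(r)])
    return pi, mu, var
```

Each iteration alternates a projection-like E-step (compute responsibilities under current parameters) and M-step (maximize the expected complete-data likelihood), which is the alternation the paper interprets geometrically.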

Full-Span Log-Linear Model and Fast Learning Algorithm

no code implementations · 17 Feb 2022 · Kazuya Takabatake, Shotaro Akaho

The FSLL model is a "highest-order" Boltzmann machine; nevertheless, we can compute the dual parameters of the model distribution, which play important roles in exponential families, in $O(|X|\log|X|)$ time.
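The paper's own algorithm is not reproduced here, but the $O(|X|\log|X|)$ complexity can be illustrated with a fast Walsh–Hadamard transform, which evaluates all $2^n$ character expectations of a distribution over $X=\{0,1\}^n$ in one pass (a generic sketch, assuming a transform of this kind is what is involved):

```python
import numpy as np

def fwht(a):
    """Fast Walsh-Hadamard transform, O(N log N) for N = 2^n.
    Equivalent to multiplying by H[i, j] = (-1)^popcount(i & j)."""
    a = a.astype(float).copy()
    h = 1
    while h < len(a):
        for i in range(0, len(a), 2 * h):
            for j in range(i, i + h):
                a[j], a[j + h] = a[j] + a[j + h], a[j] - a[j + h]
        h *= 2
    return a

# All expectations eta_S = E_p[(-1)^<S, x>] for a distribution p over
# X = {0,1}^n, computed at once in O(|X| log |X|) instead of O(|X|^2).
n = 3
p = np.random.dirichlet(np.ones(2**n))  # a distribution over 2^3 = 8 states
eta = fwht(p)                           # eta[0] = sum(p) = 1 by construction
```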

Principal component analysis for Gaussian process posteriors

no code implementations · 15 Jul 2021 · Hideaki Ishibashi, Shotaro Akaho

This paper proposes an extension of principal component analysis for Gaussian process (GP) posteriors, denoted by GP-PCA.

Meta-Learning · Variational Inference

Reconsidering Dependency Networks from an Information Geometry Perspective

no code implementations · 2 Jul 2021 · Kazuya Takabatake, Shotaro Akaho

Like Bayesian networks, the structure of a dependency network is represented by a directed graph, and each node has a conditional probability table.

Pathological spectra of the Fisher information metric and its variants in deep neural networks

no code implementations · 14 Oct 2019 · Ryo Karakida, Shotaro Akaho, Shun-ichi Amari

The Fisher information matrix (FIM) plays an essential role in statistics and machine learning as a Riemannian metric tensor or a component of the Hessian matrix of loss functions.

On a convergence property of a geometrical algorithm for statistical manifolds

no code implementations · 27 Sep 2019 · Shotaro Akaho, Hideitsu Hino, Noboru Murata

In this paper, we examine a geometrical projection algorithm for statistical inference.

Relation

The Normalization Method for Alleviating Pathological Sharpness in Wide Neural Networks

no code implementations · NeurIPS 2019 · Ryo Karakida, Shotaro Akaho, Shun-ichi Amari

Thus, we can conclude that batch normalization in the last layer significantly contributes to decreasing the sharpness induced by the FIM.

Universal Statistics of Fisher Information in Deep Neural Networks: Mean Field Approach

no code implementations · 4 Jun 2018 · Ryo Karakida, Shotaro Akaho, Shun-ichi Amari

The Fisher information matrix (FIM) is a fundamental quantity to represent the characteristics of a stochastic model, including deep neural networks (DNNs).
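As a concrete toy illustration of the FIM as an object, a categorical model $p=\mathrm{softmax}(\theta)$ has the standard closed form $F=\operatorname{diag}(p)-pp^\top$; the paper's mean-field analysis of DNN spectra goes far beyond this sketch:

```python
import numpy as np

def softmax_fim(theta):
    """Fisher information matrix of a categorical model p = softmax(theta).
    For this exponential family, F = diag(p) - p p^T (a standard closed form)."""
    p = np.exp(theta - theta.max())  # subtract max for numerical stability
    p /= p.sum()
    return np.diag(p) - np.outer(p, p)

theta = np.array([1.0, 0.0, -1.0])
F = softmax_fim(theta)
# F is symmetric positive semidefinite and singular: the all-ones vector is a
# null direction, reflecting the shift invariance of softmax parameters.
eigvals = np.linalg.eigvalsh(F)
```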

A kernel method for canonical correlation analysis

no code implementations · 13 Sep 2006 · Shotaro Akaho

Canonical correlation analysis is a technique for extracting common features from a pair of multivariate data sets.
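A minimal sketch of regularized kernel CCA in its dual form, using linear kernels for brevity (the exact formulation in the paper may differ):

```python
import numpy as np

def kernel_cca(X, Y, reg=0.1):
    """Leading canonical correlation via regularized kernel CCA (sketch).
    Uses linear kernels; swapping in another Gram matrix gives the kernel case."""
    n = X.shape[0]
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    Kx, Ky = Xc @ Xc.T, Yc @ Yc.T  # centered Gram matrices (linear kernel)
    # Regularized generalized eigenproblem for the dual coefficients:
    # (Kx + reg I)^-1 Ky (Ky + reg I)^-1 Kx  has leading eigenvalue rho^2.
    A = np.linalg.solve(Kx + reg * np.eye(n), Ky)
    B = np.linalg.solve(Ky + reg * np.eye(n), Kx)
    rho2 = np.max(np.linalg.eigvals(A @ B).real)
    return float(np.sqrt(np.clip(rho2, 0.0, 1.0)))
```

The regularization term `reg` is essential in the kernel setting: without it, the Gram matrices are typically invertible enough that any two samples appear perfectly correlated.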
