Search Results for author: Futoshi Futami

Found 8 papers, 1 paper with code

Accelerating the diffusion-based ensemble sampling by non-reversible dynamics

no code implementations ICML 2020 Futoshi Futami, Issei Sato, Masashi Sugiyama

Compared with naive parallel-chain SGLD, which updates multiple particles independently, ensemble methods update particles through their interactions (a minimal sketch of the contrast follows below).

Bayesian Inference
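
A rough sketch of the contrast described in the snippet above, not the paper's non-reversible dynamics: the target density, step size, and the simple mean-attraction coupling are all illustrative assumptions.

```python
import numpy as np

def grad_log_p(x):
    # Hypothetical target: standard Gaussian, so grad log p(x) = -x.
    return -x

def parallel_chain_sgld_step(particles, step=1e-2, rng=np.random):
    noise = rng.normal(size=particles.shape)
    # Each particle is updated using only its own gradient (independent chains).
    return particles + step * grad_log_p(particles) + np.sqrt(2 * step) * noise

def interacting_ensemble_step(particles, step=1e-2, rng=np.random):
    noise = rng.normal(size=particles.shape)
    # Illustrative interaction: pull each particle toward the ensemble mean in
    # addition to its own gradient; real ensemble samplers use more principled
    # couplings (e.g., preconditioning or non-reversible drift terms).
    coupling = particles.mean(axis=0, keepdims=True) - particles
    return particles + step * (grad_log_p(particles) + 0.1 * coupling) \
        + np.sqrt(2 * step) * noise

particles = np.random.normal(size=(16, 2))   # 16 particles in 2 dimensions
for _ in range(1000):
    particles = interacting_ensemble_step(particles)
```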

Information-theoretic Analysis of Test Data Sensitivity in Uncertainty

no code implementations 23 Jul 2023 Futoshi Futami, Tomoharu Iwata

Furthermore, we extend the existing analysis of Bayesian meta-learning and show novel sensitivities among tasks for the first time.

Bayesian Inference Meta-Learning +1

Variational Inference based on Robust Divergences

1 code implementation 18 Oct 2017 Futoshi Futami, Issei Sato, Masashi Sugiyama

In this paper, based on Zellner's optimization and variational formulation of Bayesian inference, we propose an outlier-robust pseudo-Bayesian variational method by replacing the Kullback-Leibler divergence used for data fitting with a robust divergence such as the beta- or gamma-divergence (a minimal sketch of the beta-divergence data-fit term follows below).

Bayesian Inference Variational Inference
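
A minimal sketch of the beta-divergence (density power) data-fit term that can stand in for the expected log-likelihood in a variational objective; as beta approaches 0 it recovers the usual log-likelihood up to constants. The 1-D Gaussian model and the parameter values here are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)

def beta_loss(x, mu, sigma, beta=0.1):
    # First term: rewards high model density at the observation, but saturates
    # for outliers because the density is raised to the power beta.
    fit = -(1.0 / beta) * gaussian_pdf(x, mu, sigma) ** beta
    # Second term: closed-form integral of the Gaussian density raised to the
    # power (beta + 1), which keeps the loss a proper scoring rule.
    integral = (2 * np.pi * sigma ** 2) ** (-beta / 2) / np.sqrt(beta + 1)
    return fit + integral / (beta + 1)

# Outliers contribute a bounded penalty, unlike the negative log-likelihood.
data = np.array([0.1, -0.2, 0.05, 8.0])   # last point is an outlier
print(beta_loss(data, mu=0.0, sigma=1.0).mean())
```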

Expectation Propagation for t-Exponential Family Using Q-Algebra

no code implementations NeurIPS 2017 Futoshi Futami, Issei Sato, Masashi Sugiyama

Exponential family distributions are highly useful in machine learning since calculations with them can be performed efficiently through natural parameters.
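
A minimal illustration of that efficiency for the ordinary exponential family only (the paper itself extends such manipulations to the t-exponential family via q-algebra): multiplying exponential-family factors, as expectation propagation does repeatedly, reduces to adding natural parameters. The 1-D Gaussian parameterization below is an assumption for illustration.

```python
import numpy as np

def to_natural(mu, var):
    # Gaussian natural parameters: (eta1, eta2) = (mu / var, -1 / (2 * var)).
    return np.array([mu / var, -0.5 / var])

def to_mean(eta):
    var = -0.5 / eta[1]
    return eta[0] * var, var

# Product of two Gaussian densities (up to normalization): add natural params.
eta = to_natural(mu=1.0, var=2.0) + to_natural(mu=-0.5, var=0.5)
mu, var = to_mean(eta)
print(mu, var)   # parameters of the (unnormalized) product density
```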
