Search Results for author: Motonobu Kanagawa

Found 16 papers, 2 papers with code

Variable Selection in Maximum Mean Discrepancy for Interpretable Distribution Comparison

no code implementations • 2 Nov 2023 • Kensuke Mitsuzawa, Motonobu Kanagawa, Stefano Bortoli, Margherita Grossi, Paolo Papotti

For this optimisation, we introduce sparse regularisation and propose two methods for dealing with the issue of selecting an appropriate regularisation parameter.

Causal Inference · Two-sample testing · +1
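As a rough illustration of the underlying idea (not the paper's actual optimisation procedure; the function names and the ARD-weighted Gaussian kernel below are assumptions for the sketch), one can put a weight on each input dimension inside the kernel and observe how the weights determine which variables drive the MMD:

```python
import numpy as np

def ard_gram(X, Y, a):
    """Gaussian Gram matrix with per-dimension ARD weights a (>= 0)."""
    Xw, Yw = X * a, Y * a                               # scale each coordinate
    d2 = ((Xw[:, None, :] - Yw[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2)

def mmd2_biased(X, Y, a):
    """Biased estimate of MMD^2 between samples X and Y."""
    return (ard_gram(X, X, a).mean()
            + ard_gram(Y, Y, a).mean()
            - 2 * ard_gram(X, Y, a).mean())

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
Y = rng.normal(size=(200, 5))
Y[:, 0] += 1.0                       # the distributions differ only in dimension 0

# Weight concentrated on the discriminative coordinate yields a large MMD;
# weight on the remaining coordinates yields an MMD near zero:
print(mmd2_biased(X, Y, np.array([1.0, 0, 0, 0, 0])))
print(mmd2_biased(X, Y, np.array([0.0, 1, 1, 1, 1])))
```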

Improved Random Features for Dot Product Kernels

no code implementations • 21 Jan 2022 • Jonas Wacker, Motonobu Kanagawa, Maurizio Filippone

These variance formulas elucidate conditions under which certain approximations (e.g., TensorSRHT) achieve lower variances than others (e.g., Rademacher sketches), and conditions under which the use of complex features leads to lower variances than real features.

Recommendation Systems
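For context, a minimal sketch of one of the classical baselines mentioned above: a Rademacher product sketch whose random features are unbiased for a polynomial (dot-product) kernel. The function name is hypothetical, and the paper's improved features and variance formulas are not reproduced here:

```python
import numpy as np

def rademacher_poly_features(X, degree, D, rng):
    """Random features with E[phi(x) . phi(y)] = (x^T y)^degree.

    Each feature is a product of `degree` independent Rademacher
    projections; averaging D of them reduces the variance.
    """
    n, d = X.shape
    Z = np.ones((n, D))
    for _ in range(degree):
        W = rng.choice([-1.0, 1.0], size=(d, D))   # Rademacher sketch
        Z *= X @ W
    return Z / np.sqrt(D)

rng = np.random.default_rng(0)
XY = rng.normal(size=(2, 4))                        # a pair of inputs x, y
Z = rademacher_poly_features(XY, degree=3, D=50000, rng=rng)
print(float(Z[0] @ Z[1]))                           # approximates (x^T y)^3
print(float(XY[0] @ XY[1]) ** 3)                    # exact value
```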

Intergenerational risk sharing in a Defined Contribution pension system: analysis with Bayesian optimization

no code implementations • 25 Jun 2021 • An Chen, Motonobu Kanagawa, Fangyuan Zhang

We study a fully funded, collective defined-contribution (DC) pension system with multiple overlapping generations.

Bayesian Optimization
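The pension analysis itself cannot be reproduced in a few lines, so the following is only a generic sketch of the Bayesian optimisation loop the paper relies on: a numpy Gaussian-process surrogate plus expected improvement maximised on a grid. The one-dimensional toy objective f and all settings are assumptions for illustration:

```python
import numpy as np
from scipy.stats import norm

def rbf(A, B, ls=0.2):
    return np.exp(-0.5 * ((A[:, None] - B[None, :]) / ls) ** 2)

def gp_posterior(Xtr, ytr, Xte, noise=1e-6):
    """Posterior mean and std of a zero-mean GP with an RBF kernel."""
    K = rbf(Xtr, Xtr) + noise * np.eye(len(Xtr))
    Ks = rbf(Xtr, Xte)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, ytr))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = 1.0 - (v ** 2).sum(0)                 # rbf(x, x) = 1
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sd, best):
    z = (best - mu) / sd                        # minimisation convention
    return (best - mu) * norm.cdf(z) + sd * norm.pdf(z)

f = lambda x: np.sin(3 * x) + x ** 2            # stand-in for an expensive objective
grid = np.linspace(-1, 1, 401)
X = np.array([-0.9, 0.0, 0.8])
y = f(X)
for _ in range(10):
    mu, sd = gp_posterior(X, y, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sd, y.min()))]
    X, y = np.append(X, x_next), np.append(y, f(x_next))
print(X[np.argmin(y)], y.min())                 # best point found
```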

Connections and Equivalences between the Nyström Method and Sparse Variational Gaussian Processes

no code implementations • 2 Jun 2021 • Veit Wild, Motonobu Kanagawa, Dino Sejdinovic

We investigate the connections between sparse approximation methods for making kernel methods and Gaussian processes (GPs) scalable to large-scale data, focusing on the Nyström method and the Sparse Variational Gaussian Processes (SVGP).

Gaussian Processes
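A minimal sketch of the shared building block, the Nyström low-rank approximation K ≈ K_nm K_mm^{-1} K_mn from a subset of inducing points. Uniform subsampling and the jitter value are assumptions here, and the SVGP side (variational optimisation of the inducing points) is not shown:

```python
import numpy as np

def rbf(A, B, ls=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))
Z = X[rng.choice(len(X), size=50, replace=False)]      # inducing points

Kmm = rbf(Z, Z) + 1e-8 * np.eye(len(Z))                # jitter for stability
Knm = rbf(X, Z)
K_nystrom = Knm @ np.linalg.solve(Kmm, Knm.T)          # rank-50 approximation

K = rbf(X, X)
print(np.linalg.norm(K - K_nystrom) / np.linalg.norm(K))   # relative error
```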

Convergence Guarantees for Adaptive Bayesian Quadrature Methods

no code implementations • NeurIPS 2019 • Motonobu Kanagawa, Philipp Hennig

Adaptive Bayesian quadrature (ABQ) is a powerful approach to numerical integration that empirically compares favorably with Monte Carlo integration on problems of medium dimensionality (where non-adaptive quadrature is not competitive).

Numerical Integration

Simulator Calibration under Covariate Shift with Kernels

no code implementations • 21 Sep 2018 • Keiichi Kisamori, Motonobu Kanagawa, Keisuke Yamazaki

We propose a novel calibration method for computer simulators, dealing with the problem of covariate shift.

Bayesian Inference
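The paper's kernel-based calibration method is more involved than can be shown here; the following sketch only illustrates the covariate-shift ingredient, importance-weighting a training objective by the density ratio p_test/p_train, with both densities assumed Gaussian and known purely for the sake of the example:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
# Training covariates and the region where predictions are needed differ:
x_train = rng.normal(0.0, 1.0, size=500)
mu_test, sd_test = 1.0, 0.7                       # assumed known test distribution

# Importance weights w(x) = p_test(x) / p_train(x) re-weight the training loss:
w = norm.pdf(x_train, mu_test, sd_test) / norm.pdf(x_train, 0.0, 1.0)

y_train = np.sin(x_train) + 0.1 * rng.normal(size=500)
# Weighted least squares for a linear fit, emphasising the test region
# (rows scaled by sqrt(w) so the squared loss carries the weights):
A = np.stack([np.ones_like(x_train), x_train], axis=1)
coef, *_ = np.linalg.lstsq(A * w[:, None] ** 0.5, y_train * w ** 0.5, rcond=None)
print(coef)                                       # fit tuned to where p_test puts mass
```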

Gaussian Processes and Kernel Methods: A Review on Connections and Equivalences

no code implementations • 6 Jul 2018 • Motonobu Kanagawa, Philipp Hennig, Dino Sejdinovic, Bharath K. Sriperumbudur

This paper is an attempt to bridge the conceptual gaps between researchers working on the two widely used approaches based on positive definite kernels: Bayesian learning or inference using Gaussian processes on the one side, and frequentist kernel methods based on reproducing kernel Hilbert spaces on the other.

Gaussian Processes · regression
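One equivalence of the kind the review covers can be checked numerically in a few lines: the GP regression posterior mean with noise variance σ² coincides with the kernel ridge regression estimate with ridge parameter λ = σ² (under the sum-of-squared-errors convention). A minimal sketch with toy data and arbitrary hyperparameters:

```python
import numpy as np

def rbf(A, B, ls=0.5):
    return np.exp(-0.5 * ((A[:, None] - B[None, :]) / ls) ** 2)

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, 30)
y = np.sin(3 * X) + 0.1 * rng.normal(size=30)
Xs = np.linspace(-1, 1, 7)                      # test inputs
sigma2 = 0.01                                   # GP observation-noise variance

# GP posterior mean: k(x*, X) (K + sigma2 I)^{-1} y
gp_mean = rbf(Xs, X) @ np.linalg.solve(rbf(X, X) + sigma2 * np.eye(30), y)

# Kernel ridge regression: representer-theorem solution with lambda = sigma2
alpha = np.linalg.solve(rbf(X, X) + sigma2 * np.eye(30), y)
krr_mean = rbf(Xs, X) @ alpha

print(np.max(np.abs(gp_mean - krr_mean)))       # 0: the two estimators coincide
```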

Counterfactual Mean Embeddings

no code implementations • 22 May 2018 • Krikamol Muandet, Motonobu Kanagawa, Sorawit Saengkyongam, Sanparith Marukatat

In this work, we propose to model counterfactual distributions using a novel Hilbert space representation called counterfactual mean embedding (CME).

counterfactual · Counterfactual Inference · +4
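The full CME uses conditional mean embeddings to adjust for covariates, which is beyond this sketch; shown here is only the basic machinery it builds on: empirical kernel mean embeddings of outcome distributions under two treatments, and the RKHS distance (squared MMD) between them, on synthetic data:

```python
import numpy as np

def rbf(A, B, ls=1.0):
    return np.exp(-0.5 * ((A[:, None] - B[None, :]) / ls) ** 2)

rng = np.random.default_rng(0)
y_control = rng.normal(0.0, 1.0, 300)     # outcomes without treatment
y_treated = rng.normal(0.5, 1.0, 300)     # outcomes with treatment

# Empirical kernel mean embedding mu_P(.) = mean_i k(y_i, .): a single
# point in the RKHS representing the whole outcome distribution.
grid = np.linspace(-4, 5, 200)
mu_c = rbf(grid, y_control).mean(axis=1)
mu_t = rbf(grid, y_treated).mean(axis=1)

# Squared RKHS distance between the two embeddings (squared MMD):
mmd2 = (rbf(y_treated, y_treated).mean() + rbf(y_control, y_control).mean()
        - 2 * rbf(y_treated, y_control).mean())
print(mmd2)
```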

Kernel Recursive ABC: Point Estimation with Intractable Likelihood

no code implementations • ICML 2018 • Takafumi Kajihara, Motonobu Kanagawa, Keisuke Yamazaki, Kenji Fukumizu

We propose a novel approach to parameter estimation for simulator-based statistical models with intractable likelihood.
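For orientation only: the paper's kernel recursive ABC is not shown here, but the intractable-likelihood setting it targets is the one addressed by plain rejection ABC, which needs nothing beyond forward simulation. Tolerance, prior, and summary statistic below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = 2.0
data = rng.normal(theta_true, 1.0, size=100)   # pretend the likelihood is unknown
obs_stat = data.mean()                          # observed summary statistic

# Rejection ABC: simulate from the model, keep parameters whose simulated
# summary statistic lands within a tolerance of the observed one.
accepted = []
for _ in range(20000):
    theta = rng.uniform(-5, 5)                  # prior draw
    sim = rng.normal(theta, 1.0, size=100)      # forward simulation only
    if abs(sim.mean() - obs_stat) < 0.05:       # tolerance epsilon
        accepted.append(theta)
print(np.mean(accepted), len(accepted))         # approximate posterior mean
```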

Convergence Analysis of Deterministic Kernel-Based Quadrature Rules in Misspecified Settings

1 code implementation • 1 Sep 2017 • Motonobu Kanagawa, Bharath K. Sriperumbudur, Kenji Fukumizu

This paper presents a convergence analysis of kernel-based quadrature rules in misspecified settings, focusing on deterministic quadrature in Sobolev spaces.

Large sample analysis of the median heuristic

1 code implementation • 23 Jul 2017 • Damien Garreau, Wittawat Jitkrittum, Motonobu Kanagawa

In kernel methods, the median heuristic has been widely used as a way of setting the bandwidth of RBF kernels.
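A minimal sketch of the heuristic itself; note that conventions vary (median of pairwise distances versus median of squared distances), and the variant below is one common choice:

```python
import numpy as np

def median_heuristic_bandwidth(X):
    """Set the RBF bandwidth to the median pairwise distance of the data."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.sqrt(np.median(d2[np.triu_indices_from(d2, k=1)]))

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
h = median_heuristic_bandwidth(X)
# RBF kernel with the heuristic bandwidth:
K = np.exp(-((X[:, None, :] - X[None, :, :]) ** 2).sum(-1) / (2 * h ** 2))
print(h)
```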

Convergence guarantees for kernel-based quadrature rules in misspecified settings

no code implementations • NeurIPS 2016 • Motonobu Kanagawa, Bharath K. Sriperumbudur, Kenji Fukumizu

Kernel-based quadrature rules are becoming important in machine learning and statistics, as they achieve super-√n convergence rates in numerical integration, and thus provide alternatives to Monte Carlo integration in challenging settings where integrands are expensive to evaluate or where integrands are high dimensional.

Numerical Integration
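A minimal sketch of a kernel-based quadrature rule in the well-specified case, using the closed-form kernel mean embedding of a Gaussian kernel against N(0, 1); the nodes, lengthscale, and integrand are arbitrary illustrative choices:

```python
import numpy as np

l = 0.8                                          # Gaussian kernel lengthscale
rng = np.random.default_rng(0)
X = rng.normal(size=25)                          # quadrature nodes

K = np.exp(-(X[:, None] - X[None, :]) ** 2 / (2 * l ** 2)) + 1e-10 * np.eye(25)
# Closed-form kernel mean embedding of N(0, 1):
# z_i = int k(x, X_i) dN(x; 0, 1) = l / sqrt(l^2 + 1) * exp(-X_i^2 / (2 (l^2 + 1)))
z = l / np.sqrt(l ** 2 + 1) * np.exp(-X ** 2 / (2 * (l ** 2 + 1)))
w = np.linalg.solve(K, z)                        # quadrature weights

f = lambda x: np.cos(x)                          # integrand
print(w @ f(X))                                  # kernel quadrature estimate
print(np.exp(-0.5))                              # exact: E[cos(Z)] = e^{-1/2}
```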

Model-based Kernel Sum Rule: Kernel Bayesian Inference with Probabilistic Models

no code implementations • 18 Sep 2014 • Yu Nishiyama, Motonobu Kanagawa, Arthur Gretton, Kenji Fukumizu

Our contribution in this paper is to introduce a novel approach, termed the model-based kernel sum rule (Mb-KSR), to combine a probabilistic model and kernel Bayesian inference.

Bayesian Inference

Filtering with State-Observation Examples via Kernel Monte Carlo Filter

no code implementations • 17 Dec 2013 • Motonobu Kanagawa, Yu Nishiyama, Arthur Gretton, Kenji Fukumizu

In particular, the sampling and resampling procedures are novel in that they are expressed using kernel mean embeddings, and we theoretically analyze their behavior.
