
no code implementations • 20 Jul 2022 • Francesco Quinzan, Cecilia Casolo, Krikamol Muandet, Niki Kilbertus, Yucen Luo

We propose a method to learn predictors that are invariant under counterfactual changes of certain covariates.

1 code implementation • 11 Jul 2022 • Heiner Kremer, Jia-Jie Zhu, Krikamol Muandet, Bernhard Schölkopf

Important problems in causal inference, economics, and, more generally, robust machine learning can be expressed as conditional moment restrictions, but estimation becomes challenging as it requires solving a continuum of unconditional moment restrictions.

no code implementations • 24 Jun 2022 • Simon Föll, Alina Dubatovka, Eugen Ernst, Martin Maritsch, Patrik Okanovic, Gudrun Thäter, Joachim M. Buhmann, Felix Wortmann, Krikamol Muandet

assumption in real-world data distributions.

2 code implementations • 17 Jun 2022 • Jonas M. Kübler, Vincent Stimper, Simon Buchholz, Krikamol Muandet, Bernhard Schölkopf

Two-sample tests are important in statistics and machine learning, both as tools for scientific discovery as well as to detect distribution shifts.

no code implementations • 5 Jun 2022 • Krikamol Muandet

Democratization of AI involves training and deploying machine learning models across heterogeneous and potentially massive environments.

1 code implementation • 7 Jun 2021 • Rui Zhang, Krikamol Muandet, Bernhard Schölkopf, Masaaki Imaizumi

Kernel maximum moment restriction (KMMR) has recently emerged as a popular framework for instrumental variable (IV) based conditional moment restriction (CMR) models, with important applications in conditional moment (CM) testing and parameter estimation for IV regression and proximal causal learning.

1 code implementation • 10 May 2021 • Afsaneh Mastouri, Yuchen Zhu, Limor Gultchin, Anna Korba, Ricardo Silva, Matt J. Kusner, Arthur Gretton, Krikamol Muandet

In particular, we provide a unifying view of two-stage and moment restriction approaches for solving this problem in a nonlinear setting.

no code implementations • 16 Feb 2021 • Junhyung Park, Uri Shalit, Bernhard Schölkopf, Krikamol Muandet

We propose to analyse the conditional distributional treatment effect (CoDiTE), which, in contrast to the more common conditional average treatment effect (CATE), is designed to encode a treatment's distributional aspects beyond the mean.

1 code implementation • 10 Feb 2021 • Jonas M. Kübler, Wittawat Jitkrittum, Bernhard Schölkopf, Krikamol Muandet

That is, the test set is used to simultaneously estimate the expectations and define the basis points, while the training set only serves to select the kernel and is discarded.

1 code implementation • NeurIPS 2020 • Xiaohan Chen, Zhangyang Wang, Siyu Tang, Krikamol Muandet

Meta-learning improves generalization of machine learning models when faced with previously unseen tasks by leveraging experiences from different, yet related prior tasks.

no code implementations • 21 Oct 2020 • Junhyung Park, Krikamol Muandet

This short technical report presents some learning theory results on vector-valued reproducing kernel Hilbert space (RKHS) regression, where the input space is allowed to be non-compact and the output space is a (possibly infinite-dimensional) Hilbert space.

1 code implementation • 15 Oct 2020 • Rui Zhang, Masaaki Imaizumi, Bernhard Schölkopf, Krikamol Muandet

We propose a simple framework for nonlinear instrumental variable (IV) regression based on a kernelized conditional moment restriction (CMR) known as a maximum moment restriction (MMR).

2 code implementations • 10 Aug 2020 • Korrawe Karunratanakul, Jinlong Yang, Yan Zhang, Michael Black, Krikamol Muandet, Siyu Tang

Specifically, our generative model is able to synthesize high-quality human grasps, given only a 3D object point cloud.

1 code implementation • NeurIPS 2020 • Jonas M. Kübler, Wittawat Jitkrittum, Bernhard Schölkopf, Krikamol Muandet

Modern large-scale kernel-based tests such as maximum mean discrepancy (MMD) and kernelized Stein discrepancy (KSD) optimize kernel hyperparameters on a held-out sample via data splitting to obtain the most powerful test statistics.

1 code implementation • 21 Feb 2020 • Krikamol Muandet, Wittawat Jitkrittum, Jonas Kübler

We propose a new family of specification tests called kernel conditional moment (KCM) tests.

no code implementations • NeurIPS 2020 • Junhyung Park, Krikamol Muandet

We present an operator-free, measure-theoretic approach to the conditional mean embedding (CME) as a random variable taking values in a reproducing kernel Hilbert space.

1 code implementation • 25 Nov 2019 • Jia-Jie Zhu, Krikamol Muandet, Moritz Diehl, Bernhard Schölkopf

This work presents the concept of kernel mean embedding and kernel probabilistic programming in the context of stochastic systems.

no code implementations • 29 Oct 2019 • Arash Mehrjou, Wittawat Jitkrittum, Krikamol Muandet, Bernhard Schölkopf

Modern implicit generative models such as generative adversarial networks (GANs) are generally known to suffer from issues such as instability, uninterpretability, and difficulty in assessing their performance.

1 code implementation • NeurIPS 2020 • Krikamol Muandet, Arash Mehrjou, Si Kai Lee, Anant Raj

We present a novel algorithm for non-linear instrumental variable (IV) regression, DualIV, which simplifies traditional two-stage methods via a dual formulation.

1 code implementation • 3 Jun 2019 • Yan Zhang, Krikamol Muandet, Qianli Ma, Heiko Neumann, Siyu Tang

In this paper, we propose an approach to representing high-order information for temporal action segmentation via a simple yet effective bilinear form.

no code implementations • 31 May 2019 • Jonas M. Kübler, Krikamol Muandet, Bernhard Schölkopf

The kernel mean embedding of probability distributions is commonly used in machine learning as an injective mapping from distributions to functions in an infinite dimensional Hilbert space.

no code implementations • 29 May 2019 • Si Kai Lee, Luigi Gresele, Mijung Park, Krikamol Muandet

The use of inverse probability weighting (IPW) methods to estimate the causal effect of treatments from observational studies is widespread in econometrics, medicine and social sciences.
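To make the idea concrete (this is an illustrative sketch, not the paper's method), a Horvitz-Thompson-style IPW estimator on synthetic confounded data can be written in a few lines; the data-generating process and the true effect of 2.0 are hypothetical choices:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000
X = rng.normal(size=n)                 # confounder
p = 1 / (1 + np.exp(-X))               # true propensity score P(T=1 | X)
T = rng.binomial(1, p)                 # treatment assignment
Y = 2.0 * T + X + rng.normal(size=n)   # outcome; true average effect is 2.0

# Naive difference in means is biased because X drives both T and Y
naive = Y[T == 1].mean() - Y[T == 0].mean()

# IPW reweights each unit by the inverse of its assignment probability
ipw = np.mean(T * Y / p) - np.mean((1 - T) * Y / (1 - p))
```

With the true propensities, the IPW estimate recovers the effect near 2.0, while the naive contrast is inflated by confounding; in practice the propensities are unknown and must themselves be estimated, which is where much of the difficulty lies.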

no code implementations • 27 May 2019 • Ingmar Schuster, Mattes Mollenhauer, Stefan Klus, Krikamol Muandet

The proposed model is based on a novel approach to the reconstruction of probability densities from their kernel mean embeddings by drawing connections to estimation of Radon-Nikodym derivatives in the reproducing kernel Hilbert space (RKHS).

1 code implementation • 8 Feb 2019 • Niki Kilbertus, Manuel Gomez-Rodriguez, Bernhard Schölkopf, Krikamol Muandet, Isabel Valera

In this paper, we show that in this selective labels setting, learning a predictor directly only from available labeled data is suboptimal in terms of both fairness and utility.

no code implementations • 26 Jan 2019 • Arash Mehrjou, Wittawat Jitkrittum, Krikamol Muandet, Bernhard Schölkopf

Modern implicit generative models such as generative adversarial networks (GANs) are generally known to suffer from issues such as instability, uninterpretability, and difficulty in assessing their performance.

1 code implementation • CVPR 2019 • Yan Zhang, Siyu Tang, Krikamol Muandet, Christian Jarvers, Heiko Neumann

Fine-grained temporal action parsing is important in many applications, such as daily activity understanding, human motion analysis, surgical robotics, and others requiring subtle and precise operations over long time periods.

no code implementations • 22 May 2018 • Krikamol Muandet, Motonobu Kanagawa, Sorawit Saengkyongam, Sanparith Marukatat

In this work, we propose to model counterfactual distributions using a novel Hilbert space representation called counterfactual mean embedding (CME).

1 code implementation • 5 Dec 2017 • Stefan Klus, Ingmar Schuster, Krikamol Muandet

Transfer operators such as the Perron--Frobenius or Koopman operator play an important role in the global analysis of complex dynamical systems.

1 code implementation • 31 Aug 2017 • Nihar B. Shah, Behzad Tabibian, Krikamol Muandet, Isabelle Guyon, Ulrike von Luxburg

Neural Information Processing Systems (NIPS) is a top-tier annual conference in machine learning.

no code implementations • 31 May 2016 • Krikamol Muandet, Kenji Fukumizu, Bharath Sriperumbudur, Bernhard Schölkopf

Next, we discuss the Hilbert space embedding for conditional distributions, give theoretical insights, and review some applications.

1 code implementation • 9 Feb 2015 • David Lopez-Paz, Krikamol Muandet, Bernhard Schölkopf, Ilya Tolstikhin

We pose causal inference as the problem of learning to classify probability distributions.

no code implementations • 27 Jan 2015 • Bernhard Schölkopf, Krikamol Muandet, Kenji Fukumizu, Jonas Peters

We describe a method to perform functional operations on probability distributions of random variables.

no code implementations • NeurIPS 2014 • Krikamol Muandet, Bharath Sriperumbudur, Bernhard Schölkopf

The problem of estimating the kernel mean in a reproducing kernel Hilbert space (RKHS) is central to kernel methods in that it is used by classical approaches (e.g., when centering a kernel PCA matrix), and it also forms the core inference step of modern kernel methods (e.g., kernel-based non-parametric tests) that rely on embedding probability distributions in RKHSs.
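The kernel-PCA centering operation mentioned above has a compact closed form, Kc = H K H with H = I − 11ᵀ/n; a small sketch (illustrative, with a linear kernel chosen so the result is easy to verify against input-space centering):

```python
import numpy as np

def center_gram(K):
    """Center a Gram matrix in feature space: Kc = H K H, H = I - 11^T / n."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
K = X @ X.T                         # linear kernel makes the check easy
Kc = center_gram(K)
Xc = X - X.mean(axis=0)             # explicit centering in input space
print(np.allclose(Kc, Xc @ Xc.T))   # prints True: the two agree
```

For a nonlinear kernel the feature map is implicit, so this Gram-matrix formula is the only practical way to center.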

no code implementations • 15 Sep 2014 • David Lopez-Paz, Krikamol Muandet, Benjamin Recht

We are interested in learning causal relationships between pairs of random variables, purely from observational data.

no code implementations • 9 Aug 2014 • Krikamol Muandet, Bernhard Schölkopf

We propose one-class support measure machines (OCSMMs) for group anomaly detection, which aims to recognize anomalous aggregate behaviors of data points.

no code implementations • 21 May 2014 • Krikamol Muandet, Bharath Sriperumbudur, Kenji Fukumizu, Arthur Gretton, Bernhard Schölkopf

A mean function in a reproducing kernel Hilbert space (RKHS), or a kernel mean, is central to kernel methods in that it is used by many classical algorithms such as kernel principal component analysis, and it also forms the core inference step of modern kernel methods that rely on embedding probability distributions in RKHSs.
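The empirical kernel mean is simple to compute: it is the average of the kernel function evaluated at the sample points. A minimal illustrative sketch (a Gaussian kernel in one dimension, with arbitrary bandwidth and sample):

```python
import numpy as np

def rbf_kernel(x, y, bandwidth=1.0):
    return np.exp(-(x - y) ** 2 / (2 * bandwidth**2))

def empirical_kme(sample, bandwidth=1.0):
    """Empirical kernel mean as a function: mu(t) = (1/n) * sum_i k(x_i, t)."""
    return lambda t: np.mean(rbf_kernel(sample, t, bandwidth))

rng = np.random.default_rng(0)
sample = rng.normal(0.0, 1.0, size=5_000)
mu = empirical_kme(sample)
# The mean function is largest where the sample is most concentrated
print(mu(0.0) > mu(3.0))  # prints True
```

The shrinkage estimators studied in this line of work improve on exactly this naive sample average, analogously to James-Stein estimation of a multivariate mean.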

1 code implementation • Proceedings of Machine Learning Research 2013 • Krikamol Muandet, David Balduzzi, Bernhard Schölkopf

This paper investigates domain generalization: How to take knowledge acquired from an arbitrary number of related domains and apply it to previously unseen domains?

no code implementations • 4 Jun 2013 • Krikamol Muandet, Kenji Fukumizu, Bharath Sriperumbudur, Arthur Gretton, Bernhard Schölkopf

A mean function in a reproducing kernel Hilbert space, or a kernel mean, is an important part of many applications ranging from kernel principal component analysis to Hilbert-space embedding of distributions.

no code implementations • 1 Mar 2013 • Krikamol Muandet, Bernhard Schölkopf

We propose one-class support measure machines (OCSMMs) for group anomaly detection, which aims to recognize anomalous aggregate behaviors of data points.

no code implementations • NeurIPS 2012 • Krikamol Muandet, Kenji Fukumizu, Francesco Dinuzzo, Bernhard Schölkopf

This paper presents a kernel-based discriminative learning framework on probability measures.
