Search Results for author: Dino Sejdinovic

Found 50 papers, 16 papers with code

Inter-domain Deep Gaussian Processes with RKHS Fourier Features

no code implementations ICML 2020 Tim G. J. Rudner, Dino Sejdinovic, Yarin Gal

We propose Inter-domain Deep Gaussian Processes with RKHS Fourier Features, an extension of shallow inter-domain GPs that combines the advantages of inter-domain and deep Gaussian processes (DGPs), and demonstrate how to leverage existing approximate inference approaches to perform simple and scalable approximate inference in this model class.

Gaussian Processes

RKHS-SHAP: Shapley Values for Kernel Methods

no code implementations 18 Oct 2021 Siu Lun Chau, Javier Gonzalez, Dino Sejdinovic

Feature attribution for kernel methods is often heuristic and not individualised for each prediction.

BayesIMP: Uncertainty Quantification for Causal Data Fusion

no code implementations 7 Jun 2021 Siu Lun Chau, Jean-François Ton, Javier González, Yee Whye Teh, Dino Sejdinovic

While causal models are becoming one of the mainstays of machine learning, the problem of uncertainty quantification in causal inference remains challenging.

Bayesian Optimisation Causal Inference

Connections and Equivalences between the Nyström Method and Sparse Variational Gaussian Processes

no code implementations 2 Jun 2021 Veit Wild, Motonobu Kanagawa, Dino Sejdinovic

While sparse approximation methods for GPs and kernel methods share some algebraic similarities, the literature lacks a deep understanding of how and why they are related.

Gaussian Processes

Deconditional Downscaling with Gaussian Processes

no code implementations 27 May 2021 Siu Lun Chau, Shahine Bouabid, Dino Sejdinovic

Yet, when LR samples are modeled as aggregate conditional means of HR samples with respect to a mediating variable that is globally observed, the recovery of the underlying fine-grained field can be framed as taking an "inverse" of the conditional expectation, namely a deconditioning problem.

Gaussian Processes

Time-to-event regression using partially monotonic neural networks

no code implementations 26 Mar 2021 David Rindt, Robert Hu, David Steinsaltz, Dino Sejdinovic

We propose a novel method, termed SuMo-net, that uses partially monotonic neural networks to learn a time-to-event distribution from a sample of covariates and right-censored times.
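
The core monotonicity device is easy to illustrate. Below is a minimal PyTorch sketch, not the authors' SuMo-net implementation: all class, layer, and parameter names are invented. The weights acting on the time input are passed through a softplus so they stay positive, which makes the output provably non-decreasing in t and lets it be read as a conditional CDF.

```python
# Minimal sketch of a partially monotonic network (illustrative, not SuMo-net).
import torch
import torch.nn as nn
import torch.nn.functional as F

class PartiallyMonotonicNet(nn.Module):
    def __init__(self, n_covariates, hidden=32):
        super().__init__()
        self.cov_layer = nn.Linear(n_covariates, hidden)     # unconstrained in x
        self.t_weight = nn.Parameter(torch.randn(hidden))    # made positive below
        self.out_weight = nn.Parameter(torch.randn(hidden))  # made positive below
        self.out_bias = nn.Parameter(torch.zeros(1))

    def forward(self, x, t):
        # tanh is non-decreasing and the coefficient on t is positive,
        # so every hidden unit is non-decreasing in t
        h = torch.tanh(self.cov_layer(x) + F.softplus(self.t_weight) * t)
        # a positive combination of non-decreasing units is non-decreasing;
        # sigmoid maps it to (0, 1), interpretable as an estimate of F(t | x)
        return torch.sigmoid(h @ F.softplus(self.out_weight) + self.out_bias)

net = PartiallyMonotonicNet(n_covariates=5)
x, t = torch.randn(4, 5), torch.rand(4, 1)
print(1.0 - net(x, t))  # survival probabilities S(t | x) = 1 - F(t | x)
```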

Inter-domain Deep Gaussian Processes

no code implementations 1 Nov 2020 Tim G. J. Rudner, Dino Sejdinovic, Yarin Gal

We propose Inter-domain Deep Gaussian Processes, an extension of inter-domain shallow GPs that combines the advantages of inter-domain and deep Gaussian processes (DGPs), and demonstrate how to leverage existing approximate inference methods to perform simple and scalable approximate inference using inter-domain features in DGPs.

Gaussian Processes

Kernel-based Graph Learning from Smooth Signals: A Functional Viewpoint

no code implementations 23 Aug 2020 Xingyue Pu, Siu Lun Chau, Xiaowen Dong, Dino Sejdinovic

In this paper, we propose a novel graph learning framework that incorporates the node-side and observation-side information, and in particular the covariates that help to explain the dependency structures in graph signals.

Graph Learning

Benign Overfitting and Noisy Features

no code implementations 6 Aug 2020 Zhu Li, Weijie Su, Dino Sejdinovic

Modern machine learning often operates in the regime where the number of parameters is much higher than the number of data points, with zero training loss and yet good generalization, thereby contradicting the classical bias-variance trade-off.

Variational Inference with Continuously-Indexed Normalizing Flows

1 code implementation 10 Jul 2020 Anthony Caterini, Rob Cornish, Dino Sejdinovic, Arnaud Doucet

Continuously-indexed flows (CIFs) have recently achieved improvements over baseline normalizing flows on a variety of density estimation tasks.

Bayesian Inference Density Estimation +2

Meta Learning for Causal Direction

no code implementations 6 Jul 2020 Jean-Francois Ton, Dino Sejdinovic, Kenji Fukumizu

Based on recent developments in meta learning as well as in causal inference, we introduce a novel generative model that allows distinguishing cause and effect in the small data setting.

Causal Inference Meta-Learning

A Perspective on Gaussian Processes for Earth Observation

no code implementations 2 Jul 2020 Gustau Camps-Valls, Dino Sejdinovic, Jakob Runge, Markus Reichstein

Earth observation (EO) by airborne and satellite remote sensing, together with in-situ observations, plays a fundamental role in monitoring our planet.

Causal Inference Gaussian Processes

Learning Inconsistent Preferences with Kernel Methods

no code implementations 6 Jun 2020 Siu Lun Chau, Javier González, Dino Sejdinovic

We propose a probabilistic kernel approach for preferential learning from pairwise duelling data using Gaussian Processes.

Gaussian Processes

Spectral Ranking with Covariates

no code implementations 8 May 2020 Siu Lun Chau, Mihai Cucuringu, Dino Sejdinovic

We consider approaches to the classical problem of establishing a statistical ranking on a given set of items from incomplete and noisy pairwise comparisons, and propose spectral algorithms able to leverage available covariate information about the items.
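
For context, the classical spectral step that covariate-aware variants build on can be sketched in a few lines of NumPy. This is a Rank-Centrality-style random walk over items, not the paper's algorithm, and it ignores covariates entirely; names are illustrative.

```python
# Rank-Centrality-style spectral ranking from pairwise wins (illustrative).
import numpy as np

def spectral_rank(wins):
    """wins[i, j] = number of times item i beat item j (zero diagonal)."""
    n = wins.shape[0]
    total = wins + wins.T
    frac = np.divide(wins, total, out=np.zeros((n, n)), where=total > 0)
    # random walk: from item i, move to j in proportion to how often j beat i
    P = frac.T / n
    np.fill_diagonal(P, 1.0 - P.sum(axis=1))  # self-loops keep rows stochastic
    pi = np.full(n, 1.0 / n)
    for _ in range(1000):                     # power iteration to stationarity
        pi = pi @ P
    return np.argsort(-pi), pi                # best-to-worst order, scores

wins = np.array([[0., 8., 9.],
                 [2., 0., 6.],
                 [1., 4., 0.]])
order, scores = spectral_rank(wins)
print(order, scores)
```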

Large Scale Tensor Regression using Kernels and Variational Inference

no code implementations 11 Feb 2020 Robert Hu, Geoff K. Nicholls, Dino Sejdinovic

We outline an inherent weakness of tensor factorization models when latent factors are expressed as a function of side information and propose a novel method to mitigate this weakness.

Variational Inference

A kernel log-rank test of independence for right-censored data

1 code implementation 8 Dec 2019 Tamara Fernandez, Arthur Gretton, David Rindt, Dino Sejdinovic

We introduce a general non-parametric independence test between right-censored survival times and covariates, which may be multivariate.

Survival Analysis

Detecting anthropogenic cloud perturbations with deep learning

no code implementations 29 Nov 2019 Duncan Watson-Parris, Samuel Sutherland, Matthew Christensen, Anthony Caterini, Dino Sejdinovic, Philip Stier

One of the most pressing questions in climate science is that of the effect of anthropogenic aerosol on the Earth's energy balance.

Kernel Dependence Regularizers and Gaussian Processes with Applications to Algorithmic Fairness

no code implementations 11 Nov 2019 Zhu Li, Adrian Perez-Suay, Gustau Camps-Valls, Dino Sejdinovic

We present a regularization approach to this problem that trades off predictive accuracy of the learned models (with respect to biased labels) for fairness in terms of statistical parity, i.e., independence of the decisions from the sensitive covariates.

Crime Prediction Fairness +1

A kernel- and optimal transport- based test of independence between covariates and right-censored lifetimes

1 code implementation 10 Jun 2019 David Rindt, Dino Sejdinovic, David Steinsaltz

We propose a nonparametric test of independence, termed optHSIC, between a covariate and a right-censored lifetime.

Noise Contrastive Meta-Learning for Conditional Density Estimation using Kernel Mean Embeddings

no code implementations 5 Jun 2019 Jean-Francois Ton, Lucian Chan, Yee Whye Teh, Dino Sejdinovic

Current meta-learning approaches focus on learning functional representations of relationships between variables, i.e., on estimating conditional expectations in regression.

Density Estimation Meta-Learning

Infinitely Deep Infinite-Width Networks

no code implementations ICLR 2019 Jovana Mitrovic, Peter Wirnsberger, Charles Blundell, Dino Sejdinovic, Yee Whye Teh

Infinite-width neural networks have been extensively used to study the theoretical properties underlying the extraordinary empirical success of standard, finite-width neural networks.

Rejoinder for "Probabilistic Integration: A Role in Statistical Computation?"

no code implementations 26 Nov 2018 Francois-Xavier Briol, Chris. J. Oates, Mark Girolami, Michael A. Osborne, Dino Sejdinovic

This article is the rejoinder for the paper "Probabilistic Integration: A Role in Statistical Computation?"

Hyperparameter Learning via Distributional Transfer

1 code implementation NeurIPS 2019 Ho Chung Leon Law, Peilin Zhao, Lucian Chan, Junzhou Huang, Dino Sejdinovic

Bayesian optimisation is a popular technique for hyperparameter learning but typically requires initial exploration even in cases where similar prior tasks have been solved.

Bayesian Optimisation
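
For reference, the cold-start baseline that distributional transfer improves on is the plain Bayesian optimisation loop below: a GP surrogate plus an upper-confidence-bound acquisition, sketched with scikit-learn. All settings (candidate count, beta, kernel defaults) are illustrative, and this sketch contains no transfer across tasks.

```python
# Plain GP-UCB Bayesian optimisation loop (illustrative cold-start baseline).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def bo_maximise(objective, bounds, n_init=5, n_iter=20, beta=2.0, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(*bounds, size=(n_init, 1))          # initial exploration
    y = np.array([objective(x) for x in X.ravel()])
    for _ in range(n_iter):
        gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
        cand = rng.uniform(*bounds, size=(256, 1))      # random candidate set
        mu, sd = gp.predict(cand, return_std=True)
        x_next = cand[np.argmax(mu + beta * sd)]        # UCB acquisition
        X = np.vstack([X, x_next])
        y = np.append(y, objective(x_next[0]))
    return X[np.argmax(y)], y.max()

best_x, best_y = bo_maximise(lambda x: -(x - 0.3) ** 2, bounds=(0.0, 1.0))
print(best_x, best_y)
```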

A Differentially Private Kernel Two-Sample Test

1 code implementation 1 Aug 2018 Anant Raj, Ho Chung Leon Law, Dino Sejdinovic, Mijung Park

As a result, a simple chi-squared test is obtained, where a test statistic depends on a mean and covariance of empirical differences between the samples, which we perturb for a privacy guarantee.

Two-sample testing
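
The shape of the statistic described above can be sketched as follows. This is an illustration of the generic "perturbed mean and covariance, then chi-squared" recipe, not the paper's calibrated mechanism: the noise scales below carry no privacy guarantee and the feature construction is left abstract.

```python
# Chi-squared-style statistic from noised mean/covariance (illustrative only;
# the Gaussian noise here is NOT a calibrated differential-privacy mechanism).
import numpy as np

def private_chi2_stat(D, noise_mean=0.1, noise_cov=0.1, rng=None):
    """D: (n, d) array of per-sample difference features between the samples."""
    rng = np.random.default_rng(rng)
    n, d = D.shape
    mu = D.mean(axis=0) + rng.normal(scale=noise_mean, size=d)
    Sigma = np.cov(D, rowvar=False) + rng.normal(scale=noise_cov, size=(d, d))
    Sigma = (Sigma + Sigma.T) / 2 + 1e-6 * np.eye(d)  # symmetrise, regularise
    return n * mu @ np.linalg.solve(Sigma, mu)        # compare against chi2(d)

rng = np.random.default_rng(0)
D = rng.normal(size=(200, 3))  # placeholder difference features
print(private_chi2_stat(D, rng=rng))
```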

Gaussian Processes and Kernel Methods: A Review on Connections and Equivalences

no code implementations 6 Jul 2018 Motonobu Kanagawa, Philipp Hennig, Dino Sejdinovic, Bharath K. Sriperumbudur

This paper is an attempt to bridge the conceptual gaps between researchers working on the two widely used approaches based on positive definite kernels: Bayesian learning or inference using Gaussian processes on the one side, and frequentist kernel methods based on reproducing kernel Hilbert spaces on the other.

Gaussian Processes

Towards A Unified Analysis of Random Fourier Features

no code implementations 24 Jun 2018 Zhu Li, Jean-Francois Ton, Dino Oglic, Dino Sejdinovic

We study both the standard random Fourier features method for which we improve the existing bounds on the number of features required to guarantee the corresponding minimax risk convergence rate of kernel ridge regression, as well as a data-dependent modification which samples features proportional to \emph{ridge leverage scores} and further reduces the required number of features.
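
The vanilla construction the paper analyses is compact enough to sketch. Below is plain random Fourier features for the Gaussian kernel plugged into ridge regression; the leverage-score sampling variant mentioned in the abstract is not implemented here, and the feature count, bandwidth, and regulariser are illustrative.

```python
# Plain random Fourier features + ridge regression (illustrative settings).
import numpy as np

def rff(X, m, bandwidth, rng):
    """Random cosine features whose inner products approximate a Gaussian kernel."""
    W = rng.normal(scale=1.0 / bandwidth, size=(X.shape[1], m))
    b = rng.uniform(0.0, 2.0 * np.pi, size=m)
    return np.sqrt(2.0 / m) * np.cos(X @ W + b)

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)

Z = rff(X, m=200, bandwidth=1.0, rng=rng)
lam = 1e-2
alpha = np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ y)
y_hat = Z @ alpha  # approximate kernel ridge regression predictions
print(np.mean((y_hat - y) ** 2))
```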

Hamiltonian Variational Auto-Encoder

3 code implementations NeurIPS 2018 Anthony L. Caterini, Arnaud Doucet, Dino Sejdinovic

However, for this methodology to be practically efficient, it is necessary to obtain low-variance unbiased estimators of the ELBO and its gradients with respect to the parameters of interest.

Latent Variable Models Variational Inference

Variational Learning on Aggregate Outputs with Gaussian Processes

1 code implementation NeurIPS 2018 Ho Chung Leon Law, Dino Sejdinovic, Ewan Cameron, Tim CD Lucas, Seth Flaxman, Katherine Battle, Kenji Fukumizu

While a typical supervised learning framework assumes that the inputs and the outputs are measured at the same levels of granularity, many applications, including global mapping of disease, only have access to outputs at a much coarser level than that of the inputs.

Gaussian Processes

Bayesian Approaches to Distribution Regression

1 code implementation 11 May 2017 Ho Chung Leon Law, Danica J. Sutherland, Dino Sejdinovic, Seth Flaxman

Distribution regression has recently attracted much interest as a generic solution to the problem of supervised learning where labels are available at the group level, rather than at the individual level.

Testing and Learning on Distributions with Symmetric Noise Invariance

no code implementations NeurIPS 2017 Ho Chung Leon Law, Christopher Yau, Dino Sejdinovic

Kernel embeddings of distributions and the Maximum Mean Discrepancy (MMD), the resulting distance between distributions, are useful tools for fully nonparametric two-sample testing and learning on distributions.

Two-sample testing
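
The MMD quantity referred to here has a standard unbiased quadratic-time estimator, sketched below with a Gaussian kernel (bandwidth illustrative); the paper's noise-invariant constructions are built on top of estimators like this one.

```python
# Unbiased quadratic-time estimator of MMD^2 with a Gaussian kernel (sketch).
import numpy as np

def mmd2_unbiased(x, y, bandwidth=1.0):
    def gram(a, b):
        d2 = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2.0 * a @ b.T
        return np.exp(-np.maximum(d2, 0.0) / (2.0 * bandwidth**2))
    Kxx, Kyy, Kxy = gram(x, x), gram(y, y), gram(x, y)
    n, m = len(x), len(y)
    np.fill_diagonal(Kxx, 0.0)  # drop same-sample terms for unbiasedness
    np.fill_diagonal(Kyy, 0.0)
    return Kxx.sum() / (n * (n - 1)) + Kyy.sum() / (m * (m - 1)) - 2.0 * Kxy.mean()

rng = np.random.default_rng(0)
x, y = rng.normal(size=(100, 2)), rng.normal(loc=0.5, size=(100, 2))
print(mmd2_unbiased(x, y))
```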

Detecting causal associations in large nonlinear time series datasets

2 code implementations 22 Feb 2017 Jakob Runge, Dino Sejdinovic, Seth Flaxman

Detecting causal associations in time series datasets is a key challenge for novel insights into complex dynamical systems such as the Earth system or the human brain.

Methodology Atmospheric and Oceanic Physics Applications

Poisson intensity estimation with reproducing kernels

no code implementations 27 Oct 2016 Seth Flaxman, Yee Whye Teh, Dino Sejdinovic

However, we prove that the representer theorem does hold in an appropriately transformed RKHS, guaranteeing that the optimization of the penalized likelihood can be cast as a tractable finite-dimensional problem.
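
In a common formulation (notation mine, and the squared link below is one standard choice rather than a claim about the paper's exact setup), the penalized likelihood in question is:

```latex
\[
\hat f \;=\; \arg\min_{f \in \mathcal{H}}\;
  -\sum_{i=1}^{n} \log \lambda_f(x_i)
  \;+\; \int_{\Omega} \lambda_f(x)\, dx
  \;+\; \gamma \, \|f\|_{\mathcal{H}}^2,
\qquad \lambda_f(x) = f(x)^2.
\]
% The integral term blocks the representer theorem in the original RKHS
% \mathcal{H}; per the abstract, it does hold in an appropriately transformed
% RKHS, reducing the optimization to finitely many parameters.
```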

Large-Scale Kernel Methods for Independence Testing

1 code implementation 25 Jun 2016 Qinyi Zhang, Sarah Filippi, Arthur Gretton, Dino Sejdinovic

Representations of probability measures in reproducing kernel Hilbert spaces provide a flexible framework for fully nonparametric hypothesis tests of independence, which can capture any type of departure from independence, including nonlinear associations and multivariate interactions.
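
The quadratic-time building block being scaled up here is the HSIC permutation test, which fits in a short sketch (Gaussian kernels, illustrative bandwidth; the paper's large-scale approximations are not shown).

```python
# Quadratic-time HSIC independence test with a permutation null (sketch).
import numpy as np

def gaussian_gram(z, bandwidth):
    sq = np.sum(z**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * z @ z.T
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * bandwidth**2))

def hsic(K, L):
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n   # centring matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

def hsic_permutation_test(x, y, n_perms=200, bandwidth=1.0, seed=0):
    rng = np.random.default_rng(seed)
    K, L = gaussian_gram(x, bandwidth), gaussian_gram(y, bandwidth)
    stat = hsic(K, L)
    null = []
    for _ in range(n_perms):
        p = rng.permutation(len(x))       # break any dependence between x and y
        null.append(hsic(K, L[np.ix_(p, p)]))
    pval = (1 + sum(s >= stat for s in null)) / (1 + n_perms)
    return stat, pval

rng = np.random.default_rng(1)
x = rng.normal(size=(150, 1))
y = x + 0.5 * rng.normal(size=(150, 1))   # dependent pair
print(hsic_permutation_test(x, y))
```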

Hyperspectral Image Classification with Support Vector Machines on Kernel Distribution Embeddings

no code implementations 30 May 2016 Gianni Franchi, Jesus Angulo, Dino Sejdinovic

We propose a novel approach for pixel classification in hyperspectral images, leveraging both the spatial and spectral information in the data.

Classification General Classification +1

Bayesian Learning of Kernel Embeddings

no code implementations 7 Mar 2016 Seth Flaxman, Dino Sejdinovic, John P. Cunningham, Sarah Filippi

The posterior mean of our model is closely related to recently proposed shrinkage estimators for kernel mean embeddings, while the posterior uncertainty is a new, interesting feature with various possible applications.

Bayesian Inference

DR-ABC: Approximate Bayesian Computation with Kernel-Based Distribution Regression

no code implementations 15 Feb 2016 Jovana Mitrovic, Dino Sejdinovic, Yee Whye Teh

Approximate Bayesian computation (ABC) is an inference framework that constructs an approximation to the true likelihood based on the similarity between the observed and simulated data as measured by a predefined set of summary statistics.
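
The vanilla ABC scheme being generalised is easy to sketch; DR-ABC's contribution is to learn the summary statistic via distribution regression, which the hand-picked summary below deliberately does not do. Names and tolerances are illustrative.

```python
# Plain rejection ABC with a hand-picked summary statistic (illustrative).
import numpy as np

def rejection_abc(observed, simulate, prior_sample, summary, eps, n_draws, rng):
    s_obs = summary(observed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample(rng)
        s_sim = summary(simulate(theta, rng))
        if np.linalg.norm(s_sim - s_obs) < eps:   # keep draws close to the data
            accepted.append(theta)
    return np.array(accepted)

# toy example: infer the mean of a Gaussian with known variance
rng = np.random.default_rng(0)
observed = rng.normal(loc=2.0, size=100)
posterior = rejection_abc(
    observed,
    simulate=lambda th, r: r.normal(loc=th, size=100),
    prior_sample=lambda r: r.uniform(-5, 5),
    summary=lambda data: np.array([data.mean()]),
    eps=0.1, n_draws=5000, rng=rng)
print(posterior.mean(), posterior.std())
```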

Probabilistic Integration: A Role in Statistical Computation?

no code implementations 3 Dec 2015 François-Xavier Briol, Chris. J. Oates, Mark Girolami, Michael A. Osborne, Dino Sejdinovic

A research frontier has emerged in scientific computation, wherein numerical error is regarded as a source of epistemic uncertainty that can be modelled.

Numerical Integration

Kernel Sequential Monte Carlo

1 code implementation 11 Oct 2015 Ingmar Schuster, Heiko Strathmann, Brooks Paige, Dino Sejdinovic

As KSMC does not require access to target gradients, it is particularly applicable on targets whose gradients are unknown or prohibitively expensive.

Fast Two-Sample Testing with Analytic Representations of Probability Measures

1 code implementation NeurIPS 2015 Kacper Chwialkowski, Aaditya Ramdas, Dino Sejdinovic, Arthur Gretton

The new tests are consistent against a larger class of alternatives than the previous linear-time tests based on the (non-smoothed) empirical characteristic functions, while being much faster than the current state-of-the-art quadratic-time kernel-based or energy distance-based tests.

Two-sample testing

Kernel-Based Just-In-Time Learning for Passing Expectation Propagation Messages

1 code implementation 9 Mar 2015 Wittawat Jitkrittum, Arthur Gretton, Nicolas Heess, S. M. Ali Eslami, Balaji Lakshminarayanan, Dino Sejdinovic, Zoltán Szabó

We propose an efficient nonparametric strategy for learning a message operator in expectation propagation (EP), which takes as input the set of incoming messages to a factor node, and produces an outgoing message as output.

K2-ABC: Approximate Bayesian Computation with Kernel Embeddings

no code implementations 9 Feb 2015 Mijung Park, Wittawat Jitkrittum, Dino Sejdinovic

Complicated generative models often result in a situation where computing the likelihood of observed data is intractable, while simulating from the conditional density given a parameter value is relatively easy.

Unbiased Bayes for Big Data: Paths of Partial Posteriors

no code implementations 14 Jan 2015 Heiko Strathmann, Dino Sejdinovic, Mark Girolami

Key quantities of interest in Bayesian inference are expectations of functions with respect to a posterior distribution.

Bayesian Inference

A Wild Bootstrap for Degenerate Kernel Tests

1 code implementation NeurIPS 2014 Kacper Chwialkowski, Dino Sejdinovic, Arthur Gretton

A wild bootstrap method for nonparametric hypothesis tests based on kernel distribution embeddings is proposed.

Time Series
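
The multiplier idea can be sketched as follows: a degenerate V-statistic computed from a centred kernel matrix is recomputed with random sign multipliers to simulate its null distribution. The i.i.d. Rademacher signs below are the simplest case; for time series the paper uses an autocorrelated multiplier process instead, which this sketch omits.

```python
# Multiplier (wild) bootstrap for a degenerate kernel V-statistic (sketch;
# i.i.d. Rademacher signs -- the time-series case needs autocorrelated ones).
import numpy as np

def wild_bootstrap_pvalue(K, n_boot=500, seed=0):
    """K: centred kernel matrix underlying the test statistic stat = mean(K)."""
    rng = np.random.default_rng(seed)
    n = K.shape[0]
    stat = K.mean()
    reps = np.empty(n_boot)
    for b in range(n_boot):
        w = rng.choice([-1.0, 1.0], size=n)             # Rademacher multipliers
        reps[b] = (w[:, None] * K * w[None, :]).mean()  # bootstrap replicate
    return stat, (1 + np.sum(reps >= stat)) / (1 + n_boot)
```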

Kernel Adaptive Metropolis-Hastings

1 code implementation 19 Jul 2013 Dino Sejdinovic, Heiko Strathmann, Maria Lomeli Garcia, Christophe Andrieu, Arthur Gretton

A Kernel Adaptive Metropolis-Hastings algorithm is introduced, for the purpose of sampling from a target distribution with strongly nonlinear support.

A Kernel Test for Three-Variable Interactions

no code implementations NeurIPS 2013 Dino Sejdinovic, Arthur Gretton, Wicher Bergsma

We introduce kernel nonparametric tests for Lancaster three-variable interaction and for total independence, using embeddings of signed measures into a reproducing kernel Hilbert space.

Equivalence of distance-based and RKHS-based statistics in hypothesis testing

no code implementations 25 Jul 2012 Dino Sejdinovic, Bharath Sriperumbudur, Arthur Gretton, Kenji Fukumizu

We provide a unifying framework linking two classes of statistics used in two-sample and independence testing: on the one hand, the energy distances and distance covariances from the statistics literature; on the other, maximum mean discrepancies (MMD), that is, distances between embeddings of distributions to reproducing kernel Hilbert spaces (RKHS), as established in machine learning.

Two-sample testing
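
The headline equivalence can be stated compactly (notation mine; z_0 is an arbitrary fixed base point, X, X' ~ P and Y, Y' ~ Q are independent, and rho is a semimetric of negative type):

```latex
% Distance-induced kernel built from the semimetric \rho:
\[
  k(z, z') \;=\; \tfrac{1}{2}\bigl[\rho(z, z_0) + \rho(z', z_0) - \rho(z, z')\bigr],
\]
% for which the energy distance is exactly twice the squared MMD:
\[
  \mathcal{E}_\rho(P, Q) \;=\; 2\,\mathrm{MMD}_k^2(P, Q),
  \qquad
  \mathcal{E}_\rho(P, Q) \;=\; 2\,\mathbb{E}\,\rho(X, Y)
    - \mathbb{E}\,\rho(X, X') - \mathbb{E}\,\rho(Y, Y').
\]
```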
