Search Results for author: Daniel Hernández-Lobato

Found 30 papers, 12 papers with code

Deep Transformed Gaussian Processes

no code implementations • 27 Oct 2023 • Francisco Javier Sáez-Maldonado, Juan Maroñas, Daniel Hernández-Lobato

In this work, we propose a generalization of TGPs named Deep Transformed Gaussian Processes (DTGPs), which follows the trend of concatenating layers of stochastic processes.

Gaussian Processes, Variational Inference
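
To make the layered construction concrete, here is a minimal sketch of a DTGP-style sample path: each layer draws from a GP and then applies an elementwise invertible warp. The RBF kernel and the sinh-arcsinh map are illustrative assumptions, not the paper's exact kernels, flows, or inference scheme.

```python
# Minimal sketch of a deep transformed GP sample path (illustrative only).
import numpy as np

def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def sample_gp(x, rng, jitter=1e-8):
    # Draw one GP sample at inputs x via a Cholesky factor of the kernel.
    K = rbf_kernel(x, x) + jitter * np.eye(len(x))
    return np.linalg.cholesky(K) @ rng.standard_normal(len(x))

def sinh_arcsinh(f, skew=0.0, tail=1.0):
    # Elementwise invertible warp; monotone increasing for tail > 0.
    return np.sinh(tail * np.arcsinh(f) + skew)

rng = np.random.default_rng(0)
h = np.linspace(-3, 3, 200)                    # initial inputs
for skew, tail in [(0.5, 1.2), (-0.3, 0.8)]:   # two stacked layers
    f = sample_gp(h, rng)                      # GP layer on current values
    h = sinh_arcsinh(f, skew, tail)            # invertible transformation
print(h[:5])
```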

Deep Variational Implicit Processes

1 code implementation • 14 Jun 2022 • Luis A. Ortega, Simón Rodríguez Santana, Daniel Hernández-Lobato

This generalization is similar to that of deep GPs over GPs, but it is more flexible due to the use of IPs as the prior distribution over the latent functions.

Gaussian Processes, Variational Inference
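
For readers unfamiliar with implicit processes: an IP is a prior over functions defined only through sampling, e.g., a neural network with random weights, so its density is intractable. A minimal illustration of drawing function samples from such a prior (not the paper's model; all sizes are arbitrary):

```python
# Sample function draws from an implicit process defined by a random-weight
# neural network: easy to sample, intractable to evaluate as a density.
import numpy as np

def sample_ip_function(x, rng, hidden=50):
    # One IP sample path: f(.) = NN(.; w), with w ~ N(0, I).
    w1 = rng.standard_normal((1, hidden))
    b1 = rng.standard_normal(hidden)
    w2 = rng.standard_normal((hidden, 1))
    return np.tanh(x[:, None] @ w1 + b1) @ w2 / np.sqrt(hidden)

rng = np.random.default_rng(0)
x = np.linspace(-2, 2, 100)
samples = np.stack([sample_ip_function(x, rng)[:, 0] for _ in range(5)])
print(samples.shape)  # (5, 100): five function draws from the IP prior
```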

Efficient Transformed Gaussian Processes for Non-Stationary Dependent Multi-class Classification

no code implementations • 30 May 2022 • Juan Maroñas, Daniel Hernández-Lobato

ETGPs exploit the recently proposed Transformed Gaussian Process (TGP), a stochastic process specified by transforming a Gaussian Process using an invertible transformation.

Gaussian Processes, Multi-class Classification, +1
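
In symbols, with $T_\theta$ standing in for whatever invertible map is chosen, a TGP draws a GP sample and pushes it through the transformation, so its marginal densities follow by the change-of-variables formula:

```latex
f \sim \mathcal{GP}(m, k), \qquad g(x) = T_\theta\bigl(f(x)\bigr), \qquad
p_g(g) = p_f\bigl(T_\theta^{-1}(g)\bigr)\,
\Bigl|\det \frac{\partial T_\theta^{-1}(g)}{\partial g}\Bigr|
```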

Inference over radiative transfer models using variational and expectation maximization methods

1 code implementation • 7 Apr 2022 • Daniel Heestermans Svendsen, Daniel Hernández-Lobato, Luca Martino, Valero Laparra, Alvaro Moreno, Gustau Camps-Valls

Radiative transfer models (RTMs) encode the energy transfer through the atmosphere, and are used to model and understand the Earth system, as well as to estimate the parameters that describe the status of the Earth from satellite observations by inverse modeling.

Earth Observation

Function-space Inference with Sparse Implicit Processes

1 code implementation • 14 Oct 2021 • Simón Rodríguez Santana, Bryan Zaldivar, Daniel Hernández-Lobato

The result is a scalable method for approximate inference with IPs that can tune the prior IP parameters to the data, and that provides accurate non-Gaussian predictive distributions.

Gaussian Processes

Input Dependent Sparse Gaussian Processes

no code implementations • 15 Jul 2021 • Bahram Jafrasteh, Carlos Villacampa-Calvo, Daniel Hernández-Lobato

For this, we use a neural network that receives the observed data as an input and outputs the inducing point locations and the parameters of the variational distribution $q$.

Gaussian Processes, Variational Inference
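
A sketch of the amortization idea described above: a network maps the observed batch to inducing-point locations and per-point variational parameters. The class name, architecture, and sizes below are illustrative assumptions, not the paper's implementation.

```python
# Amortized inducing points: the batch of inputs determines Z and the
# parameters (mean, scale) of the variational distribution q.
import torch
import torch.nn as nn

class InducingPointNet(nn.Module):
    def __init__(self, input_dim=1, num_inducing=16, hidden=64):
        super().__init__()
        self.num_inducing = num_inducing
        self.input_dim = input_dim
        self.net = nn.Sequential(
            nn.Linear(input_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, num_inducing * (input_dim + 2)),
        )

    def forward(self, x):
        # Average over the batch so the output depends on the observed data.
        out = self.net(x).mean(dim=0).view(self.num_inducing, self.input_dim + 2)
        z = out[:, :self.input_dim]                  # inducing-point locations
        q_mean = out[:, self.input_dim]              # mean of q at each point
        q_scale = out[:, self.input_dim + 1].exp()   # positive scale of q
        return z, q_mean, q_scale

x = torch.randn(128, 1)                              # a batch of observed inputs
z, m, s = InducingPointNet()(x)
print(z.shape, m.shape, s.shape)                     # (16, 1), (16,), (16,)
```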

Improved Max-value Entropy Search for Multi-objective Bayesian Optimization with Constraints

1 code implementation • 2 Nov 2020 • Daniel Fernández-Sánchez, Eduardo C. Garrido-Merchán, Daniel Hernández-Lobato

MESMOC+ is also competitive with other information-based methods for constrained multi-objective Bayesian optimization, but it is significantly faster.

Bayesian Optimization

Parallel Predictive Entropy Search for Multi-objective Bayesian Optimization with Constraints

1 code implementation • 1 Apr 2020 • Eduardo C. Garrido-Merchán, Daniel Hernández-Lobato

This article introduces PPESMOC, Parallel Predictive Entropy Search for Multi-objective Bayesian Optimization with Constraints, an information-based batch method for the simultaneous optimization of multiple expensive-to-evaluate black-box functions in the presence of several constraints.

Bayesian Optimization
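
Schematically, predictive-entropy-search acquisitions of this family score candidates by the expected reduction in the entropy of the optimal (feasible) Pareto set $\mathcal{X}^\star$; for a batch $B$ of points this reads as follows (a standard formulation, not quoted from the paper; the sequential PESMOC below is the single-point case):

```latex
\alpha(B) = H\bigl[\mathcal{X}^\star \mid \mathcal{D}\bigr]
- \mathbb{E}_{\mathbf{y}_B \mid \mathcal{D},\, B}
\Bigl[ H\bigl[\mathcal{X}^\star \mid \mathcal{D} \cup (B, \mathbf{y}_B)\bigr] \Bigr]
```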

Multi-class Gaussian Process Classification with Noisy Inputs

1 code implementation • 28 Jan 2020 • Carlos Villacampa-Calvo, Bryan Zaldivar, Eduardo C. Garrido-Merchán, Daniel Hernández-Lobato

The results obtained show that, although the classification error is similar across methods, the predictive distribution of the proposed methods is better, in terms of the test log-likelihood, than the predictive distribution of a classifier based on GPs that ignores input noise.

BIG-bench Machine Learning, Classification, +4

Adversarial $α$-divergence Minimization for Bayesian Approximate Inference

1 code implementation • 13 Sep 2019 • Simón Rodríguez Santana, Daniel Hernández-Lobato

Estimating the uncertainty in a model's predictions is critical in many applications, and one way to obtain this information is to follow a Bayesian approach and estimate a posterior distribution over the model parameters.

Bayesian Inference
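
For reference, the Bayesian recipe alluded to above: a posterior over the model parameters, and predictions obtained by averaging over it. Both quantities are generally intractable, which is what approximate inference schemes such as the one proposed here address:

```latex
p(\theta \mid \mathcal{D}) = \frac{p(\mathcal{D} \mid \theta)\, p(\theta)}{p(\mathcal{D})},
\qquad
p(y^\star \mid x^\star, \mathcal{D}) =
\int p(y^\star \mid x^\star, \theta)\, p(\theta \mid \mathcal{D})\, \mathrm{d}\theta
```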

Bayesian optimization of the PC algorithm for learning Gaussian Bayesian networks

no code implementations • 28 Jun 2018 • Irene Córdoba, Eduardo C. Garrido-Merchán, Daniel Hernández-Lobato, Concha Bielza, Pedro Larrañaga

We show that the parameters found by a BO method outperform those found by a random search strategy and the expert recommendation.

Bayesian Optimization

Dealing with Categorical and Integer-valued Variables in Bayesian Optimization with Gaussian Processes

1 code implementation • 9 May 2018 • Eduardo C. Garrido-Merchán, Daniel Hernández-Lobato

We show that this can lead to problems in the optimization process and describe a more principled approach to account for input variables that are categorical or integer-valued.

Bayesian Optimization, Gaussian Processes
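
The gist of the kernel-level trick for discrete inputs, as a hedged numpy sketch: apply the snapping transformation (rounding for integers, one-hot snapping for categories) inside the covariance function, so the surrogate model is constant over inputs that map to the same valid value. Details differ from the paper's exact construction.

```python
# Transformed kernel for integer-valued inputs: round *inside* the kernel.
import numpy as np

def rbf(a, b, lengthscale=1.0):
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def transformed_kernel(x1, x2, integer=True):
    t = np.round if integer else (lambda v: v)   # T(x): snap to valid values
    return rbf(t(np.asarray(x1, float)), t(np.asarray(x2, float)))

x = np.array([1.9, 2.1, 2.6])
# 1.9 and 2.1 both round to 2, so their kernel value is exactly 1:
print(transformed_kernel(x, x))
```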

Dealing with Integer-valued Variables in Bayesian Optimization with Gaussian Processes

1 code implementation • 12 Jun 2017 • Eduardo C. Garrido-Merchán, Daniel Hernández-Lobato

We show that this can lead to problems in the optimization process and describe a more principled approach to account for input variables that are integer-valued.

Bayesian Optimization, Gaussian Processes

Predictive Entropy Search for Multi-objective Bayesian Optimization with Constraints

no code implementations • 5 Sep 2016 • Eduardo C. Garrido-Merchán, Daniel Hernández-Lobato

This work presents PESMOC, Predictive Entropy Search for Multi-objective Bayesian Optimization with Constraints, an information-based strategy for the simultaneous optimization of multiple expensive-to-evaluate black-box functions in the presence of several constraints.

Bayesian Optimization

Deep Gaussian Processes for Regression using Approximate Expectation Propagation

no code implementations • 12 Feb 2016 • Thang D. Bui, Daniel Hernández-Lobato, Yingzhen Li, José Miguel Hernández-Lobato, Richard E. Turner

Deep Gaussian processes (DGPs) are multi-layer hierarchical generalisations of Gaussian processes (GPs) and are formally equivalent to neural networks with multiple, infinitely wide hidden layers.

Gaussian Processes, Regression
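
Schematically, a depth-$L$ DGP composes GP-distributed layers, each layer taking the previous layer's output as its input:

```latex
h_0 = x, \qquad f^{(\ell)} \sim \mathcal{GP}\bigl(m_\ell, k_\ell\bigr), \qquad
h_\ell = f^{(\ell)}(h_{\ell-1}), \qquad \ell = 1, \dots, L
```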

Predictive Entropy Search for Multi-objective Bayesian Optimization

no code implementations • 17 Nov 2015 • Daniel Hernández-Lobato, José Miguel Hernández-Lobato, Amar Shah, Ryan P. Adams

The results show that PESMO produces better recommendations with a smaller number of evaluations of the objectives, and that a decoupled evaluation can lead to improvements in performance, particularly when the number of objectives is large.

Bayesian Optimization

Training Deep Gaussian Processes using Stochastic Expectation Propagation and Probabilistic Backpropagation

no code implementations • 11 Nov 2015 • Thang D. Bui, José Miguel Hernández-Lobato, Yingzhen Li, Daniel Hernández-Lobato, Richard E. Turner

Deep Gaussian processes (DGPs) are multi-layer hierarchical generalisations of Gaussian processes (GPs) and are formally equivalent to neural networks with multiple, infinitely wide hidden layers.

Gaussian Processes

Black-box $α$-divergence Minimization

3 code implementations • 10 Nov 2015 • José Miguel Hernández-Lobato, Yingzhen Li, Mark Rowland, Daniel Hernández-Lobato, Thang Bui, Richard E. Turner

Black-box alpha (BB-$\alpha$) is a new approximate inference method based on the minimization of $\alpha$-divergences.

General Classification, Regression
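
One commonly quoted form of the BB-$\alpha$ energy (the simplified version popularized in follow-up work; the paper's exact objective includes additional normalizer terms) is:

```latex
\mathcal{L}_\alpha(q) = \mathrm{KL}\bigl[q(\theta) \,\|\, p(\theta)\bigr]
- \frac{1}{\alpha} \sum_{n=1}^{N} \log \mathbb{E}_{q(\theta)}
\bigl[ p(y_n \mid x_n, \theta)^{\alpha} \bigr]
```

As $\alpha \to 0$ this recovers the standard variational ELBO, while $\alpha = 1$ corresponds to an EP-like objective, so a single hyperparameter interpolates between the two regimes.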

Scalable Gaussian Process Classification via Expectation Propagation

no code implementations • 16 Jul 2015 • Daniel Hernández-Lobato, José Miguel Hernández-Lobato

Variational methods have been recently considered for scaling the training process of Gaussian process classifiers to large datasets.

Classification, General Classification

Non-linear Causal Inference using Gaussianity Measures

no code implementations • 16 Sep 2014 • Daniel Hernández-Lobato, Pablo Morales-Mombiela, David Lopez-Paz, Alberto Suárez

The problem of non-linear causal inference is addressed by performing an embedding in an expanded feature space, in which the relation between causes and effects can be assumed to be linear.

Causal Inference
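
A sketch of the embedding idea: map the candidate cause into random Fourier features so a linear fit can capture a non-linear relation, then compare the Gaussianity of the residuals in both directions. The statistic (excess kurtosis) and the decision rule, namely that residuals in the anticausal direction look more Gaussian, are simplified assumptions in the spirit of the paper, not its exact procedure.

```python
# Compare residual non-Gaussianity in both directions after a linear fit
# in an expanded (random Fourier) feature space.
import numpy as np
from scipy.stats import kurtosis

def residual_nongaussianity(cause, effect, n_features=100, seed=0):
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(n_features)            # random frequencies
    b = rng.uniform(0, 2 * np.pi, n_features)      # random phases
    phi = np.cos(cause[:, None] * w + b)           # expanded feature space
    beta, *_ = np.linalg.lstsq(phi, effect, rcond=None)
    return abs(kurtosis(effect - phi @ beta))      # |excess kurtosis|

rng = np.random.default_rng(1)
x = rng.uniform(-2, 2, 500)
y = np.sin(2 * x) + rng.uniform(-0.3, 0.3, 500)    # non-Gaussian noise, x -> y
forward = residual_nongaussianity(x, y)
backward = residual_nongaussianity(y, x)
print("x -> y" if forward > backward else "y -> x")
```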

Mind the Nuisance: Gaussian Process Classification using Privileged Noise

no code implementations • NeurIPS 2014 • Daniel Hernández-Lobato, Viktoriia Sharmanska, Kristian Kersting, Christoph H. Lampert, Novi Quadrianto

That is, in contrast to the standard GPC setting, the latent function is not just a nuisance but a feature: it becomes a natural measure of confidence about the training data by modulating the slope of the GPC sigmoid likelihood function.

Classification, General Classification
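
Schematically, one way to realize this slope modulation (a hedged paraphrase, not the paper's exact likelihood) is to let a second latent function $g$, driven by the privileged information $x^\star$, rescale the argument of the sigmoid:

```latex
p(y = 1 \mid x, x^\star) =
\sigma\!\left( \frac{f(x)}{\exp\bigl(g(x^\star)\bigr)} \right),
\qquad f \sim \mathcal{GP}, \quad g \sim \mathcal{GP}
```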

Learning Feature Selection Dependencies in Multi-task Learning

no code implementations • NeurIPS 2013 • Daniel Hernández-Lobato, José Miguel Hernández-Lobato

Because estimating feature selection dependencies in the proposed model may suffer from over-fitting, additional data from a multi-task learning scenario are considered for induction.

Feature Selection, Multi-Task Learning
