Despite the recent progress in hyperparameter optimization (HPO), available benchmarks that resemble real-world scenarios consist of only a few, very large problem instances that are expensive to solve.
Conventional models tend to forget the knowledge of previous tasks while learning a new task, a phenomenon known as catastrophic forgetting.
Multi-fidelity methods are prominently used when cheaply-obtained, but possibly biased and noisy, observations must be effectively combined with limited or expensive true data in order to construct reliable models.
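The multi-fidelity idea above can be sketched in a few lines. This is a minimal illustration, not any specific paper's method: it assumes a simple linear-correction model (high-fidelity ≈ rho × low-fidelity + linear bias), with the toy functions `f_lo` and `f_hi` invented for the example.

```python
import numpy as np

# Toy two-fidelity setup (both functions are made up for illustration):
# f_lo is cheap but biased, f_hi is the expensive ground truth.
f_lo = lambda x: np.sin(8 * x) + 0.3 * x
f_hi = lambda x: 2.0 * np.sin(8 * x) + x ** 2

# Only a handful of expensive high-fidelity evaluations are available.
x_hi = np.linspace(0.0, 1.0, 6)

# Fit f_hi(x) ~= rho * f_lo(x) + b0 + b1 * x by least squares.
A = np.column_stack([f_lo(x_hi), np.ones_like(x_hi), x_hi])
coef, *_ = np.linalg.lstsq(A, f_hi(x_hi), rcond=None)

def f_mf(x):
    # Fused surrogate: cheap evaluations corrected by the fitted scale/bias.
    return coef[0] * f_lo(x) + coef[1] + coef[2] * x

x_test = np.linspace(0.0, 1.0, 50)
err_lo = np.abs(f_lo(x_test) - f_hi(x_test)).mean()
err_mf = np.abs(f_mf(x_test) - f_hi(x_test)).mean()
print(err_lo, err_mf)
```

With six expensive samples, the corrected surrogate tracks the true function far better than the raw low-fidelity model alone, which is exactly the leverage multi-fidelity methods exploit.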
We tackle the problem of optimizing a black-box objective function defined over a highly-structured input space.
Intrinsic Gaussian processes (in-GPs) respect the potentially complex boundary or interior conditions as well as the intrinsic geometry of the spaces.
The spatio-temporal fields of protein concentration and mRNA expression are reconstructed without explicitly solving the partial differential equation.
Unsupervised learning on imbalanced data is challenging because, when given imbalanced data, current models are often dominated by the majority category and ignore categories with few samples.
We develop a scalable deep non-parametric generative model by augmenting deep Gaussian processes with a recognition model.
The Gaussian process latent variable model (GP-LVM) is a popular approach to non-linear probabilistic dimensionality reduction.
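The core of the GP-LVM can be sketched compactly: treat the latent inputs as free parameters and maximize the GP marginal likelihood over them. The sketch below is a bare-bones illustration (RBF kernel, fixed noise, numerical gradients); a practical implementation would use analytic gradients and optimize kernel hyperparameters too.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
Y = rng.normal(size=(20, 5))          # toy data: 20 points, 5 observed dims
N, D, Q = Y.shape[0], Y.shape[1], 2   # Q = latent dimensionality

def neg_log_marginal(x_flat, Y, noise=0.1):
    # Negative GP log marginal likelihood, summed over the D output dims,
    # as a function of the latent coordinates X (constants dropped).
    X = x_flat.reshape(N, Q)
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-0.5 * sq) + noise * np.eye(N)   # RBF kernel + noise
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(K, Y)
    return 0.5 * (Y * alpha).sum() + D * np.log(np.diag(L)).sum()

x0 = rng.normal(scale=0.1, size=N * Q)          # random latent init
res = minimize(neg_log_marginal, x0, args=(Y,), method="L-BFGS-B")
X_latent = res.x.reshape(N, Q)                  # learned 2-D embedding
print(X_latent.shape)
```

Note that with a linear kernel this objective recovers probabilistic PCA; the nonlinear kernel is what makes the GP-LVM a nonlinear dimensionality-reduction method.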
In this work, we present an extension of Gaussian process (GP) models with sophisticated parallelization and GPU acceleration.
In this work we introduce a mixture of GPs to address the data association problem, i.e., to label a group of observations according to the sources that generated them.
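The data-association idea can be illustrated with an EM-style alternation between soft assignments and per-source fits. For brevity this sketch substitutes two linear regressors for the GPs (the association logic is the same); all data and parameters here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(-1.0, 1.0, 200)
z = rng.integers(0, 2, 200)                      # true (hidden) source labels
y = np.where(z == 0, 2 * x + 1, -2 * x - 1) + 0.1 * rng.normal(size=200)

sigma = 0.3                                      # assumed observation noise
w = np.array([[1.5, 0.5], [-1.5, -0.5]])         # [slope, intercept] per source
for _ in range(20):
    # E-step: responsibility of each source for each observation.
    pred = w[:, 0][:, None] * x + w[:, 1][:, None]
    resp = np.exp(-0.5 * (y - pred) ** 2 / sigma ** 2)
    resp /= resp.sum(0, keepdims=True)
    # M-step: responsibility-weighted least squares per source.
    for k in range(2):
        sw = np.sqrt(resp[k])
        A = np.column_stack([x, np.ones_like(x)]) * sw[:, None]
        w[k] = np.linalg.lstsq(A, y * sw, rcond=None)[0]

labels = resp.argmax(0)                          # hard assignments at the end
print(np.sort(w[:, 0]))                          # recovered slopes
```

In a mixture of GPs, the M-step fits a GP per source with responsibility-weighted data, but the association mechanism is identical.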