Search Results for author: Victor Picheny

Found 22 papers, 3 papers with code

Bayesian optimization under mixed constraints with a slack-variable augmented Lagrangian

no code implementations NeurIPS 2016 Victor Picheny, Robert B. Gramacy, Stefan M. Wild, Sebastien Le Digabel

An augmented Lagrangian (AL) can convert a constrained optimization problem into a sequence of simpler (e.g., unconstrained) problems, which are then usually solved with local solvers.

Bayesian Optimization
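
To make the augmented-Lagrangian idea above concrete, here is a minimal numerical sketch of a generic textbook AL loop for inequality constraints; it is not the slack-variable, Bayesian-optimization variant proposed in the paper, and the toy objective and constraint are illustrative only.

```python
# Generic augmented-Lagrangian (AL) loop: the constrained problem is replaced by a
# sequence of unconstrained subproblems whose multipliers are updated between solves.
import numpy as np
from scipy.optimize import minimize

def f(x):                      # toy objective: a shifted quadratic
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

def c(x):                      # inequality constraint c(x) <= 0
    return np.array([x[0] + x[1] - 2.0])

def augmented_lagrangian(x, lam, rho):
    # Inequality AL (equivalent to the slack-variable form after eliminating slacks)
    viol = np.maximum(0.0, c(x) + lam / rho)
    return f(x) + 0.5 * rho * np.sum(viol ** 2) - np.sum(lam ** 2) / (2.0 * rho)

x, lam, rho = np.zeros(2), np.zeros(1), 1.0
for _ in range(10):            # outer loop: solve subproblem, then update multipliers
    res = minimize(augmented_lagrangian, x, args=(lam, rho))  # local, unconstrained solve
    x = res.x
    lam = np.maximum(0.0, lam + rho * c(x))   # multiplier update
    rho *= 2.0                                # tighten the penalty
print(x)                       # approaches the constrained minimizer (0.5, 1.5)
```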

A Bayesian optimization approach to find Nash equilibria

no code implementations 8 Nov 2016 Victor Picheny, Mickael Binois, Abderrahmane Habbal

Game theory nowadays finds a broad range of applications in engineering and machine learning.

Bayesian Optimization

Budgeted Multi-Objective Optimization with a Focus on the Central Part of the Pareto Front -- Extended Version

no code implementations 27 Sep 2018 David Gaudrie, Rodolphe Le Riche, Victor Picheny, Benoit Enaux, Vincent Herbert

When the number of experiments is severely restricted and/or when the number of objectives increases, uncovering the whole set of Pareto optimal solutions is out of reach, even for surrogate-based approaches: the proposed solutions are sub-optimal or do not cover the front well.

A Review on Quantile Regression for Stochastic Computer Experiments

no code implementations 23 Jan 2019 Léonard Torossian, Victor Picheny, Robert Faivre, Aurélien Garivier

We report on an empirical study of the main strategies for quantile regression in the context of stochastic computer experiments.

regression
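
For orientation, quantile regression targets a conditional τ-quantile rather than a conditional mean. The sketch below shows the pinball (check) loss, whose minimiser over constants is the empirical τ-quantile; it is a generic illustration, not one of the specific strategies compared in the review.

```python
# Minimising the pinball loss over a constant recovers the empirical tau-quantile.
import numpy as np
from scipy.optimize import minimize_scalar

def pinball_loss(q, y, tau):
    # tau * (y - q) for under-predictions, (1 - tau) * (q - y) for over-predictions
    diff = y - q
    return np.mean(np.maximum(tau * diff, (tau - 1.0) * diff))

rng = np.random.default_rng(0)
y = rng.lognormal(size=2000)          # skewed toy sample
tau = 0.9
q_hat = minimize_scalar(pinball_loss, args=(y, tau)).x
print(q_hat, np.quantile(y, tau))     # the two estimates nearly coincide
```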

The Kalai-Smorodinsky solution for many-objective Bayesian optimization

no code implementations 18 Feb 2019 Mickaël Binois, Victor Picheny, Patrick Taillandier, Abderrahmane Habbal

An ongoing aim of research in multiobjective Bayesian optimization is to extend its applicability to a large number of objectives.

Bayesian Optimization

X-Armed Bandits: Optimizing Quantiles, CVaR and Other Risks

no code implementations 17 Apr 2019 Léonard Torossian, Aurélien Garivier, Victor Picheny

We finally present numerical experiments that show a dramatic impact of tight bounds for the optimization of quantiles and CVaR.

Decision Making
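
The two risk measures targeted above can be illustrated through their empirical versions. The snippet below assumes the samples are losses, takes VaR_alpha as the alpha-quantile and CVaR_alpha as the average loss beyond it (one common convention among several).

```python
# Empirical quantile (VaR) and CVaR of a sample of losses.
import numpy as np

def empirical_quantile(samples, alpha):
    return np.quantile(samples, alpha)

def empirical_cvar(samples, alpha):
    var = np.quantile(samples, alpha)          # Value-at-Risk at level alpha
    tail = samples[samples >= var]             # losses in the upper tail
    return tail.mean()                         # Conditional Value-at-Risk

rng = np.random.default_rng(1)
losses = rng.standard_normal(10_000) ** 2      # toy right-skewed losses
print(empirical_quantile(losses, 0.95), empirical_cvar(losses, 0.95))
```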

Ordinal Bayesian Optimisation

no code implementations 5 Dec 2019 Victor Picheny, Sattar Vakili, Artem Artemev

Bayesian optimisation is a powerful tool for solving expensive black-box problems, but it fails when the stationarity assumption made on the objective function is strongly violated, as is the case in particular for ill-conditioned or discontinuous objectives.

Bayesian Optimisation Thompson Sampling

Bayesian Quantile and Expectile Optimisation

no code implementations 12 Jan 2020 Victor Picheny, Henry Moss, Léonard Torossian, Nicolas Durrande

In this paper, we propose new variational models for Bayesian quantile and expectile regression that are well-suited for heteroscedastic noise settings.

Bayesian Optimisation Gaussian Processes +1
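
As a companion to the quantile case, the expectile at level τ minimises an asymmetrically weighted squared loss; a minimal numerical sketch follows. The paper's actual contribution, variational GP models suited to heteroscedastic noise, is not reproduced here.

```python
# The tau-expectile minimises an asymmetric squared loss (tau = 0.5 gives the mean).
import numpy as np
from scipy.optimize import minimize_scalar

def expectile_loss(e, y, tau):
    w = np.where(y >= e, tau, 1.0 - tau)       # heavier weight on one side
    return np.mean(w * (y - e) ** 2)

rng = np.random.default_rng(2)
y = rng.gamma(shape=2.0, size=5000)            # skewed toy sample
for tau in (0.1, 0.5, 0.9):
    e_hat = minimize_scalar(expectile_loss, args=(y, tau)).x
    print(tau, e_hat)                          # tau = 0.5 recovers the sample mean
```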

Scalable Thompson Sampling using Sparse Gaussian Process Models

no code implementations NeurIPS 2021 Sattar Vakili, Henry Moss, Artem Artemev, Vincent Dutordoir, Victor Picheny

We provide theoretical guarantees and show that the drastic reduction in computational complexity of scalable TS can be enjoyed without loss in the regret performance over the standard TS.

Thompson Sampling
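
A compact sketch of the general idea, Thompson sampling driven by a sparse, inducing-point GP posterior, is given below. It uses a standard DTC approximation over a candidate grid and does not reproduce the paper's specific sampling scheme or its regret analysis; the kernel, noise level and toy objective are assumptions for illustration.

```python
# Thompson sampling (TS) with a sparse GP: one posterior draw drives each query.
import numpy as np

def rbf(a, b, ls=0.2):
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-0.5 * d2 / ls ** 2)

def dtc_posterior(X, y, Z, Xs, noise=1e-2):
    # Standard DTC predictive mean/covariance with inducing inputs Z.
    Kuu = rbf(Z, Z) + 1e-8 * np.eye(len(Z))
    Kuf, Kus, Kss = rbf(Z, X), rbf(Z, Xs), rbf(Xs, Xs)
    Sigma = Kuu + Kuf @ Kuf.T / noise
    mean = Kus.T @ np.linalg.solve(Sigma, Kuf @ y) / noise
    cov = Kss - Kus.T @ np.linalg.solve(Kuu, Kus) + Kus.T @ np.linalg.solve(Sigma, Kus)
    return mean, cov

rng = np.random.default_rng(3)
def f(x):                                      # noisy 1-D objective to maximise
    return np.sin(3 * x) + 0.1 * rng.standard_normal(len(x))

X = rng.uniform(0, 1, 5)                       # initial design
y = f(X)
Xs = np.linspace(0, 1, 200)                    # candidate grid
Z = np.linspace(0, 1, 10)                      # fixed inducing inputs

for _ in range(20):                            # TS loop
    mean, cov = dtc_posterior(X, y, Z, Xs)
    sample = rng.multivariate_normal(mean, cov + 1e-6 * np.eye(len(Xs)))
    x_next = Xs[np.argmax(sample)]             # maximise one posterior draw
    X, y = np.append(X, x_next), np.append(y, f(np.array([x_next])))

print(Xs[np.argmax(dtc_posterior(X, y, Z, Xs)[0])])   # believed maximiser (true optimum near 0.52)
```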

On Information Gain and Regret Bounds in Gaussian Process Bandits

no code implementations 15 Sep 2020 Sattar Vakili, Kia Khezeli, Victor Picheny

For the Matérn family of kernels, where the lower bounds on $\gamma_T$, and the regret under the frequentist setting, are known, our results close a gap between the upper and lower bounds that is polynomial in $T$ (up to factors logarithmic in $T$).
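
For orientation, the quantity involved is recalled below from the GP-bandit literature (definitions only, not the paper's specific results; see the paper for its exact statements and constants).

```latex
% Maximal information gain after $T$ rounds, for a kernel with Gram matrix $K_A$
% and noise variance $\sigma^2$:
\[
  \gamma_T \;=\; \max_{A \subset \mathcal{X},\, |A| = T}\;
  \tfrac{1}{2}\,\log\det\!\bigl(I + \sigma^{-2} K_A\bigr).
\]
% Regret bounds in this literature are typically stated in terms of $\gamma_T$,
% scaling for instance as $\sqrt{T\gamma_T}$ or $\gamma_T\sqrt{T}$ depending on the
% setting, so sharper upper bounds on $\gamma_T$ for the Mat\'ern family translate
% directly into sharper regret bounds and a smaller gap to the known lower bounds.
```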

TREGO: a Trust-Region Framework for Efficient Global Optimization

no code implementations 18 Jan 2021 Youssef Diouane, Victor Picheny, Rodolphe Le Riche, Alexandre Scotto Di Perrotolo

By following a classical scheme for the trust region (based on a sufficient decrease condition), the proposed algorithm enjoys global convergence properties, while departing from EGO only for a subset of optimization steps.

Bayesian Optimization
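
The trust-region bookkeeping described above can be sketched as follows. The inner global/local acquisition steps (EGO and so on) are abstracted into a user-supplied `propose` function, and the names and constants are illustrative rather than those of TREGO.

```python
# Sufficient-decrease trust-region management: accept and enlarge on success, shrink on failure.
import numpy as np

def trego_like_loop(f, x0, propose, n_iter=30,
                    radius=0.5, gamma_inc=2.0, gamma_dec=0.5, eta=1e-3):
    x_best = np.asarray(x0, float)
    f_best = f(x_best)
    for _ in range(n_iter):
        # `propose` returns the next candidate, restricted to the current trust region
        x_new = propose(x_best, radius)
        f_new = f(x_new)
        if f_new <= f_best - eta * radius:       # sufficient decrease: accept and enlarge
            x_best, f_best = x_new, f_new
            radius *= gamma_inc
        else:                                    # failure: shrink the region
            radius *= gamma_dec
    return x_best, f_best

# Toy usage with a random local proposal standing in for the surrogate-based step:
rng = np.random.default_rng(5)
f = lambda x: float(np.sum((x - 0.3) ** 2))
propose = lambda x, r: x + rng.uniform(-r, r, size=x.shape)
print(trego_like_loop(f, np.zeros(2), propose))
```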

Revisiting Bayesian Optimization in the light of the COCO benchmark

1 code implementation 30 Mar 2021 Rodolphe Le Riche, Victor Picheny

It is commonly believed that Bayesian optimization (BO) algorithms are highly efficient for optimizing numerically costly functions.

Bayesian Optimization

Information-theoretic Inducing Point Placement for High-throughput Bayesian Optimisation

no code implementations 6 Jun 2022 Henry B. Moss, Sebastian W. Ober, Victor Picheny

By choosing inducing points to maximally reduce both global uncertainty and uncertainty in the maximum value of the objective function, we build surrogate models able to support high-precision high-throughput BO.

Bayesian Optimisation Gaussian Processes +1
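
The first ingredient mentioned above, reducing global uncertainty, can be illustrated with a greedy selection of inducing points by residual (Nyström) variance. The paper's full criterion, which also targets uncertainty in the objective's maximum, is not shown, and the kernel and data below are illustrative assumptions.

```python
# Greedy inducing-point selection: repeatedly add the candidate with the largest
# residual variance left by the current inducing set.
import numpy as np

def rbf(a, b, ls=0.2):
    d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * d2 / ls ** 2)

def greedy_inducing_points(X, m):
    chosen = [0]                                       # start from an arbitrary point
    for _ in range(m - 1):
        Z = X[chosen]
        Kuu = rbf(Z, Z) + 1e-8 * np.eye(len(Z))
        Kux = rbf(Z, X)
        # Residual variance at every candidate after conditioning on the current set
        resid = 1.0 - np.sum(Kux * np.linalg.solve(Kuu, Kux), axis=0)
        resid[chosen] = -np.inf                        # do not pick a point twice
        chosen.append(int(np.argmax(resid)))
    return X[chosen]

X = np.random.default_rng(7).uniform(0, 1, size=(400, 2))
print(greedy_inducing_points(X, m=8))                  # well-spread inducing inputs
```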

A penalisation method for batch multi-objective Bayesian optimisation with application in heat exchanger design

1 code implementation 27 Jun 2022 Andrei Paleyes, Henry B. Moss, Victor Picheny, Piotr Zulawski, Felix Newman

We present HIghly Parallelisable Pareto Optimisation (HIPPO) -- a batch acquisition function that enables multi-objective Bayesian optimisation methods to efficiently exploit parallel processing resources.

Bayesian Optimisation
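
The general penalisation idea behind such batch acquisition functions is sketched below: after each batch member is chosen, the acquisition is down-weighted around it so the next member explores elsewhere. This generic distance-based penaliser is not the HIPPO criterion itself, and the acquisition values are random placeholders.

```python
# Sequential batch building with a multiplicative penalty around chosen points.
import numpy as np

def select_batch(candidates, acquisition, batch_size, lengthscale=0.1):
    acq = acquisition.astype(float).copy()
    batch = []
    for _ in range(batch_size):
        idx = int(np.argmax(acq))
        batch.append(candidates[idx])
        # Shrink the acquisition near the newly chosen point.
        dist2 = np.sum((candidates - candidates[idx]) ** 2, axis=1)
        acq *= 1.0 - np.exp(-0.5 * dist2 / lengthscale ** 2)
    return np.array(batch)

rng = np.random.default_rng(6)
candidates = rng.uniform(0, 1, size=(500, 2))       # candidate pool in 2-D
acquisition = rng.uniform(size=500)                 # placeholder acquisition values
print(select_batch(candidates, acquisition, batch_size=4))
```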

Fantasizing with Dual GPs in Bayesian Optimization and Active Learning

no code implementations 2 Nov 2022 Paul E. Chang, Prakhar Verma, ST John, Victor Picheny, Henry Moss, Arno Solin

Gaussian processes (GPs) are the main surrogate models used in sequential decision-making tasks such as Bayesian Optimization and Active Learning.

Active Learning Bayesian Optimization +1

Inducing Point Allocation for Sparse Gaussian Processes in High-Throughput Bayesian Optimisation

no code implementations 24 Jan 2023 Henry B. Moss, Sebastian W. Ober, Victor Picheny

Sparse Gaussian Processes are a key component of high-throughput Bayesian Optimisation (BO) loops; however, we show that existing methods for allocating their inducing points severely hamper optimisation performance.

Bayesian Optimisation Decision Making +2

Spherical Inducing Features for Orthogonally-Decoupled Gaussian Processes

no code implementations 27 Apr 2023 Louis C. Tiao, Vincent Dutordoir, Victor Picheny

Despite their many desirable properties, Gaussian processes (GPs) are often compared unfavorably to deep neural networks (NNs) for lacking the ability to learn representations.

Gaussian Processes

Combining additivity and active subspaces for high-dimensional Gaussian process modeling

no code implementations 6 Feb 2024 Mickael Binois, Victor Picheny

Gaussian processes are a widely embraced technique for regression and classification due to their good prediction accuracy, analytical tractability and built-in capabilities for uncertainty quantification.

Gaussian Processes Uncertainty Quantification
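
To make the combination in the title concrete, the sketch below uses a kernel that is additive across the coordinates of a low-dimensional linear projection (an "active subspace") of the inputs. The projection matrix is random purely for illustration, whereas in practice it would be estimated, and this is not the paper's exact model.

```python
# Additive kernel over the coordinates of an active-subspace projection.
import numpy as np

def additive_active_subspace_kernel(X1, X2, W, ls=1.0):
    # Project, then sum one-dimensional RBF kernels over the projected coordinates.
    P1, P2 = X1 @ W, X2 @ W                            # shapes (n1, r), (n2, r)
    K = np.zeros((len(X1), len(X2)))
    for j in range(W.shape[1]):
        d2 = (P1[:, j:j + 1] - P2[:, j]) ** 2
        K += np.exp(-0.5 * d2 / ls ** 2)
    return K

rng = np.random.default_rng(8)
d, r = 50, 3                                           # ambient vs. active dimension
W = np.linalg.qr(rng.standard_normal((d, r)))[0]       # orthonormal projection
X = rng.standard_normal((20, d))
print(additive_active_subspace_kernel(X, X, W).shape)  # (20, 20) Gram matrix
```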
