no code implementations • 22 May 2024 • François Bachoc, Nicolò Cesa-Bianchi, Tommaso Cesari, Roberto Colomboni
In online bilateral trade, a platform posts prices to incoming pairs of buyers and sellers, each holding a private valuation for a certain good.
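A minimal sketch of this interaction model, assuming hypothetical uniform valuations and a fixed posted price (both are illustrative assumptions, not the paper's setting or algorithm):

```python
import random

def bilateral_trade_round(price, rng):
    """One interaction: a buyer/seller pair with private valuations.

    A trade occurs when the seller is willing to sell (s <= price) and the
    buyer is willing to buy (b >= price); the platform's reward is the gain
    from trade b - s, and 0 otherwise.
    """
    s, b = rng.random(), rng.random()  # hypothetical U[0, 1] valuations
    traded = s <= price <= b
    return (b - s) if traded else 0.0

rng = random.Random(0)
T = 10_000
price = 0.5  # a fixed posted price, for illustration only
total = sum(bilateral_trade_round(price, rng) for _ in range(T))
print(f"average gain from trade at p={price}: {total / T:.3f}")
```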
no code implementations • 22 May 2024 • François Bachoc, Tommaso Cesari, Roberto Colomboni
- If only the traders' willingness to sell or buy at the proposed price is revealed after each interaction, we provide an algorithm achieving $O(\sqrt{LdT \ln T})$ regret, and show that this rate is optimal (up to logarithmic factors) via a lower bound of $\Omega(\sqrt{LdT})$.
1 code implementation • 13 Oct 2023 • Julien Demange-Chryst, François Bachoc, Jérôme Morio, Timothé Krauth
In this contribution, we suggest using a distribution parameterised by a variational autoencoder as the approximating model.
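As a rough illustration only, a minimal PyTorch VAE fitted to toy samples could serve as such a parametric family; the architecture, data, and training loop below are all invented for the sketch, and the importance-sampling weights used in the paper are omitted:

```python
import torch
import torch.nn as nn

class VAE(nn.Module):
    """A minimal VAE used as a flexible parametric family of distributions."""
    def __init__(self, dim=2, latent=2, hidden=64):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2 * latent))
        self.dec = nn.Sequential(nn.Linear(latent, hidden), nn.ReLU(),
                                 nn.Linear(hidden, dim))
        self.latent = latent

    def forward(self, x):
        mu, logvar = self.enc(x).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization
        return self.dec(z), mu, logvar

    def sample(self, n):
        return self.dec(torch.randn(n, self.latent))

x = torch.randn(1024, 2) + 3.0  # toy samples standing in for the target
model = VAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):
    recon, mu, logvar = model(x)
    kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1).mean()
    loss = (recon - x).pow(2).sum(-1).mean() + kl  # ELBO, Gaussian decoder
    opt.zero_grad(); loss.backward(); opt.step()

proposals = model.sample(4096)  # candidate proposals from the fitted model
```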
1 code implementation • 28 Aug 2023 • François Bachoc, Louis Béthune, Alberto González-Sanz, Jean-Michel Loubes
In this paper, we improve the learning theory of kernel distribution regression.
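A hedged sketch of the two-stage pipeline studied in this literature, assuming an MMD-based Gaussian kernel between empirical samples followed by kernel ridge regression (the toy data and bandwidths are invented for illustration):

```python
import numpy as np

def mmd2(X, Y, gamma=1.0):
    """Squared MMD between two empirical samples under an RBF base kernel."""
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)
    return k(X, X).mean() + k(Y, Y).mean() - 2 * k(X, Y).mean()

rng = np.random.default_rng(0)
# each "input" is a bag of points drawn from N(m, 1); the label is m
means = rng.uniform(-2, 2, size=30)
bags = [rng.normal(m, 1.0, size=(50, 1)) for m in means]

G = np.array([[np.exp(-mmd2(a, b)) for b in bags] for a in bags])  # kernel on distributions
alpha = np.linalg.solve(G + 1e-3 * np.eye(len(bags)), means)       # kernel ridge regression

test = rng.normal(1.0, 1.0, size=(50, 1))
pred = np.exp(-np.array([mmd2(test, b) for b in bags])) @ alpha
print(f"predicted mean: {pred:.2f} (truth 1.0)")
```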
no code implementations • 12 Oct 2022 • François Bachoc, Louis Béthune, Alberto Gonzalez-Sanz, Jean-Michel Loubes
We present a novel kernel over the space of probability measures based on the dual formulation of optimal regularized transport.
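A sketch of the main ingredient, assuming a hand-rolled Sinkhorn solver and a naive Gaussian-type kernel built on the regularized OT cost; the paper's kernel rests on the dual formulation and comes with guarantees that this toy version does not claim:

```python
import numpy as np

def sinkhorn_cost(x, y, reg=0.1, iters=200):
    """Entropy-regularized OT cost between two empirical measures (Sinkhorn)."""
    a = np.full(len(x), 1 / len(x))
    b = np.full(len(y), 1 / len(y))
    M = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)  # squared-distance cost
    K = np.exp(-M / reg)
    u = np.ones_like(a)
    for _ in range(iters):  # alternating scaling iterations
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]  # regularized transport plan
    return (P * M).sum()

rng = np.random.default_rng(0)
mu = rng.normal(0, 1, size=(40, 2))
nu = rng.normal(1, 1, size=(40, 2))
k = np.exp(-sinkhorn_cost(mu, nu))  # illustrative kernel value between measures
print(f"k(mu, nu) = {k:.3f}")
```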
no code implementations • 21 Sep 2022 • François Bachoc, Christoph Muehlmann, Klaus Nordhausen, Joni Virta
Non-stationary source separation is a well-established branch of blind source separation with many different methods.
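One classical instance of the idea (a textbook sketch, not any specific method from the paper): when source variances change over time, covariance matrices computed on two time blocks can be jointly diagonalized by a generalized eigendecomposition, recovering the sources up to scale and order:

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
T = 2000
t = np.arange(T)
# two sources whose variances drift over time (non-stationarity)
s1 = rng.normal(size=T) * (1 + 0.9 * np.sin(2 * np.pi * t / T))
s2 = rng.normal(size=T) * (1 + 0.9 * np.cos(2 * np.pi * t / T))
S = np.vstack([s1, s2])
A = np.array([[1.0, 0.6], [0.4, 1.0]])  # unknown mixing matrix
X = A @ S

# block covariances differ because the sources are non-stationary; the
# generalized eigendecomposition jointly diagonalizes the pair
C1 = np.cov(X[:, : T // 2])
C2 = np.cov(X[:, T // 2 :])
_, W = eigh(C1, C2)   # unmixing directions, up to scale and permutation
S_hat = W.T @ X       # recovered sources
```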
no code implementations • 2 Sep 2022 • François Bachoc, Tommaso Cesari, Roberto Colomboni, Andrea Paudice
We analyze the cumulative regret of the Dyadic Search algorithm of Bachoc et al. [2022].
no code implementations • 13 Aug 2022 • François Bachoc, Tommaso Cesari, Roberto Colomboni, Andrea Paudice
This paper studies a natural generalization of the problem of minimizing a univariate convex function $f$ by querying its values sequentially.
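For intuition, a textbook ternary search already solves the exact-value version of this problem; the paper's Dyadic Search additionally copes with imprecise evaluations. A minimal sketch:

```python
def ternary_search(f, lo, hi, budget=60):
    """Minimize a univariate convex f on [lo, hi] by sequential value queries.

    Each step compares f at two interior points and discards a constant
    fraction of the interval that cannot contain the minimizer.
    """
    for _ in range(budget):
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if f(m1) <= f(m2):
            hi = m2  # by convexity, the minimizer cannot lie in (m2, hi]
        else:
            lo = m1
    return (lo + hi) / 2

x_star = ternary_search(lambda x: (x - 0.3) ** 2, 0.0, 1.0)
print(f"approximate minimizer: {x_star:.4f}")
```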
no code implementations • 15 Jun 2022 • Joachim Bona-Pellissier, François Malgouyres, François Bachoc
Is a sample rich enough to determine, at least locally, the parameters of a neural network?
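One way to probe this question numerically (an illustration, not the paper's criterion) is to check the rank of the Jacobian of the network outputs on the sample with respect to the parameters; a maximal-rank Jacobian indicates that the sample determines the parameters locally:

```python
import torch

torch.manual_seed(0)
net = torch.nn.Sequential(torch.nn.Linear(2, 5), torch.nn.Tanh(),
                          torch.nn.Linear(5, 1))
X = torch.randn(50, 2)  # the sample whose "richness" we probe
y = net(X).squeeze(-1)

# build the Jacobian of the outputs with respect to all parameters, row by row
rows = []
for i in range(len(y)):
    grads = torch.autograd.grad(y[i], list(net.parameters()), retain_graph=True)
    rows.append(torch.cat([g.reshape(-1) for g in grads]))
J = torch.stack(rows)  # shape: (n_samples, n_parameters)

n_params = sum(p.numel() for p in net.parameters())
print(f"Jacobian rank {torch.linalg.matrix_rank(J).item()} vs {n_params} parameters")
```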
1 code implementation • 17 May 2022 • Andrés F. López-Lopera, François Bachoc, Olivier Roustant
First, we show that our framework makes it possible to satisfy the constraints everywhere in the input space.
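The key mechanism, sketched below under simplifying assumptions (1D, a nonnegativity constraint, and made-up coefficients instead of the constrained posterior): with piecewise-linear hat basis functions, enforcing the constraint on the finite coefficient vector enforces it at every input point:

```python
import numpy as np

knots = np.linspace(0, 1, 11)

def hat_basis(x, knots):
    """Evaluate all hat functions at points x (piecewise-linear interpolation)."""
    h = knots[1] - knots[0]
    return np.maximum(0, 1 - np.abs(x[:, None] - knots[None, :]) / h)

# in the actual framework the coefficients come from a constrained GP posterior;
# here they are arbitrary nonnegative numbers, for illustration only
c = np.abs(np.random.default_rng(0).normal(size=knots.size))

x = np.linspace(0, 1, 201)
f = hat_basis(x, knots) @ c  # note f(knot_j) = c_j for hat functions
assert f.min() >= 0  # nonnegative coefficients => nonnegative function everywhere
```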
no code implementations • 24 Dec 2021 • Joachim Bona-Pellissier, François Bachoc, François Malgouyres
The possibility for one to recover the parameters (weights and biases) of a neural network thanks to the knowledge of its function on a subset of the input space can be, depending on the situation, a curse or a blessing.
no code implementations • NeurIPS 2021 • François Bachoc, Tommaso R Cesari, Sébastien Gerchinovitz
We study the problem of zeroth-order (black-box) optimization of a Lipschitz function $f$ defined on a compact subset $\mathcal X$ of $\mathbb R^d$, with the additional constraint that algorithms must certify the accuracy of their recommendations.
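A hedged sketch of how such certificates can arise, using a Piyavskii-Shubert-style lower envelope on a grid rather than the paper's algorithm: the Lipschitz assumption turns every query into a global lower bound, and the gap between the best observed value and that bound is a certified accuracy:

```python
import numpy as np

def certified_minimize(f, L, n_queries=30):
    """Certified zeroth-order minimization of an L-Lipschitz f on [0, 1].

    The Lipschitz lower envelope f(x) >= max_i f(x_i) - L|x - x_i| gives a
    global lower bound after every query, so the returned recommendation
    comes with a guaranteed bound on its optimality gap.
    """
    grid = np.linspace(0, 1, 2001)
    xs, ys = [0.5], [f(0.5)]

    def lower_envelope():
        return np.max(np.array(ys)[:, None]
                      - L * np.abs(grid[None, :] - np.array(xs)[:, None]), axis=0)

    for _ in range(n_queries - 1):
        x_next = grid[np.argmin(lower_envelope())]  # query where f could be smallest
        xs.append(x_next)
        ys.append(f(x_next))
    certificate = min(ys) - lower_envelope().min()  # certified optimality gap
    return xs[int(np.argmin(ys))], certificate

x_rec, eps = certified_minimize(lambda x: (x - 0.7) ** 2, L=2.0)
print(f"recommendation {x_rec:.3f}, certified accuracy {eps:.4f}")
```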
no code implementations • 26 Oct 2020 • François Bachoc, Tommaso Cesari, Sébastien Gerchinovitz
We study the problem of approximating the level set of an unknown function by sequentially querying its values.
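A minimal illustration of the underlying bracketing logic under a Lipschitz assumption (not the paper's strategy): each query constrains f everywhere, so parts of the domain are certified inside or outside the sublevel set without further queries:

```python
import numpy as np

f = lambda x: np.sin(3 * x)
L, c = 3.0, 0.5                 # Lipschitz constant and target level
grid = np.linspace(0, 2, 1001)
xs = np.linspace(0, 2, 15)      # queries; here a uniform design, for simplicity
ys = f(xs)

# Lipschitz upper and lower envelopes of f over the whole domain
upper = np.min(ys[:, None] + L * np.abs(grid[None, :] - xs[:, None]), axis=0)
lower = np.max(ys[:, None] - L * np.abs(grid[None, :] - xs[:, None]), axis=0)

inside = upper <= c                 # certainly in the sublevel set {f <= c}
outside = lower > c                 # certainly outside it
undetermined = ~(inside | outside)  # further queries should focus here
print(f"{undetermined.mean():.0%} of the domain still undetermined")
```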
no code implementations • 31 Oct 2019 • Mai Trang Vu, François Bachoc, Edouard Pauwels
We consider the problem of estimating the support of a measure from a finite independent sample.
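For context, the classical Devroye-Wise estimator takes the union of small balls around the sample points; a minimal sketch (the radius and data are arbitrary choices, and the paper studies how such tuning affects the rates):

```python
import numpy as np

rng = np.random.default_rng(0)
sample = rng.uniform(-1, 1, size=(500, 2))            # draws from the unknown measure,
sample = sample[np.linalg.norm(sample, axis=1) <= 1]  # here supported on the unit disk

def in_support_estimate(x, sample, eps=0.15):
    """Devroye-Wise: x is in the estimated support iff some sample point is eps-close."""
    return np.min(np.linalg.norm(sample - x, axis=1)) <= eps

print(in_support_estimate(np.array([0.2, 0.1]), sample))  # True: inside the disk
print(in_support_estimate(np.array([1.5, 1.5]), sample))  # False: outside
```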
no code implementations • 15 Jan 2019 • Andrés F. López-Lopera, François Bachoc, Nicolas Durrande, Jérémy Rohmer, Déborah Idier, Olivier Roustant
Finally, on 2D and 5D coastal flooding applications, we show that more flexible and realistic GP implementations can be obtained by considering noise effects and by enforcing the (linear) inequality constraints.
2 code implementations • 18 Oct 2018 • François Bachoc, Fabrice Gamboa, Max Halford, Jean-Michel Loubes, Laurent Risser
In order to emphasize the impact of each input variable, this formalism uses an information-theoretic framework that quantifies the influence of all input-output observations based on entropic projections.
1 code implementation • 10 Apr 2018 • François Bachoc, Agnès Lagnoux, Andrés F. López-Lopera
We first show that the (unconstrained) maximum likelihood estimator has the same asymptotic distribution unconditionally and conditionally on the event that the Gaussian process satisfies the inequality constraints.
no code implementations • 16 Mar 2018 • François Bachoc, Baptiste Broto, Fabrice Gamboa, Jean-Michel Loubes
In the framework of the supervised learning of a real function defined on a space $X$, the so-called Kriging method relies on a real Gaussian field defined on $X$.
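For concreteness, the Kriging predictor at a new point is the Gaussian conditional mean $k(x)^\top K^{-1} y$; a minimal zero-mean sketch, assuming an arbitrary squared-exponential covariance and made-up hyperparameters:

```python
import numpy as np

def kriging_predict(X, y, x_new, lengthscale=0.3, noise=1e-6):
    """Zero-mean Kriging: the GP posterior mean k(x)^T K^{-1} y."""
    def k(A, B):  # squared-exponential covariance
        return np.exp(-(A[:, None] - B[None, :]) ** 2 / (2 * lengthscale ** 2))
    K = k(X, X) + noise * np.eye(len(X))  # jitter for numerical stability
    return k(x_new, X) @ np.linalg.solve(K, y)

X = np.linspace(0, 1, 8)
y = np.sin(2 * np.pi * X)
print(kriging_predict(X, y, np.array([0.35])))
```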
1 code implementation • 20 Oct 2017 • Andrés F. López-Lopera, François Bachoc, Nicolas Durrande, Olivier Roustant
Introducing inequality constraints in Gaussian process (GP) models can lead to more realistic uncertainty quantification in a great variety of real-world learning problems.
no code implementations • 31 Jan 2017 • François Bachoc, Fabrice Gamboa, Jean-Michel Loubes, Nil Venet
We prove that the Gaussian processes indexed by distributions corresponding to these kernels can be efficiently forecast, opening new perspectives in Gaussian process modeling.
no code implementations • 3 Aug 2016 • Julien Bect, François Bachoc, David Ginsbourger
This observation enables us to establish generic consistency results for a broad class of SUR strategies.
1 code implementation • 19 Jul 2016 • Didier Rullière, Nicolas Durrande, François Bachoc, Clément Chevalier
This work falls within the context of predicting the value of a real function at some input locations given a limited number of observations of this function.
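A hedged sketch of the divide-and-aggregate idea: fit many small GPs on data subsets and combine their predictions, here with naive precision weighting (the `sub_gp` helper and all hyperparameters are invented; the paper's nested aggregation additionally accounts for correlations between submodels, which this sketch ignores):

```python
import numpy as np

def sub_gp(X, y, x_new, ls=0.3, noise=1e-4):
    """Posterior mean and variance of a small zero-mean GP on a data subset."""
    k = lambda A, B: np.exp(-(A[:, None] - B[None, :]) ** 2 / (2 * ls ** 2))
    K = k(X, X) + noise * np.eye(len(X))
    w = np.linalg.solve(K, k(X, x_new))
    mean = w.T @ y
    var = 1.0 - np.sum(w * k(X, x_new), axis=0)
    return mean, np.maximum(var, 1e-12)

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0, 1, 600))
y = np.sin(2 * np.pi * X) + 0.05 * rng.normal(size=600)
x_new = np.array([0.42])

# predict with many small GPs, then aggregate by precision weighting
means, variances = [], []
for Xg, yg in zip(np.array_split(X, 12), np.array_split(y, 12)):
    m, v = sub_gp(Xg, yg, x_new)
    means.append(m.item()); variances.append(v.item())
w = 1.0 / np.asarray(variances)
agg = (w * np.asarray(means)).sum() / w.sum()
print(f"aggregated prediction: {agg:.3f} (truth {np.sin(2 * np.pi * 0.42):.3f})")
```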