
no code implementations • 2 Jan 2022 • Raul Astudillo, Peter I. Frazier

However, internal information about objective function computation is often available.

1 code implementation • NeurIPS 2021 • Raul Astudillo, Peter I. Frazier

We consider Bayesian optimization of the output of a network of functions, where each function takes as input the output of its parent nodes, and where the network takes significant time to evaluate.

no code implementations • 6 Dec 2021 • Yunxiang Zhang, Xiangyu Zhang, Peter I. Frazier

Recent advances in computationally efficient non-myopic Bayesian optimization (BO) improve query efficiency over traditional myopic methods like expected improvement while only modestly increasing computational cost.

1 code implementation • NeurIPS 2021 • Raul Astudillo, Daniel R. Jiang, Maximilian Balandat, Eytan Bakshy, Peter I. Frazier

To overcome the shortcomings of existing approaches, we propose the budgeted multi-step expected improvement, a non-myopic acquisition function that generalizes classical expected improvement to the setting of heterogeneous and unknown evaluation costs.

no code implementations • 25 Jul 2021 • Xiangyu Zhang, Peter I. Frazier

Thus, there is substantial value in understanding the performance of index policies and other policies that can be computed efficiently for large $N$.

no code implementations • 14 Nov 2019 • Raul Astudillo, Peter I. Frazier

The outcome of our approach is a menu of designs and evaluated attributes from which the decision-maker (DM) makes a final selection.

no code implementations • 4 Jun 2019 • Raul Astudillo, Peter I. Frazier

We consider optimization of composite objective functions, i.e., of the form $f(x)=g(h(x))$, where $h$ is a black-box, derivative-free, expensive-to-evaluate function with vector-valued outputs, and $g$ is a cheap-to-evaluate real-valued function.
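The composite structure can be exploited by computing expected improvement under the posterior on $h$ rather than on $f$ directly. The following is a minimal Monte Carlo sketch of that idea, not the paper's algorithm: the functions, the posterior summary, and all numbers are made-up examples, and the outputs of $h$ are assumed independent for simplicity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: h maps R -> R^2 and is expensive; g is cheap.
g = lambda hv: -np.sum(hv ** 2)

def ei_composite(mu_h, sd_h, best_f, n_mc=4000):
    """Monte Carlo estimate of E[max(g(h(x)) - best_f, 0)] under a GP
    posterior on h(x), summarized here by a mean vector and independent
    per-output standard deviations -- the composite analogue of EI."""
    samples = mu_h + sd_h * rng.standard_normal((n_mc, len(mu_h)))
    vals = np.apply_along_axis(g, 1, samples)
    return np.maximum(vals - best_f, 0.0).mean()

# Posterior summary at one candidate point (numbers are made up).
print(ei_composite(np.array([0.1, 0.2]), np.array([0.3, 0.3]), best_f=-0.2))
```

Because $g$ is cheap, drawing thousands of posterior samples of $h(x)$ and pushing them through $g$ costs far less than a single evaluation of $h$.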

no code implementations • 12 Mar 2019 • Jian Wu, Saul Toscano-Palmerin, Peter I. Frazier, Andrew Gordon Wilson

Nonetheless, for hyperparameter tuning in deep neural networks, the time required to evaluate the validation error for even a few hyperparameter settings remains a bottleneck.

6 code implementations • 8 Jul 2018 • Peter I. Frazier

It builds a surrogate for the objective and quantifies the uncertainty in that surrogate using a Bayesian machine learning technique, Gaussian process regression, and then uses an acquisition function defined from this surrogate to decide where to sample.
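The loop described here (fit a GP surrogate, maximize an acquisition function, evaluate, repeat) can be sketched in a few lines. This is an illustrative implementation with a toy 1-D objective, a fixed RBF length scale, and expected improvement as the acquisition function; none of it is code from the tutorial itself.

```python
import math

import numpy as np

def gp_posterior(X, y, Xs, ls=0.2, noise=1e-6):
    """Gaussian process regression posterior (RBF kernel, zero prior mean)."""
    k = lambda A, B: np.exp(-0.5 * ((A[:, None] - B[None, :]) / ls) ** 2)
    K = k(X, X) + noise * np.eye(len(X))       # jitter keeps K invertible
    Ks = k(X, Xs)
    sol = np.linalg.solve(K, Ks)
    mu = sol.T @ y
    var = 1.0 - np.sum(Ks * sol, axis=0)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best):
    """Closed-form EI for maximization."""
    z = (mu - best) / sigma
    Phi = 0.5 * (1.0 + np.vectorize(math.erf)(z / math.sqrt(2)))
    phi = np.exp(-0.5 * z ** 2) / math.sqrt(2 * math.pi)
    return (mu - best) * Phi + sigma * phi

f = lambda x: -np.sin(3 * x) - x ** 2 + 0.7 * x   # toy objective (ours)
X = np.array([0.0, 0.5, 1.0]); y = f(X)
grid = np.linspace(-1.0, 2.0, 400)
for _ in range(10):                               # BO loop: fit, acquire, evaluate
    mu, sigma = gp_posterior(X, y, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sigma, y.max()))]
    X, y = np.append(X, x_next), np.append(y, f(x_next))
print("best x:", X[np.argmax(y)], "best f:", y.max())
```

The acquisition step is what distinguishes BO from naive surrogate optimization: EI trades off the posterior mean (exploitation) against the posterior uncertainty (exploration) when choosing where to evaluate next.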

1 code implementation • 23 Mar 2018 • Saul Toscano-Palmerin, Peter I. Frazier

We propose a Bayesian optimization algorithm for objective functions that are sums or integrals of expensive-to-evaluate functions, allowing noisy evaluations.

no code implementations • ICLR 2018 • Jian Wu, Peter I. Frazier

While Bayesian optimization (BO) has achieved great success in optimizing expensive-to-evaluate black-box functions, especially in tuning hyperparameters of neural networks, methods such as random search (Li et al., 2016) and multi-fidelity BO (e.g., Klein et al. (2017)) that exploit cheap approximations, e.g., training on a smaller training set or for fewer iterations, can outperform standard BO approaches that use only full-fidelity observations.

no code implementations • 20 Jul 2017 • Jian Wu, Peter I. Frazier

This paper studies Bayesian ranking and selection (R&S) problems with correlated prior beliefs and continuous domains, i.e., Bayesian optimization (BO).

no code implementations • ICML 2017 • Bangrui Chen, Peter I. Frazier

We consider online content recommendation with implicit feedback through pairwise comparisons, formalized as the so-called dueling bandit problem.

1 code implementation • NeurIPS 2017 • Jian Wu, Matthias Poloczek, Andrew Gordon Wilson, Peter I. Frazier

Bayesian optimization has been successful at global optimization of expensive-to-evaluate multimodal objective functions.

no code implementations • 24 Feb 2017 • Stephen N. Pallone, Peter I. Frazier, Shane G. Henderson

Under certain noise assumptions, we show that the Bayes-optimal policy for maximally reducing entropy of the posterior distribution of this linear classifier is a greedy policy, and that this policy achieves a linear lower bound when alternatives can be constructed from the continuum.

no code implementations • 12 Dec 2016 • Peter I. Frazier, Shane G. Henderson, Rolf Waeber

The probabilistic bisection algorithm (PBA) solves a class of stochastic root-finding problems in one dimension by successively updating a prior belief on the location of the root based on noisy responses to queries at chosen points.
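A minimal sketch of the update just described, assuming a known constant probability p that each response is correct: maintain a discretized density on the root's location, query at the posterior median, and reweight the two sides by p and 1-p. The discretization and all numbers below are our own illustration, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(1)

def pba(respond, lo=0.0, hi=1.0, p=0.7, n_queries=200, grid_n=10000):
    """Probabilistic bisection: posterior density over the root location,
    updated by Bayes' rule after each noisy sign query at the median."""
    grid = np.linspace(lo, hi, grid_n)
    dens = np.full(grid_n, 1.0 / grid_n)          # uniform prior
    for _ in range(n_queries):
        cdf = np.cumsum(dens)
        x = grid[np.searchsorted(cdf, 0.5)]       # posterior median
        z = respond(x)                            # noisy sign: +1 means "root is right of x"
        right = grid > x
        w = np.where(right == (z > 0), p, 1 - p)  # boost the indicated side
        dens = dens * w
        dens /= dens.sum()
    cdf = np.cumsum(dens)
    return grid[np.searchsorted(cdf, 0.5)]        # final posterior median

root = 0.3721                                     # hidden root (made up)
noisy = lambda x: (1 if root > x else -1) * (1 if rng.random() < 0.7 else -1)
print(pba(noisy))
```

Even though 30% of the responses are wrong, the posterior concentrates geometrically around the true root, which is the key property the PBA literature analyzes.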

no code implementations • 11 Aug 2016 • Matthias Poloczek, Jialei Wang, Peter I. Frazier

We develop a framework for warm-starting Bayesian optimization that reduces the solution time required to solve an optimization problem that is one of a sequence of related problems.

no code implementations • 11 Jul 2016 • J. Massey Cashore, Lemuel Kumarga, Peter I. Frazier

Bayesian optimization methods allocate limited sampling budgets to maximize expensive-to-evaluate functions.

2 code implementations • NeurIPS 2016 • Jian Wu, Peter I. Frazier

In many applications of black-box optimization, one can evaluate multiple points simultaneously, e.g., when evaluating the performance of several different neural network architectures in a parallel computing environment.

no code implementations • 30 May 2016 • Bangrui Chen, Peter I. Frazier

We present a Bayesian sequential decision-making formulation of the information filtering problem, in which an algorithm presents items (news articles, scientific papers, tweets) arriving in a stream, and learns relevance from user feedback on presented items.

no code implementations • 28 May 2016 • Bangrui Chen, Peter I. Frazier

We study dueling bandits with weak utility-based regret when preferences over arms have a total order and carry observable feature vectors.

no code implementations • NeurIPS 2017 • Matthias Poloczek, Jialei Wang, Peter I. Frazier

We consider Bayesian optimization of an expensive-to-evaluate black-box objective function, where we also have access to cheaper approximations of the objective.

no code implementations • 16 Feb 2016 • Jialei Wang, Scott C. Clark, Eric Liu, Peter I. Frazier

We also show that the resulting one-step Bayes optimal algorithm for parallel global optimization finds high-quality solutions with fewer evaluations than a heuristic based on approximately maximizing the q-EI.
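The q-EI quantity mentioned here, the expected improvement from evaluating a batch of q points jointly, has no convenient closed form for general q but is easy to estimate by Monte Carlo given the joint GP posterior at the batch. A sketch follows; the posterior numbers are made up, and this is an estimator of the quantity, not the paper's proposed algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)

def q_ei(mu, cov, best, n_mc=20000):
    """Monte Carlo estimate of q-EI = E[max(max_i Y_i - best, 0)] for
    Y ~ N(mu, cov), the joint GP posterior at a batch of q candidate
    points (maximization convention)."""
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(len(mu)))
    Y = mu + rng.standard_normal((n_mc, len(mu))) @ L.T   # correlated draws
    return np.maximum(Y.max(axis=1) - best, 0.0).mean()

# Illustrative joint posterior over a batch of q = 2 points (made up).
mu = np.array([0.0, 0.1])
cov = np.array([[0.04, 0.02], [0.02, 0.04]])
print(q_ei(mu, cov, best=0.0))
```

The off-diagonal covariance matters: positively correlated candidates give less joint improvement than independent ones, which is why batch selection must account for the full posterior covariance rather than scoring points one at a time.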

no code implementations • 7 Feb 2016 • Saul Toscano-Palmerin, Peter I. Frazier

We consider derivative-free black-box global optimization of expensive noisy functions, when most of the randomness in the objective is produced by a few influential scalar random inputs.

no code implementations • 31 Dec 2015 • Weici Hu, Peter I. Frazier

We consider effort allocation in crowdsourcing, where we wish to assign labeling tasks to imperfect homogeneous crowd workers to maximize overall accuracy in a continuous-time Bayesian setting, subject to budget and time constraints.

1 code implementation • 3 Jun 2015 • Peter I. Frazier, Jialei Wang

We introduce Bayesian optimization, a technique developed for optimizing time-consuming engineering simulations and for fitting machine learning models on large datasets.

no code implementations • 25 May 2015 • J. Massey Cashore, Xiaoting Zhao, Alexander A. Alemi, Yujia Liu, Peter I. Frazier

Much of the data being created on the web contains interactions between users and items.

no code implementations • 29 Oct 2014 • Xiaoting Zhao, Peter I. Frazier

We focus on the cold-start setting for this problem, in which we have limited historical data on the user's preferences, and must rely on feedback on forwarded articles to learn the fraction of items relevant to the user in each of several item categories.

no code implementations • 16 Jul 2014 • Weidong Han, Purnima Rajan, Peter I. Frazier, Bruno M. Jedynak

We consider the problem of group testing with sum observations and noiseless answers, in which we aim to locate multiple objects by querying the number of objects in each of a sequence of chosen sets.
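In the noiseless case described here, sum queries admit a simple divide-and-conquer strategy: query the count in half of the interval and recurse on each half that is known to contain objects. The sketch below, with hypothetical object positions, illustrates that baseline; the paper studies the Bayesian query-design problem, including the noisy setting, which this does not capture.

```python
def locate(query, lo, hi, count):
    """Locate `count` objects in [lo, hi) using noiseless sum queries:
    split the interval, query the left half, and recurse on both halves."""
    if count == 0:
        return []
    if hi - lo == 1:
        return [lo] * count          # several objects may share a position
    mid = (lo + hi) // 2
    left = query(lo, mid)            # number of objects in [lo, mid)
    return locate(query, lo, mid, left) + locate(query, mid, hi, count - left)

objects = [3, 7, 7, 42]              # hidden positions (a made-up multiset)
query = lambda lo, hi: sum(lo <= o < hi for o in objects)
print(locate(query, 0, 64, len(objects)))   # -> [3, 7, 7, 42]
```

Each query that returns 0 prunes an entire subinterval, so far fewer queries are needed than one-position-at-a-time probing when objects are sparse.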

no code implementations • 10 Jul 2014 • Ilya O. Ryzhov, Peter I. Frazier, Warren B. Powell

Approximate dynamic programming (ADP) has proven itself in a wide range of applications spanning large-scale transportation problems, health care, revenue management, and energy systems.
