Search Results for author: David Eriksson

Found 19 papers, 14 papers with code

Unexpected Improvements to Expected Improvement for Bayesian Optimization

no code implementations • NeurIPS 2023 • Sebastian Ament, Samuel Daulton, David Eriksson, Maximilian Balandat, Eytan Bakshy

Expected Improvement (EI) is arguably the most popular acquisition function in Bayesian optimization and has found countless successful applications, but its performance is often exceeded by that of more recent methods.
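For context, the sketch below is the classic analytic EI under a Gaussian posterior (maximization convention), the textbook baseline the paper revisits; it is not the paper's log-space reformulations.

```python
# Textbook analytic Expected Improvement: a minimal baseline sketch,
# not the paper's method.
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, best_f):
    # mu, sigma: GP posterior mean and std at candidate points;
    # best_f: incumbent (best observed) value.
    z = (mu - best_f) / np.maximum(sigma, 1e-12)
    return sigma * (z * norm.cdf(z) + norm.pdf(z))
```

Far from the incumbent, z is very negative and both terms underflow to zero, which is the kind of numerical pathology that motivates computing EI in log space.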

Bayesian Optimization

Ecologically mapped neuronal identity: Towards standardizing activity across heterogeneous experiments

no code implementations • 28 Apr 2023 • Kevin Luxem, David Eriksson

We review the possibility of adding area-specific environmental enrichment and automated behavioral tasks to identify neurons in specific brain areas.

Hippocampus

Bayesian Optimization over High-Dimensional Combinatorial Spaces via Dictionary-based Embeddings

1 code implementation • 3 Mar 2023 • Aryan Deshwal, Sebastian Ament, Maximilian Balandat, Eytan Bakshy, Janardhan Rao Doppa, David Eriksson

We use Bayesian Optimization (BO) and propose a novel surrogate modeling approach for efficiently handling a large number of binary and categorical parameters.
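As a rough illustration of the idea (the random dictionary and plain Hamming features below are simplifying assumptions, not the authors' exact construction), high-dimensional binary inputs can be embedded via their Hamming distances to a small dictionary of binary atoms, giving a low-dimensional continuous representation for a standard GP surrogate:

```python
# Hypothetical dictionary-based embedding for binary inputs.
import numpy as np

def hamming_embedding(Z, dictionary):
    # Z: (n, d) binary matrix of candidate points;
    # dictionary: (m, d) binary matrix of atoms.
    # Returns an (n, m) matrix of Hamming distances, one feature per atom.
    return (Z[:, None, :] != dictionary[None, :, :]).sum(-1)

rng = np.random.default_rng(0)
d, m = 20, 8
dictionary = rng.integers(0, 2, size=(m, d))   # random binary atoms (assumption)
Z = rng.integers(0, 2, size=(100, d))          # candidate binary points
features = hamming_embedding(Z, dictionary)    # continuous inputs for a GP surrogate
```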

Bayesian Optimization • Vocal Bursts Intensity Prediction

Discovering Many Diverse Solutions with Bayesian Optimization

1 code implementation • 20 Oct 2022 • Natalie Maus, Kaiwen Wu, David Eriksson, Jacob Gardner

Bayesian optimization (BO) is a popular approach for sample-efficient optimization of black-box objective functions.

Bayesian Optimization

Bayesian Optimization over Discrete and Mixed Spaces via Probabilistic Reparameterization

2 code implementations • 18 Oct 2022 • Samuel Daulton, Xingchen Wan, David Eriksson, Maximilian Balandat, Michael A. Osborne, Eytan Bakshy

We prove that under suitable reparameterizations, the BO policy that maximizes the probabilistic objective is the same as the one that maximizes the acquisition function (AF), and therefore probabilistic reparameterization (PR) enjoys the same regret bounds as the original BO policy using the underlying AF.
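A toy instance of this equivalence, with a single binary parameter and a hypothetical acquisition function: relaxing $x \in \{0, 1\}$ to $\theta = P(x = 1)$ makes the expected acquisition value linear in $\theta$, so its maximizer sits at a vertex that recovers the best discrete point.

```python
# Toy probabilistic reparameterization of one binary parameter.
# AF values are made-up stand-ins, not an acquisition function from the paper.
import numpy as np

AF = {0: 0.31, 1: 0.47}                           # hypothetical acquisition values

def expected_af(theta):
    # E_{x ~ Bernoulli(theta)}[AF(x)], available in closed form here.
    return (1 - theta) * AF[0] + theta * AF[1]

thetas = np.linspace(0.0, 1.0, 101)
theta_star = thetas[np.argmax([expected_af(t) for t in thetas])]
x_star = int(theta_star > 0.5)                    # round the continuous maximizer
assert AF[x_star] == max(AF.values())             # recovers the best discrete point
```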

Bayesian Optimization

Sparse Bayesian Optimization

1 code implementation • 3 Mar 2022 • Sulin Liu, Qing Feng, David Eriksson, Benjamin Letham, Eytan Bakshy

Bayesian optimization (BO) is a powerful approach to sample-efficient optimization of black-box objective functions.

Bayesian Optimization • Recommendation Systems

Latency-Aware Neural Architecture Search with Multi-Objective Bayesian Optimization

no code implementations • ICML Workshop AutoML 2021 • David Eriksson, Pierce I-Jen Chuang, Samuel Daulton, Peng Xia, Akshat Shrivastava, Arun Babu, Shicong Zhao, Ahmed Aly, Ganesh Venkatesh, Maximilian Balandat

When tuning the architecture and hyperparameters of large machine learning models for on-device deployment, it is desirable to understand the optimal trade-offs between on-device latency and model accuracy.

Bayesian Optimization • Natural Language Understanding +1

High-Dimensional Bayesian Optimization with Sparse Axis-Aligned Subspaces

2 code implementations • 27 Feb 2021 • David Eriksson, Martin Jankowiak

Bayesian optimization (BO) is a powerful paradigm for efficient optimization of black-box objective functions.

Bayesian Optimization • Vocal Bursts Intensity Prediction

Fast Matrix Square Roots with Applications to Gaussian Processes and Bayesian Optimization

1 code implementation • NeurIPS 2020 • Geoff Pleiss, Martin Jankowiak, David Eriksson, Anil Damle, Jacob R. Gardner

Matrix square roots and their inverses arise frequently in machine learning, e.g., when sampling from high-dimensional Gaussians $\mathcal{N}(\mathbf 0, \mathbf K)$ or whitening a vector $\mathbf b$ against covariance matrix $\mathbf K$.
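Both operations reduce to a matrix square root of $\mathbf K$; the dense $\mathcal{O}(n^3)$ baseline below uses a Cholesky factor, which is exactly the cost the paper's iterative, matrix-vector-product-based methods aim to avoid.

```python
# Dense baseline for the two operations named in the abstract,
# using a Cholesky factor as the matrix square root (demonstration only).
import numpy as np
from scipy.linalg import cholesky, solve_triangular

rng = np.random.default_rng(0)
n = 500
A = rng.standard_normal((n, n))
K = A @ A.T + n * np.eye(n)              # a positive definite covariance

L = cholesky(K, lower=True)              # K = L L^T, a matrix square root
x = L @ rng.standard_normal(n)           # sample x ~ N(0, K)
b = rng.standard_normal(n)
w = solve_triangular(L, b, lower=True)   # whitening solve w = L^{-1} b
```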

Bayesian Optimization • Gaussian Processes

Efficient Rollout Strategies for Bayesian Optimization

1 code implementation • 24 Feb 2020 • Eric Hans Lee, David Eriksson, Bolong Cheng, Michael McCourt, David Bindel

Non-myopic acquisition functions consider the impact of the next $h$ function evaluations and are typically computed through rollout, in which $h$ steps of BO are simulated.
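A minimal Monte Carlo sketch of that simulation loop, under simplifying assumptions that are not the paper's exact estimator: a minimization objective on a unit cube, scikit-learn's GP as the surrogate, and a greedy posterior-mean base policy over random candidates.

```python
# Rollout sketch: fantasize h steps of a simple base policy and average
# the resulting improvement over the incumbent.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def rollout_value(X, y, x0, h=3, n_mc=32, n_cand=256, seed=0):
    """Monte Carlo estimate of the expected h-step improvement of starting at x0."""
    rng = np.random.default_rng(seed)
    dim, best0 = X.shape[1], y.min()
    gains = []
    for _ in range(n_mc):
        Xf, yf = X.copy(), y.copy()
        x = np.atleast_2d(x0)
        for _ in range(h):
            model = GaussianProcessRegressor(normalize_y=True).fit(Xf, yf)
            mu, sd = model.predict(x, return_std=True)
            y_sim = rng.normal(mu[0], sd[0])              # fantasized observation at x
            Xf, yf = np.vstack([Xf, x]), np.append(yf, y_sim)
            cand = rng.uniform(0.0, 1.0, (n_cand, dim))   # unit-cube domain assumed
            # Greedy base policy: next point minimizes the posterior mean
            # (refit on the next iteration incorporates the fantasy).
            x = cand[model.predict(cand).argmin()].reshape(1, -1)
        gains.append(max(best0 - yf.min(), 0.0))
    return float(np.mean(gains))
```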

Bayesian Optimization

Scalable Constrained Bayesian Optimization

no code implementations • 20 Feb 2020 • David Eriksson, Matthias Poloczek

The global optimization of a high-dimensional black-box function under black-box constraints is a pervasive task in machine learning, control, and engineering.

Bayesian Optimization

Scalable Global Optimization via Local Bayesian Optimization

2 code implementations • NeurIPS 2019 • David Eriksson, Michael Pearce, Jacob R. Gardner, Ryan Turner, Matthias Poloczek

This motivates the design of a local probabilistic approach for global optimization of large-scale high-dimensional problems.
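A sketch of the trust-region bookkeeping that this style of local BO typically relies on; the thresholds and constants below are illustrative assumptions, not the published schedule.

```python
# Illustrative trust-region update: expand after a streak of improvements,
# contract after a streak of failures.
def update_trust_region(length, success, counters,
                        tau_succ=3, tau_fail=3, grow=2.0, shrink=0.5, l_max=1.6):
    succ, fail = counters
    succ, fail = (succ + 1, 0) if success else (0, fail + 1)
    if succ >= tau_succ:            # streak of improvements: expand the region
        length, succ = min(grow * length, l_max), 0
    elif fail >= tau_fail:          # streak of failures: contract the region
        length, fail = shrink * length, 0
    return length, (succ, fail)     # caller restarts once length collapses
```

Candidates are then proposed only inside a box of side `length` around the best observed point, which keeps the surrogate accurate locally even in high dimensions.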

Bayesian Optimization

pySOT and POAP: An event-driven asynchronous framework for surrogate optimization

3 code implementations • 30 Jul 2019 • David Eriksson, David Bindel, Christine A. Shoemaker

This paper describes Plumbing for Optimization with Asynchronous Parallelism (POAP) and the Python Surrogate Optimization Toolbox (pySOT).

Bayesian Optimization

Scaling Gaussian Process Regression with Derivatives

1 code implementation • NeurIPS 2018 • David Eriksson, Kun Dong, Eric Hans Lee, David Bindel, Andrew Gordon Wilson

Gaussian processes (GPs) with derivatives are useful in many applications, including Bayesian optimization, implicit surface reconstruction, and terrain reconstruction.
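For reference, such models condition on gradients through the standard joint kernel over function values and derivatives, which is why the training covariance grows to $n(d+1) \times n(d+1)$ and motivates scalable approximations:

```latex
k_{\nabla}(\mathbf x, \mathbf x') =
\begin{pmatrix}
k(\mathbf x, \mathbf x') & \big(\nabla_{\mathbf x'} k(\mathbf x, \mathbf x')\big)^{\top} \\
\nabla_{\mathbf x} k(\mathbf x, \mathbf x') & \nabla_{\mathbf x} \nabla_{\mathbf x'}^{\top} k(\mathbf x, \mathbf x')
\end{pmatrix}
\in \mathbb{R}^{(d+1) \times (d+1)}
```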

Bayesian Optimization • Dimensionality Reduction +3

Scalable Log Determinants for Gaussian Process Kernel Learning

3 code implementations • NeurIPS 2017 • Kun Dong, David Eriksson, Hannes Nickisch, David Bindel, Andrew Gordon Wilson

For applications as varied as Bayesian neural networks, determinantal point processes, elliptical graphical models, and kernel learning for Gaussian processes (GPs), one must compute a log determinant of an $n \times n$ positive definite matrix, and its derivatives, leading to prohibitive $\mathcal{O}(n^3)$ computations.
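A numerical illustration of the identity these estimators build on, $\log \det \mathbf K = \operatorname{tr}(\log \mathbf K)$, using Hutchinson's Rademacher-probe trace estimator; the dense eigendecomposition below stands in for the Chebyshev/Lanczos approximations the paper drives with matrix-vector products alone.

```python
# Stochastic trace estimation of log det K = tr(log K).
import numpy as np

rng = np.random.default_rng(0)
n, m = 300, 64                                    # matrix size, number of probes
A = rng.standard_normal((n, n))
K = A @ A.T + n * np.eye(n)                       # a positive definite test matrix

w, V = np.linalg.eigh(K)
logK = (V * np.log(w)) @ V.T                      # dense log(K): demonstration only
Z = rng.choice([-1.0, 1.0], size=(n, m))          # Rademacher probe vectors
estimate = np.einsum("im,im->m", Z, logK @ Z).mean()  # mean of z^T log(K) z
exact = np.log(w).sum()                           # log det K, for comparison
```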

Gaussian Processes • Point Processes

2HDMC - Two-Higgs-Doublet Model Calculator

1 code implementation • 5 Feb 2009 • David Eriksson, Johan Rathsman, Oscar Stål

This manual describes the public code 2HDMC which can be used to perform calculations in a general, CP-conserving, two-Higgs-doublet model (2HDM).

High Energy Physics - Phenomenology
