Search Results for author: Natalie Maus

Found 5 papers, 3 papers with code

Joint Composite Latent Space Bayesian Optimization

no code implementations • 3 Nov 2023 • Natalie Maus, Zhiyuan Jerry Lin, Maximilian Balandat, Eytan Bakshy

To effectively tackle these challenges, we introduce Joint Composite Latent Space Bayesian Optimization (JoCo), a novel framework that jointly trains neural network encoders and probabilistic models to adaptively compress high-dimensional input and output spaces into manageable latent representations.

Bayesian Optimization
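
The JoCo entry above describes jointly training a neural encoder together with a probabilistic surrogate so that optimization happens in a compressed latent space. A minimal sketch of that general idea (not the authors' implementation; assumes only PyTorch, and all names below are illustrative) is:

# Toy sketch: an MLP encoder compresses high-dimensional inputs to a low-dimensional
# latent space, and a GP surrogate is fit on the latent codes. Encoder weights and
# GP hyperparameters are trained jointly by maximizing the GP marginal likelihood.
import torch

torch.manual_seed(0)
D, d, n = 100, 4, 64                      # input dim, latent dim, number of observations
X = torch.randn(n, D)                     # high-dimensional inputs
y = torch.sin(X[:, :5].sum(dim=1))        # cheap stand-in for an expensive objective

encoder = torch.nn.Sequential(            # compresses x in R^D to z in R^d
    torch.nn.Linear(D, 32), torch.nn.ReLU(), torch.nn.Linear(32, d)
)
log_lengthscale = torch.zeros(1, requires_grad=True)
log_noise = torch.tensor(-2.0, requires_grad=True)

def rbf_kernel(Z1, Z2):
    # RBF kernel evaluated on latent codes
    ls = log_lengthscale.exp()
    return torch.exp(-0.5 * torch.cdist(Z1 / ls, Z2 / ls).pow(2))

def neg_marginal_log_likelihood(X, y):
    # GP negative marginal log likelihood, computed on encoded (latent) inputs
    Z = encoder(X)
    K = rbf_kernel(Z, Z) + log_noise.exp() * torch.eye(len(y))
    L = torch.linalg.cholesky(K)
    alpha = torch.cholesky_solve(y.unsqueeze(1), L)
    return 0.5 * (y.unsqueeze(1) * alpha).sum() + torch.log(torch.diag(L)).sum()

# Joint training: encoder weights and GP hyperparameters share one optimizer.
opt = torch.optim.Adam(list(encoder.parameters()) + [log_lengthscale, log_noise], lr=0.01)
for step in range(200):
    opt.zero_grad()
    loss = neg_marginal_log_likelihood(X, y)
    loss.backward()
    opt.step()
print("final negative MLL:", loss.item())

Updating the encoder and the surrogate's hyperparameters with a single optimizer is what makes the training "joint": the latent representation is shaped to be whatever makes the surrogate fit well.
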

Black Box Adversarial Prompting for Foundation Models

1 code implementation • 8 Feb 2023 • Natalie Maus, Patrick Chao, Eric Wong, Jacob Gardner

Prompting interfaces allow users to quickly adjust the output of generative models in both vision and language.

Text Generation

Discovering Many Diverse Solutions with Bayesian Optimization

1 code implementation • 20 Oct 2022 • Natalie Maus, Kaiwen Wu, David Eriksson, Jacob Gardner

Bayesian optimization (BO) is a popular approach for sample-efficient optimization of black-box objective functions.

Bayesian Optimization
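
As background for the snippet above, a bare-bones Bayesian-optimization loop looks like the following (a hedged toy example, not the paper's method; assumes only NumPy): fit a Gaussian-process surrogate to the points evaluated so far, pick the next query by maximizing an acquisition function over candidates, evaluate the black-box objective there, and repeat.

# Toy 1-D Bayesian optimization with a GP surrogate and a UCB acquisition function.
import numpy as np

rng = np.random.default_rng(0)
f = lambda x: -np.sin(3 * x) - x ** 2 + 0.7 * x   # "black-box" objective on [-2, 2]

def gp_posterior(X, y, Xs, lengthscale=0.5, noise=1e-4):
    # GP posterior mean and variance with an RBF kernel
    k = lambda A, B: np.exp(-0.5 * (A[:, None] - B[None, :]) ** 2 / lengthscale ** 2)
    K = k(X, X) + noise * np.eye(len(X))
    Ks, Kss = k(X, Xs), k(Xs, Xs)
    Kinv = np.linalg.inv(K)
    mu = Ks.T @ Kinv @ y
    var = np.diag(Kss - Ks.T @ Kinv @ Ks).clip(min=1e-12)
    return mu, var

X = np.array([-1.5, 0.0, 1.5])            # initial design
y = f(X)
candidates = np.linspace(-2, 2, 200)

for it in range(10):
    mu, var = gp_posterior(X, y, candidates)
    ucb = mu + 2.0 * np.sqrt(var)         # upper confidence bound acquisition
    x_next = candidates[np.argmax(ucb)]   # query where the surrogate is optimistic
    X, y = np.append(X, x_next), np.append(y, f(x_next))

print("best x found:", X[np.argmax(y)], "f(x):", y.max())

The sample efficiency comes from spending evaluations only where the surrogate suggests the objective could improve, rather than sampling the space blindly.
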

Local Latent Space Bayesian Optimization over Structured Inputs

1 code implementation • 28 Jan 2022 • Natalie Maus, Haydn T. Jones, Juston S. Moore, Matt J. Kusner, John Bradshaw, Jacob R. Gardner

By reformulating the encoder to function as both an encoder for the DAE globally and as a deep kernel for the surrogate model within a trust region, we better align the notion of local optimization in the latent space with local optimization in the input space.

Bayesian Optimization
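
The snippet above describes reusing one network both as the autoencoder's encoder and as the feature map of a deep kernel, with optimization confined to a trust region in the latent space. A rough, hypothetical sketch of those two pieces (assuming PyTorch; not the paper's code, and all dimensions and names are placeholders) is:

# Sketch: the same encoder provides (a) latent codes and (b) features for a "deep"
# RBF kernel; candidate latent points are sampled inside a trust region around the
# incumbent, so surrogate modeling and search both stay local.
import torch

encoder = torch.nn.Sequential(torch.nn.Linear(64, 16), torch.nn.Tanh(), torch.nn.Linear(16, 8))

def deep_rbf_kernel(x1, x2, lengthscale=1.0):
    # RBF kernel evaluated on encoder features rather than raw inputs
    z1, z2 = encoder(x1), encoder(x2)
    return torch.exp(-0.5 * torch.cdist(z1, z2).pow(2) / lengthscale ** 2)

def trust_region_candidates(z_best, radius=0.5, n=128):
    # sample candidate latent points uniformly inside an L-infinity trust region
    return z_best + radius * (2 * torch.rand(n, z_best.shape[-1]) - 1)

x_best = torch.randn(1, 64)               # incumbent input (placeholder data)
z_best = encoder(x_best).detach()
cands = trust_region_candidates(z_best)   # local search stays near z_best
K = deep_rbf_kernel(x_best, x_best)       # surrogate kernel shares the same encoder
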
