no code implementations • 11 Mar 2025 • Yimeng Zeng, Natalie Maus, Haydn Thomas Jones, Jeffrey Tao, Fangping Wan, Marcelo Der Torossian Torres, Cesar de la Fuente-Nunez, Ryan Marcus, Osbert Bastani, Jacob R. Gardner
As this feedback loop continues, we find that the LLM is eventually able to generate solutions to new tasks in just a few shots that are better than the solutions produced from scratch by Bayesian optimization, while simultaneously requiring significantly fewer oracle calls.
no code implementations • 31 Jan 2025 • Natalie Maus, Kyurae Kim, Yimeng Zeng, Haydn Thomas Jones, Fangping Wan, Marcelo Der Torossian Torres, Cesar de la Fuente-Nunez, Jacob R. Gardner
In multi-objective black-box optimization, the goal is typically to find solutions that optimize a set of $T$ black-box objective functions, $f_1, \ldots, f_T$, simultaneously.
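For context on the multi-objective setting described above: solutions are typically compared by Pareto dominance, and the set of non-dominated solutions forms the Pareto front. The following is a minimal illustrative sketch (assuming minimization; not code from this paper):

```python
def dominates(a, b):
    """True if point a Pareto-dominates point b (minimization):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Example: with objectives (f1, f2), (2, 3) is dominated by (2, 2),
# and (4, 4) is dominated by several points, so neither is on the front.
front = pareto_front([(1, 4), (2, 2), (3, 1), (2, 3), (4, 4)])
```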
no code implementations • 6 Jun 2024 • Natalie Maus, Kyurae Kim, Geoff Pleiss, David Eriksson, John P. Cunningham, Jacob R. Gardner
Our approach outperforms standard SVGPs on high-dimensional benchmark tasks in control and molecular design.
1 code implementation • 3 Nov 2023 • Natalie Maus, Zhiyuan Jerry Lin, Maximilian Balandat, Eytan Bakshy
To effectively tackle these challenges, we introduce Joint Composite Latent Space Bayesian Optimization (JoCo), a novel framework that jointly trains neural network encoders and probabilistic models to adaptively compress high-dimensional input and output spaces into manageable latent representations.
no code implementations • 25 May 2023 • Natalie Maus, Yimeng Zeng, Daniel Allen Anderson, Phillip Maffettone, Aaron Solomon, Peyton Greenside, Osbert Bastani, Jacob R. Gardner
Furthermore, it is challenging to adapt pure generative approaches to other settings, e.g., when constraints exist.
1 code implementation • 8 Feb 2023 • Natalie Maus, Patrick Chao, Eric Wong, Jacob Gardner
Prompting interfaces allow users to quickly adjust the output of generative models in both vision and language.
1 code implementation • 20 Oct 2022 • Natalie Maus, Kaiwen Wu, David Eriksson, Jacob Gardner
Bayesian optimization (BO) is a popular approach for sample-efficient optimization of black-box objective functions.
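To illustrate the generic BO loop mentioned above, here is a small from-scratch sketch in pure Python: a Gaussian-process surrogate with an RBF kernel and a UCB acquisition rule over a fixed 1-D candidate grid. This is a textbook illustration under simplifying assumptions (scalar inputs, grid candidates, hand-rolled Cholesky solve), not the method proposed in this paper:

```python
import math
import random

def rbf(a, b, ls=0.3):
    """Squared-exponential kernel on scalar inputs."""
    return math.exp(-((a - b) ** 2) / (2 * ls ** 2))

def cholesky(A):
    """Lower-triangular Cholesky factor of a positive-definite matrix."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = math.sqrt(A[i][i] - s) if i == j else (A[i][j] - s) / L[j][j]
    return L

def chol_solve(L, b):
    """Solve (L L^T) x = b by forward then backward substitution."""
    n = len(L)
    y = [0.0] * n
    for i in range(n):
        y[i] = (b[i] - sum(L[i][k] * y[k] for k in range(i))) / L[i][i]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (y[i] - sum(L[k][i] * x[k] for k in range(i + 1, n))) / L[i][i]
    return x

def gp_posterior(xs, ys, xq, noise=1e-4):
    """GP posterior mean and variance at a query point xq."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j]) + (noise if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    L = cholesky(K)
    alpha = chol_solve(L, ys)          # K^{-1} y
    k = [rbf(x, xq) for x in xs]
    mu = sum(k[i] * alpha[i] for i in range(n))
    v = chol_solve(L, k)               # K^{-1} k
    var = rbf(xq, xq) - sum(k[i] * v[i] for i in range(n))
    return mu, max(var, 0.0)

def bayes_opt(f, candidates, n_init=3, n_iter=10, seed=0):
    """Maximize f over a candidate grid with a GP surrogate and UCB."""
    rng = random.Random(seed)
    xs = rng.sample(candidates, n_init)
    ys = [f(x) for x in xs]

    def ucb(c):
        mu, var = gp_posterior(xs, ys, c)
        return mu + 2.0 * math.sqrt(var)  # exploration bonus of 2 std devs

    for _ in range(n_iter):
        pool = [c for c in candidates if c not in xs]
        x_next = max(pool, key=ucb)       # pick the most promising untried point
        xs.append(x_next)
        ys.append(f(x_next))
    best = max(range(len(xs)), key=lambda i: ys[i])
    return xs[best], ys[best]
```

Each iteration spends exactly one evaluation of the expensive objective `f`, which is the sample-efficiency property the abstract refers to.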
1 code implementation • 28 Jan 2022 • Natalie Maus, Haydn T. Jones, Juston S. Moore, Matt J. Kusner, John Bradshaw, Jacob R. Gardner
By reformulating the encoder to function as both an encoder for the DAE globally and as a deep kernel for the surrogate model within a trust region, we better align the notion of local optimization in the latent space with local optimization in the input space.