Search Results for author: Jacob Gardner

Found 8 papers, 6 papers with code

Generative Adversarial Bayesian Optimization for Surrogate Objectives

1 code implementation • 9 Feb 2024 • Michael S. Yao, Yimeng Zeng, Hamsa Bastani, Jacob Gardner, James C. Gee, Osbert Bastani

To address this limitation, we propose generative adversarial Bayesian optimization (GABO) using adaptive source critic regularization, a task-agnostic framework for Bayesian optimization that employs a Lipschitz-bounded source critic model to constrain the optimization trajectory to regions where the surrogate function is reliable.

Bayesian Optimization
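
A minimal sketch of the idea described above, assuming a spectral-normalized MLP as the Lipschitz-bounded critic and a simple penalty term; the names used here (make_critic, regularized_objective, lam, and so on) are illustrative and not taken from the paper's code.

```python
# Hypothetical sketch of critic-regularized surrogate optimization (not the
# authors' implementation). A spectral-normalized MLP critic stands in for a
# Lipschitz-bounded discriminator between offline designs and candidates;
# candidates are then scored by surrogate value minus a critic-based penalty.
import torch
import torch.nn as nn

def make_critic(dim):
    # Spectral norm bounds each layer's Lipschitz constant (an assumed
    # stand-in for the Lipschitz constraint described in the abstract).
    return nn.Sequential(
        nn.utils.spectral_norm(nn.Linear(dim, 64)), nn.ReLU(),
        nn.utils.spectral_norm(nn.Linear(64, 1)),
    )

def train_critic(critic, offline_x, candidate_x, steps=200, lr=1e-3):
    opt = torch.optim.Adam(critic.parameters(), lr=lr)
    for _ in range(steps):
        # Wasserstein-style objective: high scores on offline data,
        # low scores on out-of-distribution candidates.
        loss = critic(candidate_x).mean() - critic(offline_x).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
    return critic

def regularized_objective(surrogate, critic, x, ref_score, lam=1.0):
    # Penalize candidates that score below typical in-distribution values.
    penalty = (ref_score - critic(x).squeeze(-1)).clamp(min=0)
    return surrogate(x) - lam * penalty

dim = 8
offline_x = torch.randn(256, dim)            # designs from the offline dataset
candidates = torch.randn(512, dim) * 2.0     # deliberately wider spread
surrogate = lambda x: -(x ** 2).sum(dim=-1)  # toy surrogate objective
critic = train_critic(make_critic(dim), offline_x, candidates)
ref = critic(offline_x).mean().detach()
scores = regularized_objective(surrogate, critic, candidates, ref)
best_candidate = candidates[scores.argmax()]
```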

Learning Performance-Improving Code Edits

2 code implementations • 15 Feb 2023 • Alexander Shypula, Aman Madaan, Yimeng Zeng, Uri Alon, Jacob Gardner, Milad Hashemi, Graham Neubig, Parthasarathy Ranganathan, Osbert Bastani, Amir Yazdanbakhsh

Next, we propose a broad range of adaptation strategies for code optimization; for prompting, these include retrieval-based few-shot prompting and chain-of-thought, and for finetuning, these include performance-conditioned generation and synthetic data augmentation based on self-play.

Code Generation • Code Repair • +2
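
As a toy illustration of the retrieval-based few-shot prompting strategy mentioned above: the token-overlap retriever and the prompt template below are assumptions made for the sketch, not the paper's actual setup.

```python
# Toy sketch of retrieval-based few-shot prompting for code optimization.
# Similar (slow, optimized) pairs are retrieved by a simple Jaccard token
# overlap and prepended to the query program as in-context examples.

def jaccard(a: str, b: str) -> float:
    ta, tb = set(a.split()), set(b.split())
    return len(ta & tb) / max(1, len(ta | tb))

def build_prompt(query_code: str, corpus: list[tuple[str, str]], k: int = 2) -> str:
    # corpus holds (slow_program, optimized_program) pairs.
    shots = sorted(corpus, key=lambda pair: jaccard(query_code, pair[0]), reverse=True)[:k]
    parts = []
    for slow, fast in shots:
        parts.append(f"# slower version:\n{slow}\n# optimized version:\n{fast}\n")
    parts.append(f"# slower version:\n{query_code}\n# optimized version:\n")
    return "\n".join(parts)

corpus = [
    ("total = 0\nfor x in xs:\n    total += x", "total = sum(xs)"),
    ("out = []\nfor x in xs:\n    out.append(x * x)", "out = [x * x for x in xs]"),
]
print(build_prompt("s = 0\nfor x in nums:\n    s += x", corpus))
```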

Learning to Select Pivotal Samples for Meta Re-weighting

1 code implementation • 9 Feb 2023 • Yinjun Wu, Adam Stein, Jacob Gardner, Mayur Naik

In this paper, we study how to learn to identify such a meta sample set from a large, imperfect training set, which is subsequently cleaned and used to optimize performance in the meta re-weighting setting.

Clustering • Computational Efficiency
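
One hedged reading of the selection step, sketched below with k-means over per-example features and keeping the sample nearest each centroid as a pivotal example; the feature representation and the use of k-means are assumptions for illustration, not the paper's exact procedure.

```python
# Hypothetical sketch: pick a small "pivotal" meta set by clustering
# per-example features and keeping the sample closest to each centroid.
import numpy as np
from sklearn.cluster import KMeans

def select_pivotal_samples(features: np.ndarray, n_pivotal: int) -> np.ndarray:
    km = KMeans(n_clusters=n_pivotal, n_init=10, random_state=0).fit(features)
    chosen = []
    for c in range(n_pivotal):
        members = np.where(km.labels_ == c)[0]
        dists = np.linalg.norm(features[members] - km.cluster_centers_[c], axis=1)
        chosen.append(members[dists.argmin()])  # representative of this cluster
    return np.array(chosen)

rng = np.random.default_rng(0)
feats = rng.normal(size=(1000, 16))   # e.g. per-example loss/gradient statistics
meta_idx = select_pivotal_samples(feats, n_pivotal=10)
print(meta_idx)
```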

Black Box Adversarial Prompting for Foundation Models

1 code implementation • 8 Feb 2023 • Natalie Maus, Patrick Chao, Eric Wong, Jacob Gardner

Prompting interfaces allow users to quickly adjust the output of generative models in both vision and language.

Text Generation
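
A toy sketch of black-box prompt search related to the entry above; the paper's optimizer is more sophisticated, whereas this example uses a simple random coordinate search against a stand-in scoring function, with a made-up vocabulary.

```python
# Toy black-box prompt search: mutate one token at a time and keep changes
# that do not decrease the black-box score. The score function is a stand-in
# for querying a real generative model and scoring its output.
import random

VOCAB = ["glass", "neon", "storm", "quiet", "vivid", "ancient", "copper", "fog"]

def black_box_score(prompt_tokens):
    # Stand-in objective: reward a pattern unknown to the searcher.
    target = {"neon", "storm"}
    return len(target & set(prompt_tokens))

def search_prompt(length=4, iters=200, seed=0):
    rng = random.Random(seed)
    prompt = [rng.choice(VOCAB) for _ in range(length)]
    best = black_box_score(prompt)
    for _ in range(iters):
        candidate = prompt.copy()
        candidate[rng.randrange(length)] = rng.choice(VOCAB)  # mutate one position
        score = black_box_score(candidate)
        if score >= best:                                     # accept non-worsening moves
            prompt, best = candidate, score
    return prompt, best

print(search_prompt())
```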

Discovering Many Diverse Solutions with Bayesian Optimization

1 code implementation • 20 Oct 2022 • Natalie Maus, Kaiwen Wu, David Eriksson, Jacob Gardner

Bayesian optimization (BO) is a popular approach for sample-efficient optimization of black-box objective functions.

Bayesian Optimization
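
For context on the BO loop referenced above, here is a minimal sketch using scikit-learn's GP regressor and an expected-improvement acquisition; the paper's mechanism for discovering many diverse solutions is not shown.

```python
# Minimal Bayesian optimization loop: fit a GP surrogate, maximize an
# expected-improvement acquisition over random candidates, evaluate, repeat.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):
    return -np.sum((x - 0.3) ** 2, axis=-1)   # toy black-box function

def expected_improvement(mu, sigma, best):
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - best) / sigma
    return (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(5, 2))            # initial design
y = objective(X)
for _ in range(20):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
    cand = rng.uniform(0, 1, size=(512, 2))
    mu, sigma = gp.predict(cand, return_std=True)
    x_next = cand[expected_improvement(mu, sigma, y.max()).argmax()]
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next))
print("best value:", y.max())
```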

Neural Likelihoods for Multi-Output Gaussian Processes

no code implementations • 31 May 2019 • Martin Jankowiak, Jacob Gardner

We construct flexible likelihoods for multi-output Gaussian process models that leverage neural networks as components.

Gaussian Processes • Variational Inference
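
A hedged sketch of the general idea above: latent function values passed through a small neural network that parameterizes a per-output observation distribution. The MLP shape and Gaussian output are assumptions, and the GP prior over the latent functions is omitted.

```python
# Sketch of a neural likelihood for multiple outputs: a small MLP maps latent
# function values f(x) to the mean and scale of each output's observation
# distribution; the log-likelihood is evaluated with torch.distributions.
import torch
import torch.nn as nn

class NeuralLikelihood(nn.Module):
    def __init__(self, latent_dim: int, num_outputs: int, hidden: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, 2 * num_outputs),   # mean and raw scale per output
        )
        self.num_outputs = num_outputs

    def forward(self, f_latent: torch.Tensor) -> torch.distributions.Normal:
        params = self.net(f_latent)
        mean, raw_scale = params.split(self.num_outputs, dim=-1)
        scale = nn.functional.softplus(raw_scale) + 1e-4
        return torch.distributions.Normal(mean, scale)

lik = NeuralLikelihood(latent_dim=3, num_outputs=4)
f = torch.randn(10, 3)                 # latent GP function values at 10 inputs
y = torch.randn(10, 4)                 # observed multi-output targets
log_prob = lik(f).log_prob(y).sum()    # would be combined with a GP ELBO
print(log_prob.item())
```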

Deep Feature Interpolation for Image Content Changes

2 code implementations • CVPR 2017 • Paul Upchurch, Jacob Gardner, Geoff Pleiss, Robert Pless, Noah Snavely, Kavita Bala, Kilian Weinberger

We propose Deep Feature Interpolation (DFI), a new data-driven baseline for automatic high-resolution image transformation.
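
A minimal sketch of the feature-space step behind the method above: estimate an attribute direction as the difference between mean deep features of images with and without the attribute, then shift the source image's features along it. Extracting the deep features and inverting the shifted features back to pixels are assumed and omitted.

```python
# Deep-feature-space step: compute an attribute direction from mean deep
# features and move the source image's features along it. The features here
# are random stand-ins; a real pipeline would use CNN activations and a
# feature-inversion step to produce the edited image.
import numpy as np

def attribute_direction(feats_with: np.ndarray, feats_without: np.ndarray) -> np.ndarray:
    # feats_*: (num_images, feature_dim) deep features.
    w = feats_with.mean(axis=0) - feats_without.mean(axis=0)
    return w / np.linalg.norm(w)

def interpolate(source_feat: np.ndarray, direction: np.ndarray, alpha: float = 4.0) -> np.ndarray:
    return source_feat + alpha * direction   # alpha controls edit strength

rng = np.random.default_rng(0)
feats_with = rng.normal(size=(50, 4096)) + 0.5     # stand-in deep features
feats_without = rng.normal(size=(50, 4096))
w = attribute_direction(feats_with, feats_without)
edited_features = interpolate(feats_without[0], w)
```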

Bayesian Active Model Selection with an Application to Automated Audiometry

no code implementations • NeurIPS 2015 • Jacob Gardner, Gustavo Malkomes, Roman Garnett, Kilian Q. Weinberger, Dennis Barbour, John P. Cunningham

Using this and a previously published model for healthy responses, the proposed method is shown to be capable of diagnosing the presence or absence of NIHL with drastically fewer samples than existing approaches.

Model Selection
