Search Results for author: Roman Garnett

Found 33 papers, 15 papers with code

The Behavior and Convergence of Local Bayesian Optimization

1 code implementation • NeurIPS 2023 • Kaiwen Wu, Kyurae Kim, Roman Garnett, Jacob R. Gardner

A recent development in Bayesian optimization is the use of local optimization strategies, which can deliver strong empirical performance on high-dimensional problems compared to traditional global strategies.

Bayesian Optimization

A Visual Active Search Framework for Geospatial Exploration

1 code implementation • 28 Nov 2022 • Anindya Sarkar, Michael Lanier, Scott Alfeld, Jiarui Feng, Roman Garnett, Nathan Jacobs, Yevgeniy Vorobeychik

Many problems can be viewed as forms of geospatial search aided by aerial imagery, with examples ranging from detecting poaching activity to human trafficking.

Domain Adaptation

Local Bayesian optimization via maximizing probability of descent

1 code implementation • 21 Oct 2022 • Quan Nguyen, Kaiwen Wu, Jacob R. Gardner, Roman Garnett

Local optimization presents a promising approach to expensive, high-dimensional black-box optimization by sidestepping the need to globally explore the search space.

Bayesian Optimization • Navigate
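The key quantity in this paper's title can be illustrated directly: if the GP induces a Gaussian belief $\nabla f(x) \sim \mathcal{N}(\mu, \Sigma)$ over the gradient at the current iterate, then for a direction $v$ the probability of descent is $\Phi(-v^\top\mu / \sqrt{v^\top\Sigma v})$. A minimal sketch with hypothetical numbers (the belief below is invented for illustration, not from the paper):

```python
import numpy as np
from scipy.stats import norm

def prob_descent(v, mu, Sigma):
    """P(v^T grad f < 0) when grad f ~ N(mu, Sigma): a Gaussian tail probability."""
    v = v / np.linalg.norm(v)
    return norm.cdf(-(v @ mu) / np.sqrt(v @ Sigma @ v))

# Hypothetical gradient belief at the current iterate: the mean points uphill
# along x1 with low uncertainty, but the x2 component is highly uncertain.
mu = np.array([1.0, 0.2])
Sigma = np.diag([0.1, 4.0])

p_mean = prob_descent(-mu, mu, Sigma)                     # step along -mean gradient
p_axis = prob_descent(np.array([-1.0, 0.0]), mu, Sigma)   # step only along -x1
```

Here `p_axis > p_mean`: ignoring the noisy `x2` component gives a more reliable descent direction than following the posterior mean gradient, which is the intuition behind maximizing the probability of descent over directions.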

A Unified Comparison of User Modeling Techniques for Predicting Data Interaction and Detecting Exploration Bias

1 code implementation • 9 Aug 2022 • Sunwoo Ha, Shayan Monadjemi, Roman Garnett, Alvitta Ottley

Our paper seeks to fill this gap by comparing and ranking eight user modeling algorithms based on their performance on a diverse set of four user study datasets.

Bias Detection • Data Interaction

Nonmyopic Multiclass Active Search with Diminishing Returns for Diverse Discovery

no code implementations • 8 Feb 2022 • Quan Nguyen, Roman Garnett

Active search is a setting in adaptive experimental design where we aim to uncover members of rare, valuable class(es) subject to a budget constraint.

Drug Discovery • Experimental Design

Nonmyopic Multifidelity Active Search

1 code implementation • 11 Jun 2021 • Quan Nguyen, Arghavan Modiri, Roman Garnett

Active search is a learning paradigm where we seek to identify as many members of a rare, valuable class as possible given a labeling budget.

Efficient Nonmyopic Bayesian Optimization via One-Shot Multi-Step Trees

1 code implementation • NeurIPS 2020 • Shali Jiang, Daniel R. Jiang, Maximilian Balandat, Brian Karrer, Jacob R. Gardner, Roman Garnett

In this paper, we provide the first efficient implementation of general multi-step lookahead Bayesian optimization, formulated as a sequence of nested optimization problems within a multi-step scenario tree.

Bayesian Optimization • Decision Making
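The nested structure of multi-step lookahead can be shown on a toy problem. This is not the paper's one-shot reformulation, only the brute-force two-step recursion it accelerates, sketched on a discrete five-point domain where exact Gaussian conditioning stands in for a GP update (kernel and settings below are assumptions):

```python
import numpy as np
from scipy.stats import norm

# Toy discrete domain: a joint Gaussian belief over f at 5 points stands in
# for a GP posterior (assumed squared-exponential kernel, noise-free queries).
X = np.linspace(0.0, 1.0, 5)
K = np.exp(-0.5 * ((X[:, None] - X[None, :]) / 0.2) ** 2) + 1e-9 * np.eye(5)
mu = np.zeros(5)
incumbent = 0.0  # best value observed so far (maximization)

def ei(m, v, best):
    """Expected improvement over `best` for N(m, v) beliefs (vectorized)."""
    s = np.sqrt(np.maximum(v, 1e-12))
    z = (m - best) / s
    return s * (z * norm.cdf(z) + norm.pdf(z))

def condition(mu, K, i, y):
    """Exact Gaussian conditioning on the observation f(X[i]) = y."""
    k = K[:, i]
    return mu + k * (y - mu[i]) / K[i, i], K - np.outer(k, k) / K[i, i]

# Two-step value of querying i: EI now, plus the expected best EI available at
# the next step, integrating over the outcome y_i with Gauss-Hermite quadrature.
z_nodes, w = np.polynomial.hermite_e.hermegauss(11)  # N(0,1) quadrature rule
def two_step(i):
    now = ei(mu, np.diag(K), incumbent)[i]
    future = 0.0
    for z, wk in zip(z_nodes, w):
        y = mu[i] + np.sqrt(K[i, i]) * z
        mu2, K2 = condition(mu, K, i, y)
        future += wk * ei(mu2, np.diag(K2), max(incumbent, y)).max()
    return now + future / np.sqrt(2 * np.pi)  # hermegauss weights sum to sqrt(2*pi)

one_step = ei(mu, np.diag(K), incumbent)
scores = np.array([two_step(i) for i in range(5)])
```

Each inner `ei(...).max()` is itself an optimization over the second query, which is exactly the nesting that explodes combinatorially with horizon and motivates the one-shot multi-step-tree formulation.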

GPIRT: A Gaussian Process Model for Item Response Theory

no code implementations • 17 Jun 2020 • JBrandon Duck-Mayr, Roman Garnett, Jacob M. Montgomery

This allows us to simultaneously relax assumptions about the shape of the IRFs while preserving the ability to estimate latent traits.

Active Learning

Automated Measurement of Quasar Redshift with a Gaussian Process

2 code implementations • 12 Jun 2020 • Leah Fauber, Ming-Feng Ho, Simeon Bird, Christian R. Shelton, Roman Garnett, Ishita Korde

Our technique is an extension of an earlier Gaussian process method for detecting damped Lyman-alpha absorbers (DLAs) in quasar spectra with known redshifts.

Astrophysics of Galaxies • Instrumentation and Methods for Astrophysics

Detecting Multiple DLAs per Spectrum in SDSS DR12 with Gaussian Processes

1 code implementation • 24 Mar 2020 • Ming-Feng Ho, Simeon Bird, Roman Garnett

We present a revised version of our automated technique using Gaussian processes (GPs) to detect Damped Lyman-$\alpha$ absorbers (DLAs) along quasar (QSO) sightlines.

Cosmology and Nongalactic Astrophysics • Astrophysics of Galaxies • Data Analysis, Statistics and Probability

Cost Effective Active Search

1 code implementation • NeurIPS 2019 • Shali Jiang, Roman Garnett, Benjamin Moseley

We study a special paradigm of active learning, called cost effective active search, where the goal is to find a given number of positive points from a large unlabeled pool with minimum labeling cost.

Active Learning

BINOCULARS for Efficient, Nonmyopic Sequential Experimental Design

1 code implementation • ICML 2020 • Shali Jiang, Henry Chai, Javier Gonzalez, Roman Garnett

Finite-horizon sequential experimental design (SED) arises naturally in many contexts, including hyperparameter tuning in machine learning among more traditional settings.

Bayesian Optimization • Experimental Design

Automated Model Selection with Bayesian Quadrature

no code implementations • 26 Feb 2019 • Henry Chai, Jean-Francois Ton, Roman Garnett, Michael A. Osborne

We present a novel technique for tailoring Bayesian quadrature (BQ) to model selection.

Model Selection

Efficient nonmyopic batch active search

no code implementations • NeurIPS 2018 • Shali Jiang, Gustavo Malkomes, Matthew Abbott, Benjamin Moseley, Roman Garnett

A critical target scenario is high-throughput screening for scientific discovery, such as drug or materials discovery.

Drug Discovery

Efficient nonmyopic active search with applications in drug and materials discovery

no code implementations • 21 Nov 2018 • Shali Jiang, Gustavo Malkomes, Benjamin Moseley, Roman Garnett

We also study the batch setting for the first time, where a batch of $b>1$ points can be queried at each iteration.

Drug Discovery

Improving Quadrature for Constrained Integrands

1 code implementation • 13 Feb 2018 • Henry Chai, Roman Garnett

We focus on quadrature with nonnegative functions, a common task in Bayesian inference.

Bayesian Inference

Efficient Nonmyopic Active Search

no code implementations • ICML 2017 • Shali Jiang, Gustavo Malkomes, Geoff Converse, Alyssa Shofner, Benjamin Moseley, Roman Garnett

Active search is an active learning setting with the goal of identifying as many members of a given class as possible under a labeling budget.

Active Learning • Drug Discovery
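The active-search setting recurring in these papers can be sketched concretely. The papers study nonmyopic policies; the sketch below is only the myopic (one-step) baseline they improve on — query the unlabeled point most likely to be positive — run on an invented toy pool with a toy smoothed-vote posterior standing in for a real classifier:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy pool: 200 points in the unit square; the rare, valuable positive class
# (hypothetical "hits") occupies one corner. Labels are hidden until queried.
pool = rng.random((200, 2))
hidden = (pool[:, 0] > 0.7) & (pool[:, 1] > 0.7)

def p_positive(x, X_obs, y_obs):
    """Toy posterior: kernel-smoothed vote over labeled points with a
    Laplace-style pull toward 1/2. Stands in for the k-NN or GP classifier
    a real active-search system would use."""
    if len(X_obs) == 0:
        return 0.5
    w = np.exp(-(np.linalg.norm(X_obs - x, axis=1) / 0.15) ** 2)
    return (w @ y_obs + 0.5) / (w.sum() + 1.0)

budget, obs_idx, obs_y = 30, [], []
for _ in range(budget):
    X_obs, y_obs = pool[obs_idx], np.array(obs_y)
    taken = set(obs_idx)
    # Myopic (one-step Bayes-optimal) rule: query the point most likely positive.
    scores = [p_positive(pool[i], X_obs, y_obs) if i not in taken else -1.0
              for i in range(len(pool))]
    i = int(np.argmax(scores))
    obs_idx.append(i)
    obs_y.append(float(hidden[i]))

found = int(sum(obs_y))  # positives recovered within the budget
```

A nonmyopic policy would instead score each candidate by the expected number of positives found over the *remaining* budget, trading off immediate hits against information that improves later queries.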

Active Search for Sparse Signals with Region Sensing

no code implementations • 2 Dec 2016 • Yifei Ma, Roman Garnett, Jeff Schneider

Autonomous systems can be used to search for sparse signals in a large space; e.g., aerial robots can be deployed to localize threats, detect gas leaks, or respond to distress calls.

Bayesian Optimization • Compressive Sensing +1

Bayesian optimization for automated model selection

no code implementations • NeurIPS 2016 • Gustavo Malkomes, Charles Schaff, Roman Garnett

Despite the success of kernel-based nonparametric methods, kernel selection still requires considerable expertise, and is often described as a “black art.” We present a sophisticated method for automatically searching for an appropriate kernel from an infinite space of potential choices.

Bayesian Optimization • Model Selection

Exact Sampling from Determinantal Point Processes

no code implementations • 22 Sep 2016 • Philipp Hennig, Roman Garnett

Determinantal point processes (DPPs) are an important concept in random matrix theory and combinatorics.

Active Learning • Bayesian Optimization +2

Detecting Damped Lyman-$\alpha$ Absorbers with Gaussian Processes

4 code implementations • 14 May 2016 • Roman Garnett, Shirley Ho, Simeon Bird, Jeff Schneider

We develop an automated technique for detecting damped Lyman-$\alpha$ absorbers (DLAs) along spectroscopic lines of sight to quasi-stellar objects (QSOs or quasars).

Cosmology and Nongalactic Astrophysics • Data Analysis, Statistics and Probability

Bayesian Active Model Selection with an Application to Automated Audiometry

no code implementations • NeurIPS 2015 • Jacob Gardner, Gustavo Malkomes, Roman Garnett, Kilian Q. Weinberger, Dennis Barbour, John P. Cunningham

Using this and a previously published model for healthy responses, the proposed method is shown to be capable of diagnosing the presence or absence of noise-induced hearing loss (NIHL) with drastically fewer samples than existing approaches.

Model Selection

Differentially Private Bayesian Optimization

no code implementations • 16 Jan 2015 • Matt J. Kusner, Jacob R. Gardner, Roman Garnett, Kilian Q. Weinberger

The success of machine learning has led practitioners in diverse real-world settings to learn classifiers for practical problems.

Bayesian Optimization • BIG-bench Machine Learning

Sampling for Inference in Probabilistic Models with Fast Bayesian Quadrature

no code implementations • NeurIPS 2014 • Tom Gunter, Michael A. Osborne, Roman Garnett, Philipp Hennig, Stephen J. Roberts

We propose a novel sampling framework for inference in probabilistic models: an active learning approach that converges more quickly (in wall-clock time) than Markov chain Monte Carlo (MCMC) benchmarks.

Active Learning • Numerical Integration

Propagation Kernels

1 code implementation • 13 Oct 2014 • Marion Neumann, Roman Garnett, Christian Bauckhage, Kristian Kersting

We introduce propagation kernels, a general graph-kernel framework for efficiently measuring the similarity of structured data.

Σ-Optimality for Active Learning on Gaussian Random Fields

no code implementations • NeurIPS 2013 • Yifei Ma, Roman Garnett, Jeff Schneider

For active learning on GRFs, the commonly used V-optimality criterion queries nodes that reduce the L2 (regression) loss.

Active Learning • General Classification
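The two criteria contrasted in this paper admit a compact sketch. Under a Gaussian random field prior with covariance C and observation noise, querying node i applies a rank-one covariance update; V-optimality scores it by the trace (total variance) reduction, while Σ-optimality scores the squared column sum. The graph, regularization, and noise level below are invented for illustration:

```python
import numpy as np

# GRF prior over node values on a 6-node path graph: covariance
# C = (L + delta*I)^{-1}, with L the graph Laplacian (an assumed toy setup).
n, delta, noise = 6, 0.1, 1.0
L = np.zeros((n, n))
for i in range(n - 1):
    L[i, i] += 1.0
    L[i + 1, i + 1] += 1.0
    L[i, i + 1] = L[i + 1, i] = -1.0
C = np.linalg.inv(L + delta * np.eye(n))

# Querying node i shrinks the covariance by the rank-one update
# C[:, i] C[i, :] / (C[i, i] + noise); the two criteria score it differently.
def v_score(i):
    """V-optimality: total variance (trace) reduction."""
    return (C[:, i] ** 2).sum() / (C[i, i] + noise)

def sigma_score(i):
    """Sigma-optimality: squared column sum, i.e. reduction in the variance
    of the *sum* of all node values (survey risk)."""
    return C[:, i].sum() ** 2 / (C[i, i] + noise)

v_pick = max(range(n), key=v_score)
s_pick = max(range(n), key=sigma_score)
```

On this symmetric toy graph both criteria prefer a central, well-connected node; the paper's point is that on realistic graphs Σ-optimality avoids the outlier-chasing behavior V-optimality exhibits for classification.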

Active Learning of Linear Embeddings for Gaussian Processes

no code implementations • 24 Oct 2013 • Roman Garnett, Michael A. Osborne, Philipp Hennig

We propose an active learning method for discovering low-dimensional structure in high-dimensional Gaussian process (GP) tasks.

Active Learning • Bayesian Optimization +2

Bayesian Optimal Active Search and Surveying

no code implementations • 27 Jun 2012 • Roman Garnett, Yamuna Krishnamurthy, Xuehan Xiong, Jeff Schneider, Richard Mann

In the second, active surveying, our goal is to actively query points to ultimately predict the proportion of a given class.

Binary Classification
