1 code implementation • 17 Apr 2023 • Freddie Bickford Smith, Andreas Kirsch, Sebastian Farquhar, Yarin Gal, Adam Foster, Tom Rainforth
Information-theoretic approaches to active learning have traditionally focused on maximising the information gathered about the model parameters, most commonly by optimising the BALD score.
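For reference, the BALD score of a candidate point $x$ is the mutual information between its label $y$ and the model parameters $\theta$ under the current posterior:

$$\mathrm{BALD}(x) = \mathrm{I}(y; \theta \mid x, \mathcal{D}) = \mathrm{H}\!\left[\mathbb{E}_{p(\theta \mid \mathcal{D})}\, p(y \mid x, \theta)\right] - \mathbb{E}_{p(\theta \mid \mathcal{D})}\, \mathrm{H}\!\left[p(y \mid x, \theta)\right].$$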
no code implementations • 28 Feb 2023 • Tom Rainforth, Adam Foster, Desi R. Ivanova, Freddie Bickford Smith
Bayesian experimental design (BED) provides a powerful and general framework for optimizing the design of experiments.
1 code implementation • 27 Feb 2023 • Desi R. Ivanova, Joel Jennings, Tom Rainforth, Cheng Zhang, Adam Foster
We formalize the problem of contextual optimization through the lens of Bayesian experimental design and propose CO-BED -- a general, model-agnostic framework for designing contextual experiments using information-theoretic principles.
1 code implementation • 21 Feb 2023 • Yashas Annadani, Panagiotis Tigas, Desi R. Ivanova, Andrew Jesson, Yarin Gal, Adam Foster, Stefan Bauer
We introduce a gradient-based approach for the problem of Bayesian optimal experimental design to learn causal models in a batch setting -- a critical component for causal discovery from finite data where interventions can be costly or risky.
no code implementations • 12 Jul 2022 • Desi R. Ivanova, Joel Jennings, Cheng Zhang, Adam Foster
In this paper we introduce a model-agnostic framework for gathering data to evaluate and improve contextual decision making through Bayesian Experimental Design.
1 code implementation • 31 May 2022 • Ning Miao, Tom Rainforth, Emile Mathieu, Yann Dubois, Yee Whye Teh, Adam Foster, Hyunjik Kim
We introduce InstaAug, a method for automatically learning input-specific augmentations from data.
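As a rough illustration of what "input-specific" means here, the sketch below (PyTorch-style; the module and its rotation parametrisation are hypothetical, not InstaAug's actual interface) has a small network predict, per image, the half-width of a uniform rotation-angle range, with a differentiable rotation so the range parameters can be trained end-to-end alongside the downstream objective.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class InputSpecificRotation(nn.Module):
    """Hypothetical sketch: predict a per-image rotation range and apply a
    differentiable rotation sampled from it."""

    def __init__(self):
        super().__init__()
        # Map an image to the half-width of its rotation range, in [0, pi].
        self.net = nn.Sequential(
            nn.Conv2d(3, 8, 3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(8, 1), nn.Sigmoid(),
        )

    def forward(self, x):                                  # x: (B, 3, H, W)
        max_angle = torch.pi * self.net(x).squeeze(1)      # (B,) per-input range
        u = 2 * torch.rand_like(max_angle) - 1             # uniform on [-1, 1]
        angle = u * max_angle                              # reparametrised sample
        cos, sin = torch.cos(angle), torch.sin(angle)
        zero = torch.zeros_like(angle)
        rot = torch.stack([torch.stack([cos, -sin, zero], 1),
                           torch.stack([sin,  cos, zero], 1)], 1)  # (B, 2, 3)
        grid = F.affine_grid(rot, x.shape, align_corners=False)
        return F.grid_sample(x, grid, align_corners=False)  # rotated batch
```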
1 code implementation • 4 Feb 2022 • Tomas Geffner, Javier Antorán, Adam Foster, Wenbo Gong, Chao Ma, Emre Kiciman, Amit Sharma, Angus Lamb, Martin Kukla, Nick Pawlowski, Miltiadis Allamanis, Cheng Zhang
Causal inference is essential for data-driven decision making across domains such as business engagement, medical treatment and policy making.
1 code implementation • NeurIPS 2021 • Desi R. Ivanova, Adam Foster, Steven Kleinegesse, Michael U. Gutmann, Tom Rainforth
We introduce implicit Deep Adaptive Design (iDAD), a new method for performing adaptive experiments in real-time with implicit models.
1 code implementation • NeurIPS 2021 • Emile Mathieu, Adam Foster, Yee Whye Teh
Learning representations of stochastic processes is an emerging problem in machine learning with applications from meta-learning to physical object models to time series.
1 code implementation • NeurIPS 2021 • Adam Foster, Árpi Vezér, Craig A Glastonbury, Páidí Creed, Sam Abujudeh, Aaron Sim
Learning meaningful representations of data that can address challenges such as batch effect correction and counterfactual inference is a central problem in many domains including computational biology.
1 code implementation • 3 Mar 2021 • Adam Foster, Desi R. Ivanova, Ilyas Malik, Tom Rainforth
We introduce Deep Adaptive Design (DAD), a method for amortizing the cost of adaptive Bayesian experimental design that allows experiments to be run in real-time.
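In outline, DAD trains a single policy network offline so that, at deployment time, each new design is just a forward pass given the experiment history rather than a fresh optimisation. A minimal sketch of that deployment loop follows; the architecture and `simulator` are placeholders, and training the policy against the paper's EIG lower bound is omitted.

```python
import torch
import torch.nn as nn

class DesignPolicy(nn.Module):
    """Placeholder policy: pool the (design, outcome) history with a
    permutation-invariant encoder and emit the next design."""

    def __init__(self, design_dim=1, hidden=64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(design_dim + 1, hidden), nn.ReLU())
        self.head = nn.Linear(hidden, design_dim)

    def forward(self, designs, outcomes):
        if not designs:                                   # no history: initial design
            return self.head(torch.zeros(1, self.head.in_features))
        h = torch.cat([torch.stack(designs), torch.stack(outcomes)], dim=-1)
        return self.head(self.encoder(h).sum(0, keepdim=True))

def run_experiment(policy, simulator, num_steps=10):
    """Deployment loop: each design is one forward pass, hence real-time."""
    designs, outcomes = [], []
    for _ in range(num_steps):
        with torch.no_grad():
            d = policy(designs, outcomes).squeeze(0)      # (design_dim,)
        y = simulator(d)                                  # run the actual experiment
        designs.append(d)
        outcomes.append(y.reshape(1))                     # scalar outcome as (1,)
    return designs, outcomes
```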
2 code implementations • ICLR 2021 • Adam Foster, Rattana Pukdee, Tom Rainforth
We propose methods to strengthen the invariance properties of representations obtained by contrastive learning.
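As background, the contrastive objectives such methods build on are typically of the InfoNCE form, which pulls together the representations $z_i, z_i'$ of two augmentations of the same input and pushes apart those of other inputs in the batch:

$$\mathcal{L} = -\,\mathbb{E}\!\left[\log \frac{\exp\!\big(\mathrm{sim}(z_i, z_i')/\tau\big)}{\sum_{j=1}^{N} \exp\!\big(\mathrm{sim}(z_i, z_j')/\tau\big)}\right],$$

where $\mathrm{sim}$ is a similarity measure (often cosine) and $\tau$ a temperature.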
1 code implementation • 18 May 2020 • Takashi Goda, Tomohiko Hironaka, Wataru Kitade, Adam Foster
In this paper, applying the idea of randomized multilevel Monte Carlo (MLMC) methods, we introduce an unbiased Monte Carlo estimator for the gradient of the expected information gain with finite expected squared $\ell_2$-norm and finite expected computational cost per sample.
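The quantity being differentiated here is the expected information gain of a design $d$,

$$\mathrm{EIG}(d) = \mathbb{E}_{p(\theta)\, p(y \mid \theta, d)}\!\left[\log \frac{p(y \mid \theta, d)}{p(y \mid d)}\right], \qquad p(y \mid d) = \int p(y \mid \theta, d)\, p(\theta)\, \mathrm{d}\theta,$$

whose nested marginal $p(y \mid d)$ is what makes naive Monte Carlo estimators of it, and of its gradient, biased; the randomized MLMC construction removes this bias while keeping the expected cost per sample finite.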
1 code implementation • 1 Nov 2019 • Adam Foster, Martin Jankowiak, Matthew O'Meara, Yee Whye Teh, Tom Rainforth
We introduce a fully stochastic gradient based approach to Bayesian optimal experimental design (BOED).
1 code implementation • NeurIPS 2019 • Adam Foster, Martin Jankowiak, Eli Bingham, Paul Horsfall, Yee Whye Teh, Tom Rainforth, Noah Goodman
Bayesian optimal experimental design (BOED) is a principled framework for making efficient use of limited experimental resources.
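For concreteness, the classical baseline in this setting is the nested Monte Carlo estimator of the expected information gain, which variational approaches aim to improve on. Below is a minimal NumPy sketch for a toy linear-Gaussian model; the model and sample sizes are illustrative only, not taken from the paper.

```python
import numpy as np

def nested_mc_eig(design, n_outer=500, n_inner=500, seed=0):
    """Nested Monte Carlo estimate of EIG(design) for the toy model
    theta ~ N(0, 1),  y | theta, design ~ N(design * theta, 1)."""
    rng = np.random.default_rng(seed)
    theta = rng.normal(size=n_outer)                    # outer prior samples
    y = design * theta + rng.normal(size=n_outer)       # simulated outcomes
    log_lik = -0.5 * (y - design * theta) ** 2          # log p(y|theta,d) + const

    theta_in = rng.normal(size=(n_inner, 1))            # fresh samples for p(y|d)
    log_lik_in = -0.5 * (y - design * theta_in) ** 2    # (n_inner, n_outer)
    log_marg = np.logaddexp.reduce(log_lik_in, axis=0) - np.log(n_inner)
    # The Gaussian normalising constant cancels between log_lik and log_marg.
    return (log_lik - log_marg).mean()                  # EIG estimate in nats

# A larger |design| makes the outcome more informative about theta;
# the analytic value here is 0.5 * log(1 + design**2).
print(nested_mc_eig(0.1), nested_mc_eig(2.0))
```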
1 code implementation • 9 Jul 2018 • Benjamin Bloem-Reddy, Adam Foster, Emile Mathieu, Yee Whye Teh
Empirical evidence suggests that heavy-tailed degree distributions occurring in many real networks are well-approximated by power laws with exponents $\eta$ that may take values either less than or greater than two.
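Concretely, a power-law degree distribution has the form $p(k) \propto k^{-\eta}$, and the boundary at two matters because the mean degree $\sum_k k\, p(k) \propto \sum_k k^{1-\eta}$ is finite only when $\eta > 2$.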
no code implementations • TACL 2018 • Dell Zhang, Jiahao Yuan, Xiaoling Wang, Adam Foster
In data-to-text Natural Language Generation (NLG) systems, computers need to find the right words to describe phenomena seen in the data.