2 code implementations • 28 Jun 2024 • Samuel Stanton, Robert Alberstein, Nathan Frey, Andrew Watkins, Kyunghyun Cho
There is a growing body of work seeking to replicate, in applications involving biophysical data, the success of machine learning (ML) in domains like computer vision (CV) and natural language processing (NLP).
1 code implementation • 10 May 2024 • Drew Prinster, Samuel Stanton, Anqi Liu, Suchi Saria
As artificial intelligence (AI) and machine learning (ML) systems gain widespread adoption, practitioners increasingly seek ways to quantify and control the risk these systems incur.
1 code implementation • NeurIPS 2023 • Nate Gruver, Samuel Stanton, Nathan C. Frey, Tim G. J. Rudner, Isidro Hotzel, Julien Lafrance-Vanasse, Arvind Rajpal, Kyunghyun Cho, Andrew Gordon Wilson
A popular approach to protein design is to combine a generative model with a discriminative model for conditional sampling.
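As a rough illustration of this recipe (and not the guided sampling method developed in the paper), the simplest way to combine the two models is a best-of-N filter: draw candidates from the generative model and keep the ones the discriminative model scores highest. The `generator` and `discriminator` callables below are hypothetical stand-ins.

```python
import numpy as np

def best_of_n(generator, discriminator, n_candidates=1000, top_k=10):
    """Toy conditional sampling by filtering: sample candidate sequences
    from a generative model, then rank them with a discriminative model's
    predicted property values and keep the top scorers."""
    candidates = generator(n_candidates)             # hypothetical: n sequences
    scores = np.asarray(discriminator(candidates))   # hypothetical: n scores
    keep = np.argsort(scores)[-top_k:][::-1]         # highest-scoring first
    return [candidates[i] for i in keep], scores[keep]
```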
1 code implementation • NeurIPS 2023 • Ryan-Rhys Griffiths, Leo Klarner, Henry B. Moss, Aditya Ravuri, Sang Truong, Samuel Stanton, Gary Tom, Bojana Rankovic, Yuanqi Du, Arian Jamasb, Aryan Deshwal, Julius Schwartz, Austin Tripp, Gregory Kell, Simon Frieder, Anthony Bourached, Alex Chan, Jacob Moss, Chengzhi Guo, Johannes Durholt, Saudamini Chaurasia, Felix Strieth-Kalthoff, Alpha A. Lee, Bingqing Cheng, Alán Aspuru-Guzik, Philippe Schwaller, Jian Tang
By defining kernels over such structured molecular inputs (graphs, strings, and bit-vector fingerprints) in GAUCHE, we seek to open the door to powerful tools for uncertainty quantification and Bayesian optimisation in chemistry.
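For intuition, here is a minimal NumPy sketch of one kernel of this kind, the Tanimoto (Jaccard) kernel over binary molecular fingerprints; it illustrates the math rather than GAUCHE's actual API.

```python
import numpy as np

def tanimoto_kernel(X, Y):
    """Tanimoto kernel k(x, y) = <x, y> / (|x| + |y| - <x, y>) for
    0/1 fingerprint vectors; X is (n, d), Y is (m, d)."""
    inner = X @ Y.T
    norm_x = X.sum(axis=1, keepdims=True)
    norm_y = Y.sum(axis=1, keepdims=True).T
    return inner / (norm_x + norm_y - inner)

rng = np.random.default_rng(0)
fps = rng.integers(0, 2, size=(5, 64))   # five toy 64-bit fingerprints
K = tanimoto_kernel(fps, fps)            # PSD Gram matrix with unit diagonal
```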
1 code implementation • 22 Oct 2022 • Samuel Stanton, Wesley Maddox, Andrew Gordon Wilson
Bayesian optimization is a coherent, ubiquitous approach to decision-making under uncertainty, with applications including multi-armed bandits, active learning, and black-box optimization.
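A minimal sketch of the basic loop, with a toy RBF-kernel GP surrogate and expected improvement over a random candidate pool; the objective, lengthscale, and pool size are all illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

def rbf(A, B, ls=0.3):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls**2)

def gp_posterior(Xq, X, y, noise=1e-4):
    """Exact GP posterior mean and standard deviation at query points Xq."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(Xq, X)
    mu = Ks @ np.linalg.solve(K, y)
    # prior variance rbf(x, x) = 1, minus the reduction from observed data
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mu, np.sqrt(np.clip(var, 1e-12, None))

def f(x):                                            # toy objective (maximize)
    return np.sin(3 * x[..., 0]) - x[..., 0] ** 2

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 2.0, size=(3, 1)); y = f(X)
for _ in range(10):
    Xq = rng.uniform(-1.0, 2.0, size=(256, 1))       # candidate pool
    mu, sd = gp_posterior(Xq, X, y)
    z = (mu - y.max()) / sd
    ei = (mu - y.max()) * norm.cdf(z) + sd * norm.pdf(z)  # expected improvement
    x_next = Xq[np.argmax(ei)]
    X = np.vstack([X, x_next]); y = np.append(y, f(x_next))
print("best x:", X[np.argmax(y)], "best y:", y.max())
```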
no code implementations • 8 Oct 2022 • Ji Won Park, Samuel Stanton, Saeed Saremi, Andrew Watkins, Henri Dwyer, Vladimir Gligorijevic, Richard Bonneau, Stephen Ra, Kyunghyun Cho
Bayesian optimization offers a sample-efficient framework for navigating the exploration-exploitation trade-off in the vast design space of biological sequences.
2 code implementations • 23 Mar 2022 • Samuel Stanton, Wesley Maddox, Nate Gruver, Phillip Maffettone, Emily Delaney, Peyton Greenside, Andrew Gordon Wilson
Bayesian optimization (BayesOpt) is a gold standard for query-efficient continuous optimization.
1 code implementation • ICLR 2022 • Nate Gruver, Marc Finzi, Samuel Stanton, Andrew Gordon Wilson
Physics-inspired neural networks (NNs), such as Hamiltonian or Lagrangian NNs, dramatically outperform other learned dynamics models by leveraging strong inductive biases.
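A minimal PyTorch sketch of the Hamiltonian variant of this idea: parameterize a scalar H(q, p) with a small network and read the dynamics off Hamilton's equations, so energy-conserving structure is built into the model rather than learned. The architecture and sizes below are arbitrary assumptions.

```python
import torch

class HamiltonianNN(torch.nn.Module):
    """Learn a scalar Hamiltonian H(q, p); dynamics follow from
    dq/dt = dH/dp and dp/dt = -dH/dq via automatic differentiation."""
    def __init__(self, dim=1, hidden=64):
        super().__init__()
        self.H = torch.nn.Sequential(
            torch.nn.Linear(2 * dim, hidden), torch.nn.Tanh(),
            torch.nn.Linear(hidden, 1),
        )

    def time_derivative(self, q, p):
        q = q.detach().requires_grad_(True)
        p = p.detach().requires_grad_(True)
        H = self.H(torch.cat([q, p], dim=-1)).sum()
        dHdq, dHdp = torch.autograd.grad(H, (q, p), create_graph=True)
        return dHdp, -dHdq                # Hamilton's equations

model = HamiltonianNN()
dq_dt, dp_dt = model.time_derivative(torch.randn(8, 1), torch.randn(8, 1))
```

Training would fit (dq/dt, dp/dt) to observed trajectories; `create_graph=True` keeps the derivatives differentiable with respect to the network weights.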
1 code implementation • NeurIPS 2021 • Wesley J. Maddox, Samuel Stanton, Andrew Gordon Wilson
With a principled representation of uncertainty and closed-form posterior updates, Gaussian processes (GPs) are a natural choice for online decision making.
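As a generic illustration of why sequential GP updates are cheap (this is the textbook exact-GP update, not the paper's variational conditioning scheme), a new observation extends the Cholesky factor of the kernel matrix in O(n^2) instead of refactorizing in O(n^3):

```python
import numpy as np
from scipy.linalg import solve_triangular

def chol_append(L, k_new, k_ss, noise=1e-6):
    """Grow the lower Cholesky factor of K + noise*I by one row/column.
    L: (n, n) current factor; k_new: (n,) cross-covariances k(X, x_new);
    k_ss: scalar prior variance k(x_new, x_new)."""
    l = solve_triangular(L, k_new, lower=True)
    d = np.sqrt(k_ss + noise - l @ l)
    n = L.shape[0]
    L_new = np.zeros((n + 1, n + 1))
    L_new[:n, :n] = L
    L_new[n, :n] = l
    L_new[n, n] = d
    return L_new
```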
2 code implementations • NeurIPS 2021 • Samuel Stanton, Pavel Izmailov, Polina Kirichenko, Alexander A. Alemi, Andrew Gordon Wilson
Knowledge distillation is a popular technique for training a small student network to emulate a larger teacher model, such as an ensemble of networks.
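The standard soft-label objective studied in this line of work, as a short PyTorch sketch (the temperature and mixing weight are illustrative choices):

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Hinton-style distillation: KL divergence between temperature-softened
    teacher and student distributions, blended with the hard-label loss."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * T * T                             # T^2 keeps gradient scale comparable
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```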
2 code implementations • 2 Mar 2021 • Samuel Stanton, Wesley J. Maddox, Ian Delbridge, Andrew Gordon Wilson
Gaussian processes (GPs) provide a gold standard for performance in online settings, such as sample-efficient control and black-box optimization, where we need to update a posterior distribution as we acquire data sequentially.
1 code implementation • 28 Aug 2020 • Brandon Amos, Samuel Stanton, Denis Yarats, Andrew Gordon Wilson
For over a decade, model-based reinforcement learning has been seen as a way to leverage control-based domain knowledge to improve the sample-efficiency of reinforcement learning agents.
2 code implementations • ICML 2020 • Marc Finzi, Samuel Stanton, Pavel Izmailov, Andrew Gordon Wilson
The translation equivariance of convolutional layers enables convolutional neural networks to generalize well on image problems.
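A quick self-contained check of what that equivariance means: shifting the input and then convolving matches convolving and then shifting (circular padding makes the identity exact rather than holding only away from boundaries).

```python
import torch
import torch.nn.functional as F

x = torch.randn(1, 1, 32, 32)        # toy single-channel image
w = torch.randn(1, 1, 3, 3)          # arbitrary 3x3 filter

def conv(img):                       # 'same'-size circular convolution
    return F.conv2d(F.pad(img, (1, 1, 1, 1), mode="circular"), w)

def shift(img):                      # cyclic translation by (5, 3) pixels
    return torch.roll(img, shifts=(5, 3), dims=(-2, -1))

print(torch.allclose(conv(shift(x)), shift(conv(x)), atol=1e-5))  # True
```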