1 code implementation • 13 Jul 2022 • Gregory Benton, Wesley J. Maddox, Andrew Gordon Wilson
A broad class of stochastic volatility models is defined by systems of stochastic differential equations.
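As a concrete illustration of such a system, the sketch below simulates a Heston-style stochastic volatility model with the Euler–Maruyama scheme. The model choice, parameter values, and function name are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def simulate_heston_like(n_steps=1000, dt=1e-3, mu=0.05, kappa=2.0,
                         theta=0.04, xi=0.3, s0=1.0, v0=0.04, seed=0):
    """Euler-Maruyama simulation of a Heston-style SDE system:
        dS = mu * S dt + sqrt(v) * S dW1   (asset price)
        dv = kappa * (theta - v) dt + xi * sqrt(v) dW2   (variance)
    Returns the simulated price and variance paths.
    """
    rng = np.random.default_rng(seed)
    s, v = np.empty(n_steps + 1), np.empty(n_steps + 1)
    s[0], v[0] = s0, v0
    for t in range(n_steps):
        dw1, dw2 = rng.normal(0.0, np.sqrt(dt), size=2)
        # Clamp v at zero so the square root stays real under discretization.
        sqrt_v = np.sqrt(max(v[t], 0.0))
        s[t + 1] = s[t] + mu * s[t] * dt + sqrt_v * s[t] * dw1
        v[t + 1] = v[t] + kappa * (theta - v[t]) * dt + xi * sqrt_v * dw2
    return s, v
```

The variance process mean-reverts to `theta` at rate `kappa`, while `xi` controls the volatility of volatility; the clamp on `v` is a standard discretization fix, since Euler steps can push the variance slightly negative.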
1 code implementation • 23 Feb 2022 • Sanae Lotfi, Pavel Izmailov, Gregory Benton, Micah Goldblum, Andrew Gordon Wilson
We provide a partial remedy through a conditional marginal likelihood, which we show is more aligned with generalization, and practically valuable for large-scale hyperparameter learning, such as in deep kernel learning.
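For a Gaussian process, one simple way to compute a conditional marginal likelihood is via the identity log p(y2 | y1) = log p(y1, y2) - log p(y1). The toy sketch below illustrates that quantity for a GP with an RBF kernel; it is a hypothetical minimal example of the general idea, not the paper's procedure, and all function names and the fixed noise level are assumptions.

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel on 1-D inputs."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_log_marginal(x, y, lengthscale, variance, noise=1e-2):
    """Exact GP log marginal likelihood log p(y) under a zero-mean prior."""
    k = rbf_kernel(x, x, lengthscale, variance) + noise * np.eye(len(x))
    _, logdet = np.linalg.slogdet(k)
    alpha = np.linalg.solve(k, y)
    return -0.5 * (y @ alpha + logdet + len(x) * np.log(2 * np.pi))

def conditional_log_marginal(x1, y1, x2, y2, lengthscale, variance):
    """log p(y2 | y1) = log p(y1, y2) - log p(y1)."""
    x = np.concatenate([x1, x2])
    y = np.concatenate([y1, y2])
    return (gp_log_marginal(x, y, lengthscale, variance)
            - gp_log_marginal(x1, y1, lengthscale, variance))
```

Conditioning on part of the data in this way scores hyperparameters by how well they predict held-back points, which is the sense in which the conditional quantity tracks generalization more closely than the full marginal likelihood.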
1 code implementation • NeurIPS 2021 • Marc Finzi, Gregory Benton, Andrew Gordon Wilson
There is often a trade-off between building deep learning systems that are expressive enough to capture the nuances of reality and having the right inductive biases for efficient learning.
no code implementations • 1 Jan 2021 • Gregory Benton, Wesley Maddox, Andrew Gordon Wilson
Neural networks appear to have mysterious generalization properties when parameter counting is used as a proxy for complexity.
no code implementations • NeurIPS 2020 • Gregory Benton, Marc Finzi, Pavel Izmailov, Andrew Gordon Wilson
Invariances to translations have imbued convolutional neural networks with powerful generalization properties.
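The translation property behind this generalization behavior can be checked directly: a convolution commutes with translation, so shifting the input and then convolving gives the same result as convolving and then shifting. The minimal sketch below verifies this equivariance for a 1-D circular convolution; the example signal and kernel are arbitrary.

```python
import numpy as np

def circular_conv(signal, kernel):
    """1-D circular convolution via explicit summation."""
    n, m = len(signal), len(kernel)
    out = np.zeros(n)
    for i in range(n):
        for j in range(m):
            out[i] += kernel[j] * signal[(i - j) % n]
    return out

signal = np.array([0.0, 1.0, 2.0, 3.0, 0.0, 0.0])
kernel = np.array([1.0, -1.0])
shift = 2

# Convolving a shifted input equals shifting the convolved output:
lhs = circular_conv(np.roll(signal, shift), kernel)
rhs = np.roll(circular_conv(signal, kernel), shift)
assert np.allclose(lhs, rhs)  # translation equivariance holds
```

This equivariance is exactly what weight sharing in a convolutional layer buys; learning which other invariances to impose, rather than hand-specifying them, is the question the paper takes up.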