no code implementations • 29 Jun 2021 • Jacob Miller, Geoffrey Roeder, Tai-Danae Bradley
We first prove that applying decoherence to the entirety of a Born machine (BM) model converts it into a discrete undirected graphical model (UGM), and conversely, that any subgraph of a discrete UGM can be represented as a decohered BM.
no code implementations • 1 Jul 2020 • Geoffrey Roeder, Luke Metz, Diederik P. Kingma
Identifiability is a desirable property of a statistical model: it implies that the true model parameters may be estimated to any desired precision, given sufficient computational resources and data.
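A minimal toy illustration of the failure mode this property rules out (not from the paper): in the linear model y = (a·b)·x, only the product a·b is determined by the data, so the individual parameters a and b are non-identifiable and cannot be recovered to arbitrary precision no matter how much data is observed.

```python
import numpy as np

# Non-identifiable toy model: y = (a * b) * x. Any rescaling
# (a * c, b / c) produces exactly the same predictions, so the data
# cannot distinguish between distinct parameter settings.
x = np.linspace(0.0, 1.0, 5)

def predict(a, b, x):
    return a * b * x

y1 = predict(2.0, 3.0, x)   # (a, b) = (2, 3)
y2 = predict(6.0, 1.0, x)   # different parameters, same product a * b = 6
print(np.allclose(y1, y2))  # True: the likelihood is flat along a * b = const
```

An identifiable reparameterization here would be to fit the single parameter c = a·b directly.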
no code implementations • NeurIPS 2020 • Alex Beatson, Jordan T. Ash, Geoffrey Roeder, Tianju Xue, Ryan P. Adams
We use a neural network to model the stored potential energy in a component given boundary conditions.
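A hedged sketch of the basic surrogate idea, not the paper's actual architecture: a small MLP (hypothetical sizes and weights) maps a vector of boundary conditions for one component to a scalar stored potential energy.

```python
import numpy as np

# Illustrative surrogate only: a tiny MLP from boundary conditions to a
# scalar potential energy. Layer sizes and random weights are placeholders,
# not the trained model from the paper.
rng = np.random.default_rng(0)
W1, b1 = 0.1 * rng.normal(size=(16, 4)), np.zeros(16)
W2, b2 = 0.1 * rng.normal(size=(1, 16)), np.zeros(1)

def energy(boundary_conditions):
    h = np.tanh(W1 @ boundary_conditions + b1)  # hidden features
    return float(W2 @ h + b2)                   # scalar stored energy

bc = np.array([0.1, -0.2, 0.05, 0.3])           # hypothetical boundary values
print(energy(bc))
```

Because the surrogate outputs a differentiable scalar, derivatives of the energy with respect to the boundary conditions are available by automatic differentiation, which is what makes such surrogates composable across components.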
1 code implementation • 28 May 2019 • Geoffrey Roeder, Paul K. Grant, Andrew Phillips, Neil Dalchau, Edward Meeds
Our model class is a generalisation of nonlinear mixed-effects (NLME) dynamical systems, the statistical workhorse for many experimental sciences.
7 code implementations • ICLR 2018 • Will Grathwohl, Dami Choi, Yuhuai Wu, Geoffrey Roeder, David Duvenaud
Gradient-based optimization is the foundation of deep learning and reinforcement learning.
1 code implementation • NeurIPS 2017 • Geoffrey Roeder, Yuhuai Wu, David Duvenaud
We propose a simple and general variant of the standard reparameterized gradient estimator for the variational evidence lower bound.
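The variant can be sketched in a few lines of JAX (toy densities of my own choosing, not the paper's experiments): when forming the reparameterized ELBO gradient, stop the gradient through the variational parameters inside log q, which removes the score-function term. That term has zero expectation but contributes variance; notably, when q equals the target p, the resulting estimator's gradient is exactly zero.

```python
import jax
import jax.numpy as jnp

def log_p(z):                 # toy target: standard normal log-density
    return -0.5 * jnp.sum(z ** 2)

def log_q(z, mu, log_sigma):  # diagonal Gaussian variational log-density
    return jnp.sum(-0.5 * ((z - mu) / jnp.exp(log_sigma)) ** 2 - log_sigma)

def elbo_stl(params, eps):
    mu, log_sigma = params
    z = mu + jnp.exp(log_sigma) * eps        # reparameterized sample
    stopped = jax.lax.stop_gradient(params)  # keep only the path derivative
    return log_p(z) - log_q(z, *stopped)

# At mu = 0, log_sigma = 0 the variational q equals the target p, and the
# estimator's gradient vanishes for every sample eps ("sticks the landing").
eps = jnp.array([0.5, -1.0])
grad = jax.grad(elbo_stl)((jnp.zeros(2), jnp.zeros(2)), eps)
print(grad)  # both components are zero
```

The standard estimator would instead differentiate through the variational parameters of log q as well, adding a score-function term that is zero only in expectation.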