no code implementations • NeurIPS 2020 • Sana Tonekaboni, Shalmali Joshi, Kieran Campbell, David K. Duvenaud, Anna Goldenberg
Explanations of time series models are useful for high-stakes applications like healthcare but have received little attention in the machine learning literature.
1 code implementation • NeurIPS 2019 • Yulia Rubanova, Tian Qi Chen, David K. Duvenaud
Time series with non-uniform intervals occur in many applications, and are difficult to model using standard recurrent neural networks (RNNs).
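A hedged sketch of the idea this entry points at (the ODE-RNN / Latent ODE family): between irregularly spaced observations, evolve the hidden state continuously with an ODE solver instead of assuming unit time steps. The linear-decay dynamics and convex-mix cell below are toy stand-ins for learned networks, not the paper's architecture:

```python
import math

def ode_rnn_step(h, x, dt, decay=0.5, n_euler=10):
    """Evolve the hidden state h continuously for time dt, then
    update it with the new observation x (toy scalar version).

    The dynamics dh/dt = -decay * h stand in for a learned neural
    ODE; the convex mix stands in for an RNN cell update.
    """
    step = dt / n_euler
    for _ in range(n_euler):
        h = h + step * (-decay * h)   # Euler step on dh/dt = f(h)
    return 0.5 * h + 0.5 * x          # stand-in for the RNN update

# Irregularly spaced observations: (time, value) pairs.
obs = [(0.0, 1.0), (0.3, 0.8), (1.7, 0.2)]
h, t_prev = 0.0, 0.0
for t, x in obs:
    h = ode_rnn_step(h, x, dt=t - t_prev)
    t_prev = t
```

Because the state is integrated for exactly the elapsed time `t - t_prev`, the gap pattern of the observations directly shapes the hidden trajectory, which is what a fixed-step RNN cannot do.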
no code implementations • NeurIPS 2019 • Tian Qi Chen, David K. Duvenaud
Gradients of neural networks can be computed efficiently for any architecture, but some applications require computing differential operators with higher time complexity.
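An example of such an operator is the trace of a Jacobian, which naively costs one backward pass per input dimension. A common workaround in this setting is a stochastic estimator; below is a hedged sketch of Hutchinson's trace estimator using finite-difference Jacobian-vector products. The toy function `f` and all numerical choices are illustrative assumptions, not the paper's method:

```python
import random, math

def f(x):
    # toy vector field R^2 -> R^2
    return [math.sin(x[0]) + x[1], x[0] * x[1]]

def jvp_fd(f, x, v, eps=1e-5):
    # finite-difference Jacobian-vector product J(x) v
    fx = f(x)
    fxe = f([xi + eps * vi for xi, vi in zip(x, v)])
    return [(a - b) / eps for a, b in zip(fxe, fx)]

def hutchinson_trace(f, x, n_samples=2000, seed=0):
    # E[v^T J v] with Rademacher-distributed v estimates tr(J(x))
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        v = [rng.choice([-1.0, 1.0]) for _ in x]
        total += sum(vi * ji for vi, ji in zip(v, jvp_fd(f, x, v)))
    return total / n_samples

x = [0.7, -0.3]
est = hutchinson_trace(f, x)
# exact trace here is cos(0.7) + 0.7
```

Each sample needs only one Jacobian-vector product, so the cost per sample is a constant multiple of one evaluation of `f`, independent of the input dimension.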
no code implementations • Approximate Inference (AABI) Symposium 2019 • Xuechen Li, Ting-Kam Leonard Wong, Ricky T. Q. Chen, David K. Duvenaud
We derive reverse-mode (or adjoint) automatic differentiation for solutions of stochastic differential equations (SDEs), allowing time-efficient and constant-memory computation of pathwise gradients, a continuous-time analogue of the reparameterization trick.
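A minimal sketch of the pathwise-gradient idea for SDEs: differentiate through an Euler–Maruyama discretization by propagating a forward sensitivity along each path, reusing the same noise. The toy linear SDE and all parameters are assumptions; the paper's contribution is the reverse-mode, constant-memory adjoint, which this forward-mode sketch does not implement:

```python
import random, math

def pathwise_grad(theta=1.0, sigma=0.5, x0=1.0, T=1.0,
                  n_steps=100, n_paths=4000, seed=0):
    """Pathwise (reparameterized) gradient of E[X_T^2] w.r.t. theta
    for dX = -theta * X dt + sigma dW, via Euler-Maruyama plus a
    forward sensitivity S = dX/dtheta carried along each path."""
    rng = random.Random(seed)
    dt = T / n_steps
    grads = []
    for _ in range(n_paths):
        x, s = x0, 0.0
        for _ in range(n_steps):
            dw = rng.gauss(0.0, math.sqrt(dt))
            # sensitivity update: dS = (-X - theta*S) dt
            # (the noise term is theta-free, so it drops out)
            s = s + dt * (-x - theta * s)
            x = x + dt * (-theta * x) + sigma * dw
        grads.append(2.0 * x * s)   # chain rule through X_T^2
    return sum(grads) / n_paths

g = pathwise_grad()
```

Because the noise is held fixed while differentiating, this is the continuous-time analogue of the reparameterization trick mentioned in the abstract; the adjoint method computes the same quantity backwards in time without storing the whole path.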
no code implementations • NeurIPS 2016 • Eric Schulz, Josh Tenenbaum, David K. Duvenaud, Maarten Speekenbrink, Samuel J. Gershman
How do people learn about complex functional structure?
no code implementations • 16 Dec 2014 • Roger B. Grosse, David K. Duvenaud
Markov Chain Monte Carlo (MCMC) algorithms are a workhorse of probabilistic modeling and inference, but are difficult to debug, and are prone to silent failure if implemented naively.
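One established remedy for such silent failures is a joint-distribution test in the style of Geweke. Below is a hedged sketch on a toy conjugate model: if the posterior sampler is correct, forward samples and successive-conditional samples of the parameter share the same marginal. The model and thresholds are illustrative assumptions:

```python
import random, math

def geweke_check(n=20000, seed=0):
    """Geweke-style joint-distribution test for a sampler of
    p(theta | x) in the model theta ~ N(0,1), x | theta ~ N(theta, 1).
    Forward samples and successive-conditional samples of theta
    should agree in distribution if the sampler is correct."""
    rng = random.Random(seed)

    # forward sampler: prior, then likelihood
    fwd = []
    for _ in range(n):
        theta = rng.gauss(0.0, 1.0)
        x = rng.gauss(theta, 1.0)
        fwd.append(theta)

    # successive-conditional sampler: resample data, then posterior
    sc = []
    theta = 0.0
    for _ in range(n):
        x = rng.gauss(theta, 1.0)
        # correct conjugate posterior: N(x/2, sqrt(1/2))
        theta = rng.gauss(x / 2.0, math.sqrt(0.5))
        sc.append(theta)

    mean = lambda vals: sum(vals) / len(vals)
    return mean(fwd), mean(sc)   # both marginals should match N(0,1)

m_fwd, m_sc = geweke_check()
```

A bug in the posterior update (say, the wrong variance) makes the two marginals diverge, turning an otherwise silent implementation error into a detectable statistical discrepancy.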
no code implementations • NeurIPS 2012 • Michael Osborne, Roman Garnett, Zoubin Ghahramani, David K. Duvenaud, Stephen J. Roberts, Carl E. Rasmussen
Numerical integration is a key component of many problems in scientific computing, statistical modelling, and machine learning.
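This entry concerns Bayesian quadrature: placing a GP prior on the integrand and integrating the posterior mean in closed form. A minimal pure-Python sketch under assumed choices (RBF kernel, lengthscale 0.3, five fixed evaluation points rather than the paper's active selection of model-evidence evaluations):

```python
import math

def rbf(a, b, l=0.3):
    # squared-exponential kernel (lengthscale l is an assumption)
    return math.exp(-(a - b) ** 2 / (2 * l * l))

def kernel_integral(xi, l=0.3):
    # z_i = integral_0^1 k(x, x_i) dx, available in closed form
    s = l * math.sqrt(2.0)
    return l * math.sqrt(math.pi / 2.0) * (
        math.erf((1.0 - xi) / s) + math.erf(xi / s))

def solve(A, b):
    # small dense linear solve by Gaussian elimination with pivoting
    n = len(b)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    out = [0.0] * n
    for i in range(n - 1, -1, -1):
        out[i] = (M[i][n]
                  - sum(M[i][c] * out[c] for c in range(i + 1, n))) / M[i][i]
    return out

# Bayesian quadrature estimate of integral_0^1 sin(x) dx:
# posterior-mean integral = z^T K^{-1} f at the evaluation points.
xs = [0.1, 0.3, 0.5, 0.7, 0.9]
K = [[rbf(a, b) + (1e-8 if a == b else 0.0) for b in xs] for a in xs]
z = [kernel_integral(xi) for xi in xs]
f = [math.sin(xi) for xi in xs]
w = solve(K, z)
bq_estimate = sum(wi * fi for wi, fi in zip(w, f))
# true value is 1 - cos(1)
```

With only five evaluations the GP model of the smooth integrand already gives an accurate estimate; the paper's contribution is choosing those evaluation points actively to reduce posterior uncertainty about the integral.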