1 code implementation • 15 Jun 2022 • Tobias Höppe, Arash Mehrjou, Stefan Bauer, Didrik Nielsen, Andrea Dittadi
By varying the mask we condition on, the model is able to perform video prediction, infilling, and upsampling.
Ranked #2 on Video Generation on BAIR Robot Pushing
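As a rough illustration of the mask-conditioning idea (not the authors' code; the frame count and helper names below are hypothetical), a single binary mask over frames can encode all three tasks:

```python
import numpy as np

T = 8  # number of frames; mask[t] = 1 means frame t is observed, 0 means generated

def prediction_mask(n_context):
    # condition on the first n_context frames, generate the future
    m = np.zeros(T, dtype=int); m[:n_context] = 1
    return m

def infilling_mask(n_context):
    # condition on frames at both ends, generate the middle
    m = np.zeros(T, dtype=int); m[:n_context] = 1; m[-n_context:] = 1
    return m

def upsampling_mask(stride):
    # condition on every stride-th frame, generate the frames in between
    m = np.zeros(T, dtype=int); m[::stride] = 1
    return m

print(prediction_mask(2))  # [1 1 0 0 0 0 0 0]
print(infilling_mask(2))   # [1 1 0 0 0 0 1 1]
print(upsampling_mask(2))  # [1 0 1 0 1 0 1 0]
```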
1 code implementation • 30 May 2022 • Giorgio Giannone, Didrik Nielsen, Ole Winther
At test time, the model is able to generate samples from previously unseen classes conditioned on as few as 5 samples from that class.
2 code implementations • NeurIPS 2021 • Emiel Hoogeboom, Didrik Nielsen, Priyank Jaini, Patrick Forré, Max Welling
Argmax Flows are defined by a composition of a continuous distribution (such as a normalizing flow) and an argmax function.
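A minimal sketch of the forward direction of this composition; the Gaussian below is only a stand-in for the learned continuous density:

```python
import numpy as np

rng = np.random.default_rng(0)
K, n = 5, 4  # number of categories, number of positions

# Stand-in for samples from a learned continuous density on R^K
# (the paper composes a normalizing flow here; a standard Gaussian
# is used purely for illustration).
z = rng.normal(size=(n, K))

# The argmax deterministically collapses each continuous vector to a
# category, so a density on R^K induces a distribution over {0,...,K-1}.
x = z.argmax(axis=-1)
print(x)
```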
1 code implementation • 4 Feb 2021 • Priyank Jaini, Didrik Nielsen, Max Welling
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
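For reference, a bare-bones Hybrid (Hamiltonian) Monte Carlo step on a toy Gaussian target, with leapfrog integration and a Metropolis correction; the step size `eps` and path length `L` here are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target: standard Gaussian, potential U(q) = 0.5 q.q, grad U(q) = q.
def U(q): return 0.5 * q @ q
def grad_U(q): return q

def hmc_step(q, eps=0.1, L=20):
    p = rng.normal(size=q.shape)                 # resample momentum
    q_new, p_new = q.copy(), p.copy()
    # Leapfrog integration of the Hamiltonian dynamics
    p_new = p_new - 0.5 * eps * grad_U(q_new)    # initial half step
    for i in range(L):
        q_new = q_new + eps * p_new              # full position step
        if i < L - 1:
            p_new = p_new - eps * grad_U(q_new)  # full momentum step
    p_new = p_new - 0.5 * eps * grad_U(q_new)    # final half step
    # Metropolis accept/reject on the total energy H = U(q) + 0.5 p.p
    dH = U(q_new) + 0.5 * p_new @ p_new - U(q) - 0.5 * p @ p
    return q_new if np.log(rng.uniform()) < -dH else q

q = np.zeros(3)
for _ in range(100):
    q = hmc_step(q)
print(q)
```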
no code implementations • AABI Symposium 2021 • Emiel Hoogeboom, Didrik Nielsen, Priyank Jaini, Patrick Forré, Max Welling
This paper introduces a new method to define and train continuous distributions such as normalizing flows directly on categorical data, for example, text and image segmentation.
3 code implementations • NeurIPS 2020 • Didrik Nielsen, Priyank Jaini, Emiel Hoogeboom, Ole Winther, Max Welling
Normalizing flows and variational autoencoders are powerful generative models that can represent complicated density functions.
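As a one-dimensional reminder of how a flow represents a density (illustrative, not from the paper), the change-of-variables formula for an affine map:

```python
import numpy as np

# Change of variables for an invertible map x = f(z), z ~ N(0, 1):
#   log p_X(x) = log p_Z(f^{-1}(x)) + log |d f^{-1}/dx|
# Illustrated with the affine flow x = a*z + b.
a, b = 2.0, -1.0

def log_px(x):
    z = (x - b) / a                             # inverse map
    log_pz = -0.5 * (z**2 + np.log(2 * np.pi))  # standard normal base density
    log_det = -np.log(abs(a))                   # |dz/dx| = 1/|a|
    return log_pz + log_det

print(log_px(0.5))  # log density of x = 0.5 under N(b, a^2)
```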
1 code implementation • NeurIPS 2020 • Didrik Nielsen, Ole Winther
Flow models have recently made great progress at modeling ordinal discrete data such as images and audio.
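One standard way flows handle ordinal discrete data is uniform dequantization, sketched below; the paper studies the gap this trick introduces, so treat the snippet only as background:

```python
import numpy as np

rng = np.random.default_rng(0)

x = rng.integers(0, 256, size=5)  # 8-bit pixel values in {0, ..., 255}

# Uniform dequantization: add u ~ U[0, 1) so the discrete values fill the
# unit intervals between integers; a continuous flow trained on y gives a
# lower bound on the likelihood of the discrete x.
y = (x + rng.uniform(size=x.shape)) / 256.0
print(x)
print(y)
```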
2 code implementations • NeurIPS 2018 • Aaron Mishkin, Frederik Kunstner, Didrik Nielsen, Mark Schmidt, Mohammad Emtiyaz Khan
Uncertainty estimation in large deep-learning models is a computationally challenging task, where it is difficult to form even a Gaussian approximation to the posterior distribution.
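The paper's method (SLANG) builds a "diagonal plus low-rank" Gaussian approximation. Purely as background, the sketch below shows why that structure is cheap, by sampling from a Gaussian whose covariance is diagonal plus low rank without ever forming the full matrix; all names and sizes here are made up:

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 10, 2                        # parameter dimension, rank of correction

mu = np.zeros(d)                    # approximate posterior mean
D = rng.uniform(0.1, 1.0, size=d)   # diagonal variances
V = 0.3 * rng.normal(size=(d, r))   # low-rank factor

# Draw from N(mu, diag(D) + V V^T) without forming the d x d covariance:
# O(d*r) work per sample instead of O(d^2).
def sample():
    return mu + np.sqrt(D) * rng.normal(size=d) + V @ rng.normal(size=r)

S = np.stack([sample() for _ in range(20000)])
print(np.var(S[:, 0]), D[0] + V[0] @ V[0])  # empirical vs target variance
```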
1 code implementation • 12 Jul 2018 • Mohammad Emtiyaz Khan, Didrik Nielsen
Bayesian inference plays an important role in advancing machine learning, but faces computational challenges when applied to complex models such as deep neural networks.
3 code implementations • ICML 2018 • Mohammad Emtiyaz Khan, Didrik Nielsen, Voot Tangkaratt, Wu Lin, Yarin Gal, Akash Srivastava
Uncertainty computation in deep learning is essential to design robust and reliable systems.
no code implementations • 15 Nov 2017 • Mohammad Emtiyaz Khan, Wu Lin, Voot Tangkaratt, Zuozhu Liu, Didrik Nielsen
We present the Variational Adaptive-Newton (VAN) method, a black-box optimization method especially suitable for explorative-learning tasks such as active learning and reinforcement learning.