Search Results for author: Ole Winther

Found 47 papers, 20 papers with code

Few-Shot Diffusion Models

no code implementations30 May 2022 Giorgio Giannone, Didrik Nielsen, Ole Winther

At test time, the model is able to generate samples from previously unseen classes conditioned on as few as 5 samples from that class.

Denoising Few-Shot Learning

Inductive Biases for Object-Centric Representations in the Presence of Complex Textures

no code implementations18 Apr 2022 Samuele Papa, Ole Winther, Andrea Dittadi

Understanding which inductive biases could be helpful for the unsupervised learning of object-centric representations of natural scenes is challenging.

Style Transfer

DermX: an end-to-end framework for explainable automated dermatological diagnosis

no code implementations14 Feb 2022 Raluca Jalaboi, Frederik Faye, Mauricio Orbes-Arteaga, Dan Jørgensen, Ole Winther, Alfiia Galimzianova

We assess the explanation plausibility in terms of identification and localization, by comparing model-selected with dermatologist-selected explanations, and gradient-weighted class-activation maps with dermatologist explanation maps.

Hierarchical Few-Shot Generative Models

1 code implementation23 Oct 2021 Giorgio Giannone, Ole Winther

A few-shot generative model should be able to generate data from a distribution by only observing a limited set of examples.

Few-Shot Learning Out-of-Distribution Generalization

Calibrated Uncertainty for Molecular Property Prediction using Ensembles of Message Passing Neural Networks

no code implementations13 Jul 2021 Jonas Busk, Peter Bjørn Jørgensen, Arghya Bhowmik, Mikkel N. Schmidt, Ole Winther, Tejs Vegge

In this work we extend a message passing neural network designed specifically for predicting properties of molecules and materials with a calibrated probabilistic predictive distribution.

Decision Making Molecular Property Prediction
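A common way to obtain a probabilistic predictive distribution from an ensemble of regressors, as the abstract describes, is to moment-match the mixture of per-member Gaussians. The sketch below uses the standard deep-ensembles recipe with made-up toy numbers; it is illustrative and not necessarily the paper's exact calibration procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ensemble: 5 members, each predicting a Gaussian (mean, variance) for 3 molecules.
member_means = rng.normal(loc=1.0, scale=0.1, size=(5, 3))
member_vars = rng.uniform(0.01, 0.05, size=(5, 3))

# Moment-match the mixture of Gaussians into one predictive Gaussian per molecule:
# the variance splits into an aleatoric part (mean member variance)
# and an epistemic part (disagreement between member means).
pred_mean = member_means.mean(axis=0)
pred_var = member_vars.mean(axis=0) + member_means.var(axis=0)
```

The epistemic term vanishes when all members agree, so the predictive uncertainty shrinks exactly where the ensemble is confident.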

The Role of Pretrained Representations for the OOD Generalization of Reinforcement Learning Agents

no code implementations ICLR 2022 Andrea Dittadi, Frederik Träuble, Manuel Wüthrich, Felix Widmaier, Peter Gehler, Ole Winther, Francesco Locatello, Olivier Bachem, Bernhard Schölkopf, Stefan Bauer

By training 240 representations and over 10,000 reinforcement learning (RL) policies on a simulated robotic setup, we evaluate to what extent different properties of pretrained VAE-based representations affect the OOD generalization of downstream agents.

Reinforcement Learning Representation Learning

Generalization and Robustness Implications in Object-Centric Learning

1 code implementation1 Jul 2021 Andrea Dittadi, Samuele Papa, Michele De Vita, Bernhard Schölkopf, Ole Winther, Francesco Locatello

The idea behind object-centric representation learning is that natural scenes can better be modeled as compositions of objects and their relations as opposed to distributed representations.

Inductive Bias Representation Learning +1

Optimal Variance Control of the Score-Function Gradient Estimator for Importance-Weighted Bounds

1 code implementation NeurIPS 2020 Valentin Liévin, Andrea Dittadi, Anders Christensen, Ole Winther

Empirically, for the training of both continuous and discrete generative models, the proposed method yields superior variance reduction, resulting in an SNR for IWAE that increases with $K$ without relying on the reparameterization trick.

On the Transfer of Disentangled Representations in Realistic Settings

no code implementations ICLR 2021 Andrea Dittadi, Frederik Träuble, Francesco Locatello, Manuel Wüthrich, Vaibhav Agrawal, Ole Winther, Stefan Bauer, Bernhard Schölkopf

Learning meaningful representations that disentangle the underlying structure of the data generating process is considered to be of key importance in machine learning.


Optimal Variance Control of the Score Function Gradient Estimator for Importance Weighted Bounds

1 code implementation5 Aug 2020 Valentin Liévin, Andrea Dittadi, Anders Christensen, Ole Winther

This paper introduces novel results for the score function gradient estimator of the importance weighted variational bound (IWAE).
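For reference, the importance-weighted bound whose score-function gradient the paper analyses is the standard IWAE objective with $K$ importance samples:

```latex
\mathcal{L}_K(x) \;=\; \mathbb{E}_{z_1,\dots,z_K \sim q_\phi(z \mid x)}
\!\left[\log \frac{1}{K}\sum_{k=1}^{K}
\frac{p_\theta(x, z_k)}{q_\phi(z_k \mid x)}\right]
\;\le\; \log p_\theta(x)
```

The bound tightens as $K$ grows, which is why controlling the estimator's signal-to-noise ratio as a function of $K$ matters.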

SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows

2 code implementations NeurIPS 2020 Didrik Nielsen, Priyank Jaini, Emiel Hoogeboom, Ole Winther, Max Welling

Normalizing flows and variational autoencoders are powerful generative models that can represent complicated density functions.

Closing the Dequantization Gap: PixelCNN as a Single-Layer Flow

1 code implementation NeurIPS 2020 Didrik Nielsen, Ole Winther

Flow models have recently made great progress at modeling ordinal discrete data such as images and audio.

Towards Hierarchical Discrete Variational Autoencoders

no code implementations AABI Symposium 2019 Valentin Liévin, Andrea Dittadi, Lars Maaløe, Ole Winther

We introduce the Hierarchical Discrete Variational Autoencoder (HD-VAE): a hierarchy of variational memory layers.

LAVAE: Disentangling Location and Appearance

no code implementations25 Sep 2019 Andrea Dittadi, Ole Winther

We propose a probabilistic generative model for unsupervised learning of structured, interpretable, object-based representations of visual scenes.

Variational Inference

BIVA: A Very Deep Hierarchy of Latent Variables for Generative Modeling

2 code implementations NeurIPS 2019 Lars Maaløe, Marco Fraccaro, Valentin Liévin, Ole Winther

In this paper we close the performance gap by constructing VAE models that can effectively utilize a deep hierarchy of stochastic variables and model complex covariance structures.

Ranked #17 on Image Generation on ImageNet 32x32 (bpd metric)

Anomaly Detection Image Generation

Attend, Copy, Parse -- End-to-end information extraction from documents

2 code implementations18 Dec 2018 Rasmus Berg Palm, Florian Laws, Ole Winther

We believe our proposed architecture can be used on many real life information extraction tasks where word classification cannot be used due to a lack of the required word-level labels.

Classification General Classification

Feature Map Variational Auto-Encoders

no code implementations ICLR 2018 Lars Maaløe, Ole Winther

There have been multiple attempts with variational auto-encoders (VAE) to learn powerful global representations of complex data using a combination of latent stochastic variables and an autoregressive model over the dimensions of the data.

Image Generation

Recurrent Relational Networks for complex relational reasoning

1 code implementation ICLR 2018 Rasmus Berg Palm, Ulrich Paquet, Ole Winther

Humans possess an ability to abstractly reason about objects and their interactions, an ability not shared with state-of-the-art deep learning models.

Relational Reasoning

Recurrent Relational Networks

6 code implementations NeurIPS 2018 Rasmus Berg Palm, Ulrich Paquet, Ole Winther

We achieve state of the art results on the bAbI textual question-answering dataset with the recurrent relational network, consistently solving 20/20 tasks.

Ranked #3 on Question Answering on bAbi (Mean Error Rate metric)

Question Answering Relational Reasoning
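The recurrent relational network iterates rounds of message passing over a graph of objects. A minimal NumPy sketch of the idea is below; for brevity the learned message and update functions are single tanh layers with random weights (the paper uses MLPs and a recurrent unit), and the graph is a fully connected toy example.

```python
import numpy as np

rng = np.random.default_rng(0)

N, H = 4, 8                                  # nodes and hidden size
x = rng.normal(size=(N, H))                  # fixed node input features
h = x.copy()                                 # node hidden states, initialised from inputs
Wm = rng.normal(size=(2 * H, H)) * 0.1       # message function (one linear layer here)
Wu = rng.normal(size=(3 * H, H)) * 0.1       # node update (stand-in for the recurrent unit)

def step(h):
    msgs = np.zeros_like(h)
    for i in range(N):                       # fully connected toy graph
        for j in range(N):
            if i != j:
                # the message from j to i depends on both endpoint states
                msgs[i] += np.tanh(np.concatenate([h[i], h[j]]) @ Wm)
    # each node updates from its input, current state, and summed incoming messages
    return np.tanh(np.concatenate([x, h, msgs], axis=1) @ Wu)

for _ in range(4):                           # several rounds of relational reasoning
    h = step(h)
```

Running several steps lets information propagate multiple hops, which is what allows multi-step reasoning such as chaining facts in bAbI.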

A Disentangled Recognition and Nonlinear Dynamics Model for Unsupervised Learning

1 code implementation NeurIPS 2017 Marco Fraccaro, Simon Kamronn, Ulrich Paquet, Ole Winther

This paper takes a step towards temporal reasoning in a dynamically changing video, not in the pixel space that constitutes its frames, but in a latent space that describes the non-linear dynamics of the objects in its world.


Hash Embeddings for Efficient Word Representations

no code implementations NeurIPS 2017 Dan Svenstrup, Jonas Meinertz Hansen, Ole Winther

In hash embeddings each token is represented by $k$ $d$-dimensional embedding vectors and one $k$-dimensional weight vector.
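The mechanism in the sentence above can be sketched in a few lines: $k$ hash functions each select one component vector from a small shared pool, and the token's $k$-dimensional weight vector combines them. All sizes and the `hash_embedding` helper below are illustrative choices, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

K, D = 2, 4          # k component vectors per token, each d-dimensional
B = 50               # size of the shared pool of component vectors
V = 1000             # hash buckets for the per-token importance weights

pool = rng.normal(size=(B, D))       # component embeddings, shared across tokens
weights = rng.normal(size=(V, K))    # k-dimensional importance weights per bucket

def hash_embedding(token):
    # k independent hash functions each pick one component vector from the pool
    rows = [hash((token, i)) % B for i in range(K)]
    w = weights[hash(token) % V]     # the token's k importance weights
    return w @ pool[rows]            # weighted sum -> one d-dimensional embedding

vec = hash_embedding("winther")
```

The parameter count is $B \cdot d + V \cdot k$ rather than one $d$-dimensional vector per vocabulary word, which is the source of the memory savings.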

CloudScan - A configuration-free invoice analysis system using recurrent neural networks

1 code implementation24 Aug 2017 Rasmus Berg Palm, Ole Winther, Florian Laws

We describe a recurrent neural network model that can capture long range context and compare it to a baseline logistic regression model corresponding to the current CloudScan production system.

End-to-End Information Extraction without Token-Level Supervision

1 code implementation WS 2017 Rasmus Berg Palm, Dirk Hovy, Florian Laws, Ole Winther

End-to-end (E2E) models, which take raw text as input and produce the desired output directly, need not depend on token-level labels.

Semi-Supervised Generation with Cluster-aware Generative Models

no code implementations3 Apr 2017 Lars Maaløe, Marco Fraccaro, Ole Winther

Deep generative models trained with large amounts of unlabelled data have proven to be powerful within the domain of unsupervised learning.

General Classification

Self-Averaging Expectation Propagation

no code implementations23 Aug 2016 Burak Çakmak, Manfred Opper, Bernard H. Fleury, Ole Winther

Our approach extends the framework of (generalized) approximate message passing, which assumes zero-mean i.i.d. entries of the measurement matrix, to a general class of random matrix ensembles.

Bayesian Inference

An Adaptive Resample-Move Algorithm for Estimating Normalizing Constants

no code implementations7 Apr 2016 Marco Fraccaro, Ulrich Paquet, Ole Winther

The estimation of normalizing constants is a fundamental step in probabilistic model comparison.

Auxiliary Deep Generative Models

1 code implementation17 Feb 2016 Lars Maaløe, Casper Kaae Sønderby, Søren Kaae Sønderby, Ole Winther

The auxiliary variables leave the generative model unchanged but make the variational distribution more expressive.

Autoencoding beyond pixels using a learned similarity metric

28 code implementations31 Dec 2015 Anders Boesen Lindbo Larsen, Søren Kaae Sønderby, Hugo Larochelle, Ole Winther

We present an autoencoder that leverages learned representations to better measure similarities in data space.
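The key idea is to measure reconstruction error in the feature space of a learned network (in the paper, a GAN discriminator's hidden layer) rather than element-wise in pixel space. The toy sketch below contrasts the two losses; the single random tanh layer merely stands in for the learned discriminator features.

```python
import numpy as np

rng = np.random.default_rng(0)

W = rng.normal(size=(16, 8)) * 0.1            # stand-in for a discriminator hidden layer

def disc_features(img):
    # plays the role of the discriminator's l-th layer activations
    return np.tanh(img @ W)

x = rng.normal(size=(1, 16))                  # "image" (flattened toy example)
x_rec = x + 0.01 * rng.normal(size=(1, 16))   # decoder reconstruction

pixel_loss = np.mean((x - x_rec) ** 2)        # element-wise similarity
feature_loss = np.mean((disc_features(x) - disc_features(x_rec)) ** 2)
```

Because the feature map is learned on real data, small perceptually irrelevant pixel differences can map to nearly identical features, giving a similarity measure better aligned with human judgment.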


Recurrent Spatial Transformer Networks

2 code implementations17 Sep 2015 Søren Kaae Sønderby, Casper Kaae Sønderby, Lars Maaløe, Ole Winther

We investigate different down-sampling factors (ratio of pixel in input and output) for the SPN and show that the RNN-SPN model is able to down-sample the input images without deteriorating performance.

Bayesian inference for spatio-temporal spike-and-slab priors

no code implementations15 Sep 2015 Michael Riis Andersen, Aki Vehtari, Ole Winther, Lars Kai Hansen

In this work, we address the problem of solving a series of underdetermined linear inverse problems subject to a sparsity constraint.

Bayesian Inference

Spatio-temporal Spike and Slab Priors for Multiple Measurement Vector Problems

no code implementations19 Aug 2015 Michael Riis Andersen, Ole Winther, Lars Kai Hansen

We are interested in solving the multiple measurement vector (MMV) problem for instances where the underlying sparsity pattern exhibits spatio-temporal structure, motivated by the electroencephalogram (EEG) source localization problem.


Deep Belief Nets for Topic Modeling

no code implementations18 Jan 2015 Lars Maaløe, Morten Arngren, Ole Winther

Applying traditional collaborative filtering to digital publishing is challenging because user data is very sparse due to the high volume of documents relative to the number of users.

Collaborative Filtering

Protein Secondary Structure Prediction with Long Short Term Memory Networks

no code implementations25 Dec 2014 Søren Kaae Sønderby, Ole Winther

Recurrent neural networks are a generalization of feed-forward neural networks that naturally handle sequential data.

Protein Secondary Structure Prediction

Bayesian leave-one-out cross-validation approximations for Gaussian latent variable models

no code implementations23 Dec 2014 Aki Vehtari, Tommi Mononen, Ville Tolvanen, Tuomas Sivula, Ole Winther

The future predictive performance of a Bayesian model can be estimated using Bayesian cross-validation.
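The quantity being approximated is, in its standard form, the leave-one-out expected log predictive density:

```latex
\mathrm{elpd}_{\mathrm{loo}}
\;=\; \sum_{i=1}^{n} \log p(y_i \mid y_{-i}),
\qquad
p(y_i \mid y_{-i}) \;=\; \int p(y_i \mid \theta)\, p(\theta \mid y_{-i})\, d\theta
```

Computing each $p(y_i \mid y_{-i})$ exactly requires refitting the model $n$ times, which is what the approximations for Gaussian latent variable models avoid.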

Bayesian Inference for Structured Spike and Slab Priors

no code implementations NeurIPS 2014 Michael R. Andersen, Ole Winther, Lars K. Hansen

Sparse signal recovery addresses the problem of solving underdetermined linear inverse problems subject to a sparsity constraint.

Bayesian Inference
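In its standard form, the setup above is an underdetermined linear model with a spike-and-slab prior on each coefficient:

```latex
y = A x + e, \quad A \in \mathbb{R}^{m \times n},\; m < n,
\qquad
p(x_i \mid \lambda_i) = (1 - \lambda_i)\,\delta(x_i)
  + \lambda_i\, \mathcal{N}(x_i;\, 0, \tau)
```

The "structured" variant couples the support probabilities $\lambda_i$ across coefficients (e.g. through a latent Gaussian field) rather than treating them as independent; the exact coupling used in the paper is not reproduced here.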

Scalable Bayesian Modelling of Paired Symbols

no code implementations9 Sep 2014 Ulrich Paquet, Noam Koenigstein, Ole Winther

We present a novel, scalable and Bayesian approach to modelling the occurrence of pairs of symbols (i, j) drawn from a large vocabulary.

Perturbative Corrections for Approximate Inference in Gaussian Latent Variable Models

no code implementations12 Jan 2013 Manfred Opper, Ulrich Paquet, Ole Winther

A perturbative expansion is made of the exact but intractable correction, and can be applied to the model's partition function and other moments of interest.

Bayesian Sparse Factor Models and DAGs Inference and Comparison

no code implementations NeurIPS 2009 Ricardo Henao, Ole Winther

In this paper we present a novel approach to learn directed acyclic graphs (DAG) and factor models within the same framework while also allowing for model comparison between them.

Improving on Expectation Propagation

no code implementations NeurIPS 2008 Manfred Opper, Ulrich Paquet, Ole Winther

We develop a series of corrections to Expectation Propagation (EP), which is one of the most popular methods for approximate probabilistic inference.
