Search Results for author: Ole Winther

Found 62 papers, 31 papers with code

Coherent energy and force uncertainty in deep learning force fields

no code implementations • 7 Dec 2023 • Peter Bjørn Jørgensen, Jonas Busk, Ole Winther, Mikkel N. Schmidt

In machine learning energy potentials for atomic systems, forces are commonly obtained as the negative derivative of the energy function with respect to atomic positions.
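The force-as-negative-gradient convention described above can be sketched in a few lines. The snippet below is illustrative only: it uses a toy Lennard-Jones pair energy in place of a learned potential, and central finite differences in place of the automatic differentiation an ML force field would use; all names are hypothetical.

```python
import numpy as np

def lj_energy(positions, epsilon=1.0, sigma=1.0):
    """Total Lennard-Jones energy of a set of atoms (toy stand-in for a learned energy)."""
    e = 0.0
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            r = np.linalg.norm(positions[i] - positions[j])
            e += 4 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)
    return e

def forces(positions, h=1e-5):
    """Forces as the negative derivative of the energy w.r.t. atomic positions,
    approximated here by central finite differences."""
    f = np.zeros_like(positions)
    for idx in np.ndindex(positions.shape):
        p_plus = positions.copy(); p_plus[idx] += h
        p_minus = positions.copy(); p_minus[idx] -= h
        f[idx] = -(lj_energy(p_plus) - lj_energy(p_minus)) / (2 * h)
    return f

# Two atoms separated beyond the potential minimum: the force is attractive
# and, by construction, obeys Newton's third law (f_0 = -f_1).
pos = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])
f = forces(pos)
print(f[0], f[1])
```

Because the force is an exact (here, numerical) gradient of one scalar energy, the two per-atom forces are automatically equal and opposite, which is one motivation for deriving forces from the energy rather than predicting them independently.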

DiffEnc: Variational Diffusion with a Learned Encoder

1 code implementation • 30 Oct 2023 • Beatrix M. G. Nielsen, Anders Christensen, Andrea Dittadi, Ole Winther

Diffusion models may be viewed as hierarchical variational autoencoders (VAEs) with two improvements: parameter sharing for the conditional distributions in the generative process and efficient computation of the loss as independent terms over the hierarchy.

Image-free Classifier Injection for Zero-Shot Classification

1 code implementation ICCV 2023 Anders Christensen, Massimiliano Mancini, A. Sophia Koepke, Ole Winther, Zeynep Akata

We achieve this with our proposed Image-free Classifier Injection with Semantics (ICIS) that injects classifiers for new, unseen classes into pre-trained classification models in a post-hoc fashion without relying on image data.

Classification Image Classification +1

Addressing caveats of neural persistence with deep graph persistence

1 code implementation • 20 Jul 2023 • Leander Girrbach, Anders Christensen, Ole Winther, Zeynep Akata, A. Sophia Koepke

Whilst this captures useful information for linear classifiers, we find that no relevant spatial structure is present in later layers of deep neural networks, making neural persistence roughly equivalent to the variance of weights.

Topological Data Analysis

Implicit Transfer Operator Learning: Multiple Time-Resolution Surrogates for Molecular Dynamics

no code implementations • 29 May 2023 • Mathias Schreiner, Ole Winther, Simon Olsson

Computing properties of molecular systems relies on estimating expectations of the (unnormalized) Boltzmann distribution.

Denoising Operator learning
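The setting in the abstract above — expectations under a distribution known only up to a normalizing constant — can be illustrated with self-normalized importance sampling on a toy one-dimensional energy. This is a generic baseline sketch, not the paper's implicit transfer operator method; the quadratic energy and Gaussian proposal are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
beta = 1.0
energy = lambda x: 0.5 * x ** 2      # toy energy; Boltzmann weight exp(-beta * E)
observable = lambda x: x ** 2        # expectation to estimate

# Self-normalized importance sampling: since exp(-beta * E) is unnormalized,
# the importance weights are normalized by their own sum.
proposal = rng.normal(0.0, 2.0, size=200_000)            # q = N(0, 2^2)
log_q = -0.5 * (proposal / 2.0) ** 2 - np.log(2.0 * np.sqrt(2 * np.pi))
log_w = -beta * energy(proposal) - log_q
w = np.exp(log_w - log_w.max())                          # stabilize before exponentiating
estimate = np.sum(w * observable(proposal)) / np.sum(w)
print(estimate)
```

For this toy energy the Boltzmann distribution is a unit Gaussian, so the estimate should land close to the exact value <x^2> = 1; the normalizing constant is never computed.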

Graph Neural Network Interatomic Potential Ensembles with Calibrated Aleatoric and Epistemic Uncertainty on Energy and Forces

no code implementations • 10 May 2023 • Jonas Busk, Mikkel N. Schmidt, Ole Winther, Tejs Vegge, Peter Bjørn Jørgensen

The proposed method considers both epistemic and aleatoric uncertainty and the total uncertainties are recalibrated post hoc using a nonlinear scaling function to achieve good calibration on previously unseen data, without loss of predictive accuracy.

Dermatological Diagnosis Explainability Benchmark for Convolutional Neural Networks

1 code implementation • 23 Feb 2023 • Raluca Jalaboi, Ole Winther, Alfiia Galimzianova

We pre-trained all architectures on a clinical skin disease dataset, and fine-tuned them on a DermXDB subset.

Benchmarking Medical Diagnosis

Unifying Molecular and Textual Representations via Multi-task Language Modelling

1 code implementation • 29 Jan 2023 • Dimitrios Christofidellis, Giorgio Giannone, Jannis Born, Ole Winther, Teodoro Laino, Matteo Manica

Here, we propose the first multi-domain, multi-task language model that can solve a wide range of tasks in both the chemical and natural language domains.

Language Modelling Molecule Captioning +2

Variational Open-Domain Question Answering

2 code implementations • 23 Sep 2022 • Valentin Liévin, Andreas Geert Motzfeldt, Ida Riis Jensen, Ole Winther

Retrieval-augmented models have proven to be effective in natural language processing tasks, yet there remains a lack of research on their optimization using variational inference.

Language Modelling Multiple-choice +4

Explainable Image Quality Assessments in Teledermatological Photography

no code implementations • 10 Sep 2022 • Raluca Jalaboi, Ole Winther, Alfiia Galimzianova

For poor image quality explanations, our method obtains F1-scores of between 0.37 ± 0.01 and 0.70 ± 0.01, similar to the inter-rater pairwise F1-score of between 0.24 ± 0.15 and 0.83 ± 0.06.

Image Quality Assessment

Transition1x -- a Dataset for Building Generalizable Reactive Machine Learning Potentials

no code implementations • 25 Jul 2022 • Mathias Schreiner, Arghya Bhowmik, Tejs Vegge, Jonas Busk, Ole Winther

In this work, we present the dataset Transition1x containing 9.6 million Density Functional Theory (DFT) calculations of forces and energies of molecular configurations on and around reaction pathways at the wB97x/6-31G(d) level of theory.

BIG-bench Machine Learning

NeuralNEB -- Neural Networks can find Reaction Paths Fast

no code implementations • 20 Jul 2022 • Mathias Schreiner, Arghya Bhowmik, Tejs Vegge, Peter Bjørn Jørgensen, Ole Winther

We also compare with and outperform Density Functional based Tight Binding (DFTB) on both accuracy and computational cost.

Can large language models reason about medical questions?

1 code implementation • 17 Jul 2022 • Valentin Liévin, Christoffer Egeberg Hother, Andreas Geert Motzfeldt, Ole Winther

Although large language models (LLMs) often produce impressive outputs, it remains unclear how they perform in real-world scenarios requiring strong reasoning skills and expert domain knowledge.

Multiple-choice Multiple Choice Question Answering (MCQA) +3

Few-Shot Diffusion Models

1 code implementation • 30 May 2022 • Giorgio Giannone, Didrik Nielsen, Ole Winther

At test time, the model is able to generate samples from previously unseen classes conditioned on as few as 5 samples from that class.

Denoising Few-Shot Learning

Inductive Biases for Object-Centric Representations in the Presence of Complex Textures

no code implementations • 18 Apr 2022 • Samuele Papa, Ole Winther, Andrea Dittadi

Understanding which inductive biases could be helpful for the unsupervised learning of object-centric representations of natural scenes is challenging.

Object Segmentation +1

DermX: an end-to-end framework for explainable automated dermatological diagnosis

1 code implementation • 14 Feb 2022 • Raluca Jalaboi, Frederik Faye, Mauricio Orbes-Arteaga, Dan Jørgensen, Ole Winther, Alfiia Galimzianova

We assess the explanation performance in terms of identification and localization by comparing model-selected with dermatologist-selected explanations, and gradient-weighted class-activation maps with dermatologist explanation maps, respectively.

SCHA-VAE: Hierarchical Context Aggregation for Few-Shot Generation

1 code implementation • 23 Oct 2021 • Giorgio Giannone, Ole Winther

In few-shot learning, the model is trained on data from many sets drawn from distributions that share some underlying properties, such as sets of characters from different alphabets or objects from different categories.

Few-Shot Learning Out-of-Distribution Generalization

Calibrated Uncertainty for Molecular Property Prediction using Ensembles of Message Passing Neural Networks

no code implementations • 13 Jul 2021 • Jonas Busk, Peter Bjørn Jørgensen, Arghya Bhowmik, Mikkel N. Schmidt, Ole Winther, Tejs Vegge

In this work we extend a message passing neural network designed specifically for predicting properties of molecules and materials with a calibrated probabilistic predictive distribution.

BIG-bench Machine Learning Decision Making +2

The Role of Pretrained Representations for the OOD Generalization of Reinforcement Learning Agents

no code implementations ICLR 2022 Andrea Dittadi, Frederik Träuble, Manuel Wüthrich, Felix Widmaier, Peter Gehler, Ole Winther, Francesco Locatello, Olivier Bachem, Bernhard Schölkopf, Stefan Bauer

By training 240 representations and over 10,000 reinforcement learning (RL) policies on a simulated robotic setup, we evaluate to what extent different properties of pretrained VAE-based representations affect the OOD generalization of downstream agents.

Reinforcement Learning (RL) Representation Learning

Generalization and Robustness Implications in Object-Centric Learning

1 code implementation • 1 Jul 2021 • Andrea Dittadi, Samuele Papa, Michele De Vita, Bernhard Schölkopf, Ole Winther, Francesco Locatello

The idea behind object-centric representation learning is that natural scenes can better be modeled as compositions of objects and their relations as opposed to distributed representations.

Inductive Bias Object +3

Optimal Variance Control of the Score-Function Gradient Estimator for Importance-Weighted Bounds

1 code implementation NeurIPS 2020 Valentin Liévin, Andrea Dittadi, Anders Christensen, Ole Winther

Empirically, for the training of both continuous and discrete generative models, the proposed method yields superior variance reduction, resulting in an SNR for IWAE that increases with $K$ without relying on the reparameterization trick.

On the Transfer of Disentangled Representations in Realistic Settings

no code implementations ICLR 2021 Andrea Dittadi, Frederik Träuble, Francesco Locatello, Manuel Wüthrich, Vaibhav Agrawal, Ole Winther, Stefan Bauer, Bernhard Schölkopf

Learning meaningful representations that disentangle the underlying structure of the data generating process is considered to be of key importance in machine learning.

Disentanglement

Optimal Variance Control of the Score Function Gradient Estimator for Importance Weighted Bounds

1 code implementation • 5 Aug 2020 • Valentin Liévin, Andrea Dittadi, Anders Christensen, Ole Winther

This paper introduces novel results for the score function gradient estimator of the importance weighted variational bound (IWAE).

SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows

3 code implementations NeurIPS 2020 Didrik Nielsen, Priyank Jaini, Emiel Hoogeboom, Ole Winther, Max Welling

Normalizing flows and variational autoencoders are powerful generative models that can represent complicated density functions.

Closing the Dequantization Gap: PixelCNN as a Single-Layer Flow

1 code implementation NeurIPS 2020 Didrik Nielsen, Ole Winther

Flow models have recently made great progress at modeling ordinal discrete data such as images and audio.

Towards Hierarchical Discrete Variational Autoencoders

no code implementations • Approximate Inference (AABI) Symposium 2019 • Valentin Liévin, Andrea Dittadi, Lars Maaløe, Ole Winther

We introduce the Hierarchical Discrete Variational Autoencoder (HD-VAE): a hierarchy of variational memory layers.

LAVAE: Disentangling Location and Appearance

no code implementations • 25 Sep 2019 • Andrea Dittadi, Ole Winther

We propose a probabilistic generative model for unsupervised learning of structured, interpretable, object-based representations of visual scenes.

Object Variational Inference

BIVA: A Very Deep Hierarchy of Latent Variables for Generative Modeling

2 code implementations NeurIPS 2019 Lars Maaløe, Marco Fraccaro, Valentin Liévin, Ole Winther

In this paper we close the performance gap by constructing VAE models that can effectively utilize a deep hierarchy of stochastic variables and model complex covariance structures.

Ranked #18 on Image Generation on ImageNet 32x32 (bpd metric)

Anomaly Detection Attribute +1

Attend, Copy, Parse -- End-to-end information extraction from documents

2 code implementations • 18 Dec 2018 • Rasmus Berg Palm, Florian Laws, Ole Winther

We believe our proposed architecture can be used on many real-life information extraction tasks where word classification cannot be used due to a lack of the required word-level labels.

Classification General Classification

Recurrent Relational Networks for complex relational reasoning

1 code implementation ICLR 2018 Rasmus Berg Palm, Ulrich Paquet, Ole Winther

Humans possess an ability to abstractly reason about objects and their interactions, an ability not shared with state-of-the-art deep learning models.

Relational Reasoning

Feature Map Variational Auto-Encoders

no code implementations ICLR 2018 Lars Maaløe, Ole Winther

There have been multiple attempts with variational auto-encoders (VAE) to learn powerful global representations of complex data using a combination of latent stochastic variables and an autoregressive model over the dimensions of the data.

Image Generation

Recurrent Relational Networks

6 code implementations NeurIPS 2018 Rasmus Berg Palm, Ulrich Paquet, Ole Winther

We achieve state-of-the-art results on the bAbI textual question-answering dataset with the recurrent relational network, consistently solving 20/20 tasks.

Ranked #3 on Question Answering on bAbi (Mean Error Rate metric)

Question Answering Relational Reasoning

A Disentangled Recognition and Nonlinear Dynamics Model for Unsupervised Learning

1 code implementation NeurIPS 2017 Marco Fraccaro, Simon Kamronn, Ulrich Paquet, Ole Winther

This paper takes a step towards temporal reasoning in a dynamically changing video, not in the pixel space that constitutes its frames, but in a latent space that describes the non-linear dynamics of the objects in its world.

Imputation

Hash Embeddings for Efficient Word Representations

no code implementations NeurIPS 2017 Dan Svenstrup, Jonas Meinertz Hansen, Ole Winther

In hash embeddings, each token is represented by $k$ $d$-dimensional embedding vectors and one $k$-dimensional weight vector.
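The construction in the abstract above can be sketched in a few lines: $k$ hash functions select $k$ rows from a shared table of component vectors, and the token's own $k$-dimensional weight vector combines them. The sizes and hashing scheme below (CRC32 with salts) are illustrative assumptions, not the paper's configuration.

```python
import zlib
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: k component vectors per token, embedding dimension d,
# B rows in the shared component table, W rows of per-token weight vectors.
k, d, B, W = 2, 4, 100, 10

component_table = rng.normal(size=(B, d))  # shared pool of d-dimensional embedding vectors
weight_table = rng.normal(size=(W, k))     # k-dimensional weight vector per hashed token id

def hash_embedding(token):
    """e(token) = sum_i w_i * E[h_i(token)]: k hash functions pick k shared rows,
    which are mixed by the token's k-dimensional weight vector."""
    rows = [zlib.crc32(f"{i}:{token}".encode()) % B for i in range(k)]  # k hash functions
    weights = weight_table[zlib.crc32(token.encode()) % W]              # (k,)
    return weights @ component_table[rows]                              # (d,)

v = hash_embedding("winther")
print(v.shape)  # (4,)
```

The memory saving comes from sharing: only B component rows and W small weight rows are stored, rather than one d-dimensional vector per vocabulary entry.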

CloudScan - A configuration-free invoice analysis system using recurrent neural networks

1 code implementation • 24 Aug 2017 • Rasmus Berg Palm, Ole Winther, Florian Laws

We describe a recurrent neural network model that can capture long range context and compare it to a baseline logistic regression model corresponding to the current CloudScan production system.

End-to-End Information Extraction without Token-Level Supervision

1 code implementation WS 2017 Rasmus Berg Palm, Dirk Hovy, Florian Laws, Ole Winther

End-to-end (E2E) models, which take raw text as input and produce the desired output directly, need not depend on token-level labels.

Semi-Supervised Generation with Cluster-aware Generative Models

no code implementations • 3 Apr 2017 • Lars Maaløe, Marco Fraccaro, Ole Winther

Deep generative models trained with large amounts of unlabelled data have proven to be powerful within the domain of unsupervised learning.

Clustering General Classification

Self-Averaging Expectation Propagation

no code implementations • 23 Aug 2016 • Burak Çakmak, Manfred Opper, Bernard H. Fleury, Ole Winther

Our approach extends the framework of (generalized) approximate message passing -- which assumes zero-mean i.i.d. entries of the measurement matrix -- to a general class of random matrix ensembles.

Bayesian Inference

An Adaptive Resample-Move Algorithm for Estimating Normalizing Constants

no code implementations • 7 Apr 2016 • Marco Fraccaro, Ulrich Paquet, Ole Winther

The estimation of normalizing constants is a fundamental step in probabilistic model comparison.

Auxiliary Deep Generative Models

1 code implementation • 17 Feb 2016 • Lars Maaløe, Casper Kaae Sønderby, Søren Kaae Sønderby, Ole Winther

The auxiliary variables leave the generative model unchanged but make the variational distribution more expressive.

Recurrent Spatial Transformer Networks

2 code implementations • 17 Sep 2015 • Søren Kaae Sønderby, Casper Kaae Sønderby, Lars Maaløe, Ole Winther

We investigate different down-sampling factors (ratio of pixels in input and output) for the SPN and show that the RNN-SPN model is able to down-sample the input images without deteriorating performance.

Attribute

Bayesian inference for spatio-temporal spike-and-slab priors

no code implementations • 15 Sep 2015 • Michael Riis Andersen, Aki Vehtari, Ole Winther, Lars Kai Hansen

In this work, we address the problem of solving a series of underdetermined linear inverse problems subject to a sparsity constraint.

Bayesian Inference

Spatio-temporal Spike and Slab Priors for Multiple Measurement Vector Problems

no code implementations • 19 Aug 2015 • Michael Riis Andersen, Ole Winther, Lars Kai Hansen

We are interested in solving the multiple measurement vector (MMV) problem for instances where the underlying sparsity pattern exhibits spatio-temporal structure, motivated by the electroencephalogram (EEG) source localization problem.

EEG

Deep Belief Nets for Topic Modeling

no code implementations • 18 Jan 2015 • Lars Maaloe, Morten Arngren, Ole Winther

Applying traditional collaborative filtering to digital publishing is challenging because user data is very sparse due to the high volume of documents relative to the number of users.

Collaborative Filtering Retrieval

Protein Secondary Structure Prediction with Long Short Term Memory Networks

no code implementations • 25 Dec 2014 • Søren Kaae Sønderby, Ole Winther

Recurrent neural networks are a generalization of the feed-forward neural network that naturally handles sequential data.

Protein Secondary Structure Prediction

Bayesian leave-one-out cross-validation approximations for Gaussian latent variable models

no code implementations • 23 Dec 2014 • Aki Vehtari, Tommi Mononen, Ville Tolvanen, Tuomas Sivula, Ole Winther

The future predictive performance of a Bayesian model can be estimated using Bayesian cross-validation.

Bayesian Inference for Structured Spike and Slab Priors

no code implementations NeurIPS 2014 Michael R. Andersen, Ole Winther, Lars K. Hansen

Sparse signal recovery addresses the problem of solving underdetermined linear inverse problems subject to a sparsity constraint.

Bayesian Inference

Scalable Bayesian Modelling of Paired Symbols

no code implementations • 9 Sep 2014 • Ulrich Paquet, Noam Koenigstein, Ole Winther

We present a novel, scalable and Bayesian approach to modelling the occurrence of pairs of symbols (i, j) drawn from a large vocabulary.

Perturbative Corrections for Approximate Inference in Gaussian Latent Variable Models

no code implementations • 12 Jan 2013 • Manfred Opper, Ulrich Paquet, Ole Winther

A perturbative expansion is made of the exact but intractable correction, and can be applied to the model's partition function and other moments of interest.

Bayesian Sparse Factor Models and DAGs Inference and Comparison

no code implementations NeurIPS 2009 Ricardo Henao, Ole Winther

In this paper we present a novel approach to learn directed acyclic graphs (DAGs) and factor models within the same framework while also allowing for model comparison between them.


Improving on Expectation Propagation

no code implementations NeurIPS 2008 Manfred Opper, Ulrich Paquet, Ole Winther

We develop a series of corrections to Expectation Propagation (EP), which is one of the most popular methods for approximate probabilistic inference.
