Search Results for author: Oswin Krause

Found 17 papers, 9 papers with code

Reducing Annotation Need in Self-Explanatory Models for Lung Nodule Diagnosis

2 code implementations 27 Jun 2022 Jiahao Lu, Chong Yin, Oswin Krause, Kenny Erleben, Michael Bachmann Nielsen, Sune Darkner

Visualisation of the learned space further indicates that the correlation between the clustering of malignancy and nodule attributes coincides with clinical knowledge.

Clinical Knowledge · Contrastive Learning

Learning Coulomb Diamonds in Large Quantum Dot Arrays

1 code implementation 3 May 2022 Oswin Krause, Anasua Chatterjee, Ferdinand Kuemmeth, Evert van Nieuwenburg

We introduce an algorithm that is able to find the facets of Coulomb diamonds in quantum dot arrays.

Estimation of Convex Polytopes for Automatic Discovery of Charge State Transitions in Quantum Dot Arrays

no code implementations 20 Aug 2021 Oswin Krause, Torbjørn Rasmussen, Bertram Brovang, Anasua Chatterjee, Ferdinand Kuemmeth

In spin-based quantum dot arrays, material or fabrication imprecisions affect the behaviour of the device, which must be taken into account when controlling it.

Active Learning

Spot the Difference: Detection of Topological Changes via Geometric Alignment

1 code implementation NeurIPS 2021 Steffen Czolbe, Aasa Feragen, Oswin Krause

As a first step towards solving such alignment problems, we propose an unsupervised algorithm for the detection of changes in image topology.

Domain Adaptation · Optical Flow Estimation +1

Is segmentation uncertainty useful?

1 code implementation 30 Mar 2021 Steffen Czolbe, Kasra Arnavaz, Oswin Krause, Aasa Feragen

Probabilistic image segmentation encodes varying prediction confidence and inherent ambiguity in the segmentation problem.

Active Learning · Image Segmentation +2
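
One generic way to turn samples from a probabilistic segmentation model into a per-pixel uncertainty map, in the spirit of the question posed here, is the entropy of the sampled label frequencies; a numpy sketch (the sampling function is hypothetical, and this is not necessarily the protocol used in the paper):

```python
import numpy as np

def pixelwise_entropy(samples, n_classes, eps=1e-12):
    """samples: (S, H, W) integer label maps drawn from a probabilistic
    segmentation model, e.g. S stochastic forward passes."""
    # per-pixel class frequencies across the S samples
    freq = np.stack([(samples == c).mean(axis=0) for c in range(n_classes)])  # (C, H, W)
    # entropy in nats; high values mark pixels the model is uncertain about
    return -(freq * np.log(freq + eps)).sum(axis=0)

# usage (hypothetical sample_segmentation() returning one (H, W) label map per call):
# ent = pixelwise_entropy(np.stack([sample_segmentation() for _ in range(16)]), n_classes=2)
```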

Multimodal Variational Autoencoders for Semi-Supervised Learning: In Defense of Product-of-Experts

1 code implementation 18 Jan 2021 Svetlana Kutuzova, Oswin Krause, Douglas McCloskey, Mads Nielsen, Christian Igel

Multimodal generative models should be able to learn a meaningful latent representation that enables a coherent joint generation of all modalities (e.g., images and text).
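
The product-of-experts aggregation defended in the title has a convenient closed form when each modality's encoder outputs a diagonal Gaussian: precisions add and the combined mean is a precision-weighted average. A minimal numpy sketch of that combination rule (including a standard N(0, I) prior expert, which is a common but not universal choice):

```python
import numpy as np

def poe_gaussian(mus, logvars):
    """Combine per-modality Gaussian posteriors q_i(z|x_i) = N(mu_i, diag(var_i))
    into a single product-of-experts posterior. mus, logvars: (M, D) arrays."""
    # prepend an N(0, I) prior expert (mu = 0, logvar = 0)
    mus = np.vstack([np.zeros_like(mus[:1]), mus])
    logvars = np.vstack([np.zeros_like(logvars[:1]), logvars])
    precisions = np.exp(-logvars)               # 1 / var_i
    var = 1.0 / precisions.sum(axis=0)          # combined variance: precisions add
    mu = var * (precisions * mus).sum(axis=0)   # precision-weighted mean
    return mu, var
```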

A Loss Function for Generative Neural Networks Based on Watson’s Perceptual Model

1 code implementation NeurIPS 2020 Steffen Czolbe, Oswin Krause, Ingemar Cox, Christian Igel

To train Variational Autoencoders (VAEs) to generate realistic imagery requires a loss function that reflects human perception of image similarity.

Translation
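
The general structure of such an objective is the VAE evidence lower bound with the reconstruction term measured by a perceptual distance rather than a pixel-wise error; a PyTorch sketch with a placeholder distance function (not Watson's model itself):

```python
import torch
import torch.nn.functional as F

def vae_loss(x, x_hat, mu, logvar, perceptual_distance=None, beta=1.0):
    """ELBO-style VAE loss in which the reconstruction term can be swapped
    from pixel-wise MSE to a user-supplied perceptual distance."""
    if perceptual_distance is None:
        recon = F.mse_loss(x_hat, x, reduction="mean")   # plain pixel loss
    else:
        recon = perceptual_distance(x_hat, x)            # perceptual loss
    # KL divergence between the diagonal Gaussian posterior and N(0, I)
    kl = 0.5 * (mu.pow(2) + logvar.exp() - logvar - 1).sum(dim=-1).mean()
    return recon + beta * kl
```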

Convergence Analysis of the Hessian Estimation Evolution Strategy

no code implementations 6 Sep 2020 Tobias Glasmachers, Oswin Krause

The class of algorithms called Hessian Estimation Evolution Strategies (HE-ESs) update the covariance matrix of their sampling distribution by directly estimating the curvature of the objective function.
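
The curvature estimation mentioned here can be illustrated with a symmetric finite difference along a sampled direction, which recovers d^T H d exactly for quadratics; this is a sketch of the underlying principle only, not the HE-ES covariance update itself:

```python
import numpy as np

def directional_curvature(f, x, d):
    """Estimate the normalized curvature d^T H d / ||d||^2 of f at x along
    direction d from three function evaluations (central second difference)."""
    return (f(x + d) - 2.0 * f(x) + f(x - d)) / np.dot(d, d)

# usage on a quadratic, where the estimate is exact up to floating point:
# f = lambda x: 0.5 * x @ np.diag([1.0, 9.0]) @ x
# directional_curvature(f, np.zeros(2), np.array([0.0, 0.1]))  # -> ~9.0
```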

A Loss Function for Generative Neural Networks Based on Watson's Perceptual Model

1 code implementation 26 Jun 2020 Steffen Czolbe, Oswin Krause, Ingemar Cox, Christian Igel

To train Variational Autoencoders (VAEs) to generate realistic imagery requires a loss function that reflects human perception of image similarity.

Translation

The Hessian Estimation Evolution Strategy

no code implementations 30 Mar 2020 Tobias Glasmachers, Oswin Krause

We demonstrate that our approach to covariance matrix adaptation is efficient by evaluating it on the BBOB/COCO testbed.

Unsupervised-Learning of time-varying features

no code implementations 25 Sep 2019 Henrik Høeg, Matthias Brix, Oswin Krause

We present an architecture based on the conditional Variational Autoencoder to learn a representation of transformations in time-sequence data.

Car Racing
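
A conditional VAE along these lines conditions both the encoder and the decoder on context, for example the previous element of the sequence; a minimal PyTorch sketch with hypothetical layer sizes (not the architecture from the paper):

```python
import torch
import torch.nn as nn

class CondVAE(nn.Module):
    """Minimal conditional VAE: encoder and decoder both see the previous frame."""
    def __init__(self, dim=64, latent=8):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(2 * dim, 128), nn.ReLU(),
                                 nn.Linear(128, 2 * latent))
        self.dec = nn.Sequential(nn.Linear(latent + dim, 128), nn.ReLU(),
                                 nn.Linear(128, dim))

    def forward(self, x_prev, x_next):
        # encode the pair (previous frame, next frame) into mu and logvar
        mu, logvar = self.enc(torch.cat([x_prev, x_next], dim=-1)).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()   # reparameterization
        # decode conditioned on the previous frame
        recon = self.dec(torch.cat([z, x_prev], dim=-1))
        kl = 0.5 * (mu.pow(2) + logvar.exp() - logvar - 1).sum(-1).mean()
        return recon, kl
```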

Convolutional neural networks for segmentation and object detection of human semen

no code implementations 3 Apr 2017 Malte Stær Nissen, Oswin Krause, Kristian Almstrup, Søren Kjærulff, Torben Trindkær Nielsen, Mads Nielsen

We compare a set of convolutional neural network (CNN) architectures for the task of segmenting and detecting human sperm cells in an image taken from a semen sample.

Object Detection

CMA-ES with Optimal Covariance Update and Storage Complexity

no code implementations NeurIPS 2016 Oswin Krause, Dídac Rodríguez Arbonès, Christian Igel

The covariance matrix adaptation evolution strategy (CMA-ES) is arguably one of the most powerful real-valued derivative-free optimization algorithms, finding many applications in machine learning.
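
For reference, the covariance update whose cost the paper addresses is, in textbook CMA-ES, a low-rank modification of C built from the evolution path and the weighted selected steps; a numpy sketch with placeholder learning rates (the constants and the efficient update scheme of the paper are not reproduced here):

```python
import numpy as np

def cma_covariance_update(C, path, steps, weights, c1=0.1, cmu=0.2):
    """Standard CMA-ES covariance update:
       C <- (1 - c1 - cmu) C + c1 * p p^T + cmu * sum_i w_i y_i y_i^T
    where p is the evolution path and y_i are the selected, mean-relative steps."""
    rank_one = np.outer(path, path)
    rank_mu = sum(w * np.outer(y, y) for w, y in zip(weights, steps))
    return (1.0 - c1 - cmu) * C + c1 * rank_one + cmu * rank_mu
```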

Population-Contrastive-Divergence: Does Consistency help with RBM training?

no code implementations 6 Oct 2015 Oswin Krause, Asja Fischer, Christian Igel

Compared to CD, it leads to a consistent estimate and may have a significantly lower bias.
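
For context, plain contrastive divergence (CD-1) estimates the RBM log-likelihood gradient from a single Gibbs step started at the data; a numpy sketch of that baseline (Population-CD changes how the negative phase is obtained, which this sketch does not show):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def cd1_gradient(v0, W, b, c, rng=np.random.default_rng()):
    """One CD-1 gradient estimate for a Bernoulli RBM with weights W (V x H),
    visible bias b, hidden bias c. v0: a batch of binary data, shape (N, V)."""
    ph0 = sigmoid(v0 @ W + c)                         # positive-phase hidden probs
    h0 = (rng.random(ph0.shape) < ph0).astype(float)  # sample hidden units
    pv1 = sigmoid(h0 @ W.T + b)                       # one Gibbs step back to visibles
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + c)                         # negative-phase hidden probs
    dW = (v0.T @ ph0 - v1.T @ ph1) / len(v0)          # positive minus negative phase
    db = (v0 - v1).mean(axis=0)
    dc = (ph0 - ph1).mean(axis=0)
    return dW, db, dc
```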
