Search Results for author: Robert A. Vandermeulen

Found 20 papers, 9 papers with code

Set Learning for Accurate and Calibrated Models

1 code implementation 5 Jul 2023 Lukas Muttenthaler, Robert A. Vandermeulen, Qiuyi Zhang, Thomas Unterthiner, Klaus-Robert Müller

Model overconfidence and poor calibration are common in machine learning and difficult to account for when applying standard empirical risk minimization.

Sample Complexity Using Infinite Multiview Models

no code implementations 8 Feb 2023 Robert A. Vandermeulen

Recent works have demonstrated that the convergence rate of a nonparametric density estimator can be greatly improved by using a low-rank estimator when the target density is a convex combination of separable probability densities with Lipschitz continuous marginals, i.e. a multiview model.
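A multiview model, as referenced in this abstract, is a convex combination of separable (product) densities. A minimal numerical sketch with hypothetical Gaussian marginals (all component choices below are illustrative, not from the paper):

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    # one-dimensional Gaussian marginal density
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def multiview_density(x, y, weights, params_x, params_y):
    """Convex combination of separable densities:
    f(x, y) = sum_k w_k * f_k(x) * g_k(y)."""
    total = 0.0
    for w, (mx, sx), (my, sy) in zip(weights, params_x, params_y):
        total += w * gaussian_pdf(x, mx, sx) * gaussian_pdf(y, my, sy)
    return total

# two components whose weights sum to one
weights = [0.3, 0.7]
params_x = [(-1.0, 0.5), (2.0, 1.0)]
params_y = [(0.0, 1.0), (1.0, 0.3)]
val = multiview_density(0.0, 0.0, weights, params_x, params_y)
```

Because each component factorizes across coordinates, the joint density is determined by one-dimensional marginals, which is what makes improved convergence rates possible.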

Human alignment of neural network representations

1 code implementation 2 Nov 2022 Lukas Muttenthaler, Jonas Dippel, Lorenz Linhardt, Robert A. Vandermeulen, Simon Kornblith

Linear transformations of neural network representations learned from behavioral responses from one dataset substantially improve alignment with human similarity judgments on the other two datasets.

Odd One Out

Generalized Identifiability Bounds for Mixture Models with Grouped Samples

no code implementations 22 Jul 2022 Robert A. Vandermeulen, René Saitenmacher

Recent work has shown that finite mixture models with $m$ components are identifiable, while making no assumptions on the mixture components, so long as one has access to groups of samples of size $2m-1$ which are known to come from the same mixture component.
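In symbols, the grouped-sample setting described above can be restated as follows (a paraphrase of the sentence, with notation chosen here for illustration):

```latex
% Mixture with m components and mixing weights w_i:
\mu = \sum_{i=1}^{m} w_i \,\mu_i, \qquad w_i \ge 0, \quad \sum_{i=1}^{m} w_i = 1.
% A "group" of size 2m-1: draw a component index I with P(I = i) = w_i,
% then draw the entire group i.i.d. from that single component:
(X_1, \dots, X_{2m-1}) \sim \mu_I^{\otimes (2m-1)}.
% Identifiability: the distribution of such groups determines
% the set of pairs \{(w_i, \mu_i)\}_{i=1}^{m}.
```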

Exposing Outlier Exposure: What Can Be Learned From Few, One, and Zero Outlier Images

1 code implementation 23 May 2022 Philipp Liznerski, Lukas Ruff, Robert A. Vandermeulen, Billy Joe Franks, Klaus-Robert Müller, Marius Kloft

We find that standard classifiers and semi-supervised one-class methods trained to discern between normal samples and relatively few random natural images are able to outperform the current state of the art on an established AD benchmark with ImageNet.

Ranked #1 on Anomaly Detection on One-class CIFAR-10 (using extra training data)

Anomaly Detection

VICE: Variational Interpretable Concept Embeddings

1 code implementation 2 May 2022 Lukas Muttenthaler, Charles Y. Zheng, Patrick McClure, Robert A. Vandermeulen, Martin N. Hebart, Francisco Pereira

This paper introduces Variational Interpretable Concept Embeddings (VICE), an approximate Bayesian method for embedding object concepts in a vector space using data collected from humans in a triplet odd-one-out task.

Experimental Design Object +3

Beyond Smoothness: Incorporating Low-Rank Analysis into Nonparametric Density Estimation

no code implementations NeurIPS 2021 Robert A. Vandermeulen, Antoine Ledent

In this paper we investigate the theoretical implications of incorporating a multi-view latent variable model, a type of low-rank model, into nonparametric density estimation.

Density Estimation

Learning Interpretable Concept Groups in CNNs

1 code implementation 21 Sep 2021 Saurabh Varshneya, Antoine Ledent, Robert A. Vandermeulen, Yunwen Lei, Matthias Enders, Damian Borth, Marius Kloft

We propose a novel training methodology -- Concept Group Learning (CGL) -- that encourages training of interpretable CNN filters by partitioning filters in each layer into concept groups, each of which is trained to learn a single visual concept.
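The partitioning idea can be illustrated with a generic group-sparsity penalty over a layer's filters (an illustrative stand-in for encouraging group structure, not the paper's exact CGL regularizer):

```python
import numpy as np

def group_sparsity_penalty(filters, num_groups):
    """L2,1-style penalty: partition a layer's filters into contiguous
    groups and sum the Euclidean norms of the groups. Minimizing this
    drives whole groups of filters toward zero, so each surviving group
    can specialize on a single visual concept."""
    groups = np.array_split(filters, num_groups, axis=0)
    return sum(np.linalg.norm(g) for g in groups)

rng = np.random.default_rng(0)
layer = rng.normal(size=(16, 3, 3, 3))  # 16 conv filters of shape 3x3x3
penalty = group_sparsity_penalty(layer, num_groups=4)
```

Adding such a term to the training loss penalizes groups as units rather than individual weights, which is the basic mechanism behind group-wise interpretability.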

Improving Nonparametric Density Estimation with Tensor Decompositions

no code implementations 6 Oct 2020 Robert A. Vandermeulen

One technique for avoiding this is to assume no dependence between features and that the data are sampled from a separable density.

Density Estimation

Deep Anomaly Detection by Residual Adaptation

no code implementations 5 Oct 2020 Lucas Deecke, Lukas Ruff, Robert A. Vandermeulen, Hakan Bilen

Deep anomaly detection is a difficult task since, in high dimensions, it is hard to completely characterize a notion of "differentness" when given only examples of normality.

Anomaly Detection Disentanglement

A Unifying Review of Deep and Shallow Anomaly Detection

no code implementations 24 Sep 2020 Lukas Ruff, Jacob R. Kauffmann, Robert A. Vandermeulen, Grégoire Montavon, Wojciech Samek, Marius Kloft, Thomas G. Dietterich, Klaus-Robert Müller

Deep learning approaches to anomaly detection have recently improved the state of the art in detection performance on complex datasets such as large collections of images or text.

One-Class Classification

Input Hessian Regularization of Neural Networks

no code implementations 14 Sep 2020 Waleed Mustafa, Robert A. Vandermeulen, Marius Kloft

Regularizing the input gradient has been shown to be effective in promoting the robustness of neural networks.

Adversarial Attack

Explainable Deep One-Class Classification

2 code implementations ICLR 2021 Philipp Liznerski, Lukas Ruff, Robert A. Vandermeulen, Billy Joe Franks, Marius Kloft, Klaus-Robert Müller

Deep one-class classification variants for anomaly detection learn a mapping that concentrates nominal samples in feature space causing anomalies to be mapped away.

Ranked #5 on Anomaly Detection on One-class ImageNet-30 (using extra training data)

Classification General Classification +2
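The "concentrating" mapping described in this abstract can be illustrated with a Deep-SVDD-style one-class objective on pre-computed features (a related loss used here for illustration, not this paper's exact explainable objective; all data and shapes below are synthetic):

```python
import numpy as np

def one_class_loss(features, center):
    """Mean squared distance of mapped samples to a fixed center.
    Minimizing this concentrates nominal samples around the center."""
    return np.mean(np.sum((features - center) ** 2, axis=1))

def anomaly_score(feature, center):
    # at test time, distance to the center serves as the anomaly score
    return np.sum((feature - center) ** 2)

rng = np.random.default_rng(1)
center = np.zeros(8)
nominal = rng.normal(scale=0.1, size=(32, 8))  # tight cluster near center
loss = one_class_loss(nominal, center)
score_near = anomaly_score(nominal[0], center)
score_far = anomaly_score(np.full(8, 3.0), center)  # point mapped far away
```

Anomalies, never pulled toward the center during training, end up with larger scores than nominal samples.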

Consistent Estimation of Identifiable Nonparametric Mixture Models from Grouped Observations

1 code implementation NeurIPS 2020 Alexander Ritchie, Robert A. Vandermeulen, Clayton Scott

Recent research has established sufficient conditions for finite mixture models to be identifiable from grouped observations.

Rethinking Assumptions in Deep Anomaly Detection

1 code implementation 30 May 2020 Lukas Ruff, Robert A. Vandermeulen, Billy Joe Franks, Klaus-Robert Müller, Marius Kloft

Though anomaly detection (AD) can be viewed as a classification problem (nominal vs. anomalous), it is usually treated in an unsupervised manner, since one typically does not have access to, or it is infeasible to utilize, a dataset that sufficiently characterizes what it means to be "anomalous."

Anomaly Detection

Machine Learning in Thermodynamics: Prediction of Activity Coefficients by Matrix Completion

no code implementations 29 Jan 2020 Fabian Jirasek, Rodrigo A. S. Alves, Julie Damay, Robert A. Vandermeulen, Robert Bamler, Michael Bortz, Stephan Mandt, Marius Kloft, Hans Hasse

Activity coefficients, which are a measure of the non-ideality of liquid mixtures, are a key property in chemical engineering with relevance to modeling chemical and phase equilibria as well as transport processes.

BIG-bench Machine Learning Matrix Completion
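Matrix completion in this setting means predicting missing entries of a property matrix (e.g. solute × solvent) from a low-rank model. A generic gradient-descent sketch on synthetic rank-1 data (hyperparameters, shapes, and data are illustrative, not the paper's setup):

```python
import numpy as np

def complete_matrix(M, mask, rank=1, lr=0.01, steps=5000, seed=0):
    """Fit M ≈ U @ V.T using only the observed entries (mask == 1),
    then use U @ V.T to predict the unobserved ones."""
    rng = np.random.default_rng(seed)
    n, m = M.shape
    U = 0.1 * rng.normal(size=(n, rank))
    V = 0.1 * rng.normal(size=(m, rank))
    for _ in range(steps):
        R = mask * (U @ V.T - M)  # residual on observed entries only
        U, V = U - lr * (R @ V), V - lr * (R.T @ U)
    return U @ V.T

# synthetic rank-1 "property matrix" with one missing entry
a, b = np.arange(1, 5.0), np.arange(1, 4.0)
M = np.outer(a, b)
mask = np.ones_like(M)
mask[0, 0] = 0  # entry (0, 0) is unobserved; its true value is 1.0
M_hat = complete_matrix(M * mask, mask)
```

The observed row and column constrain the factors, so the low-rank structure fills in the hidden entry.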

An Operator Theoretic Approach to Nonparametric Mixture Models

no code implementations 30 Jun 2016 Robert A. Vandermeulen, Clayton D. Scott

In this work, we make no distributional assumptions on the mixture components and instead assume that observations from the mixture model are grouped, such that observations in the same group are known to be drawn from the same mixture component.

On The Identifiability of Mixture Models from Grouped Samples

no code implementations 23 Feb 2015 Robert A. Vandermeulen, Clayton D. Scott

In such models it is assumed that data are drawn from random probability measures, called mixture components, which are themselves drawn from a probability measure P over probability measures.

Robust Kernel Density Estimation by Scaling and Projection in Hilbert Space

no code implementations NeurIPS 2014 Robert A. Vandermeulen, Clayton D. Scott

As with other estimators, a robust version of the KDE is useful since sample contamination is a common issue with datasets.

Density Estimation
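For context on what is being robustified: a plain Gaussian KDE weights every sample equally, so contaminated samples pull density mass toward the outliers. A minimal sketch of that baseline estimator (not the paper's scaled-and-projected robust estimator):

```python
import numpy as np

def kde(x, samples, bandwidth=0.5):
    """Standard Gaussian kernel density estimate. Each sample contributes
    equally, which is why outlier contamination distorts the estimate."""
    z = (x - samples[:, None]) / bandwidth
    k = np.exp(-0.5 * z ** 2) / np.sqrt(2 * np.pi)
    return k.mean(axis=0) / bandwidth

rng = np.random.default_rng(2)
clean = rng.normal(size=200)                              # N(0, 1) data
contaminated = np.concatenate([clean, np.full(20, 10.0)])  # ~10% outliers at x = 10
dens_clean = kde(np.array([10.0]), clean)
dens_contam = kde(np.array([10.0]), contaminated)
```

The contaminated estimate places spurious mass near x = 10, which a robust KDE is designed to suppress.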
