Search Results for author: Klaus-Robert Müller

Found 139 papers, 57 papers with code

Playing Pinball with non-invasive BCI

no code implementations NeurIPS 2008 Matthias Krauledat, Konrad Grzeska, Max Sagebaum, Benjamin Blankertz, Carmen Vidaurre, Klaus-Robert Müller, Michael Schröder

Compared to invasive Brain-Computer Interfaces (BCI), non-invasive BCI systems based on Electroencephalogram (EEG) signals have not been applied successfully to complex control tasks.

EEG

Subject independent EEG-based BCI decoding

no code implementations NeurIPS 2009 Siamac Fazli, Cristian Grozea, Marton Danoczy, Benjamin Blankertz, Florin Popescu, Klaus-Robert Müller

In the quest to make Brain Computer Interfacing (BCI) more usable, dry electrodes have emerged that get rid of the initial 30 minutes required for placing an electrode cap.

EEG

Efficient and Accurate Lp-Norm Multiple Kernel Learning

no code implementations NeurIPS 2009 Marius Kloft, Ulf Brefeld, Pavel Laskov, Klaus-Robert Müller, Alexander Zien, Sören Sonnenburg

Previous approaches to multiple kernel learning (MKL) promote sparse kernel combinations and hence support interpretability.
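As a rough illustration of the kernel-combination idea behind MKL (not the paper's Lp-norm algorithm; the helper `combine_kernels`, the toy kernels, and the weights are invented for this sketch):

```python
import numpy as np

def combine_kernels(kernels, weights):
    """Weighted sum of base kernel matrices, the core object in multiple kernel learning."""
    weights = np.asarray(weights, dtype=float)
    assert np.all(weights >= 0), "kernel weights must be nonnegative"
    weights = weights / weights.sum()          # normalize to a convex combination
    return sum(w * K for w, K in zip(weights, kernels))

# Two toy base kernels on 3 points: a linear kernel and an RBF kernel.
X = np.array([[0.0], [1.0], [2.0]])
K_lin = X @ X.T
sq = np.sum(X**2, axis=1)
K_rbf = np.exp(-(sq[:, None] + sq[None, :] - 2 * X @ X.T))

K = combine_kernels([K_lin, K_rbf], [0.3, 0.7])  # remains symmetric and PSD
```

A convex combination of positive semidefinite kernels is again a valid kernel, which is why the learned weights can be plugged directly into any kernel machine.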

Layer-wise analysis of deep networks with Gaussian kernels

no code implementations NeurIPS 2010 Grégoire Montavon, Klaus-Robert Müller, Mikio L. Braun

Deep networks can potentially express a learning problem more efficiently than local learning machines.

Regression for sets of polynomial equations

no code implementations20 Oct 2011 Franz Johannes Király, Paul von Bünau, Jan Saputra Müller, Duncan Blythe, Frank Meinecke, Klaus-Robert Müller

We propose a method called ideal regression for approximating an arbitrary system of polynomial equations by a system of a particular type.

regression

Transferring Subspaces Between Subjects in Brain-Computer Interfacing

no code implementations18 Sep 2012 Wojciech Samek, Frank C. Meinecke, Klaus-Robert Müller

Compensating for changes between a subject's training and testing sessions in Brain-Computer Interfacing (BCI) is challenging but of great importance for robust BCI operation.

EEG Motor Imagery

Orbital-free Bond Breaking via Machine Learning

no code implementations7 Jun 2013 John C. Snyder, Matthias Rupp, Katja Hansen, Leo Blooston, Klaus-Robert Müller, Kieron Burke

Machine learning is used to approximate the kinetic energy of one-dimensional diatomics as a functional of the electron density.

BIG-bench Machine Learning

Multiple Kernel Learning for Brain-Computer Interfacing

no code implementations22 Oct 2013 Wojciech Samek, Alexander Binder, Klaus-Robert Müller

Combining information from different sources is a common way to improve classification accuracy in Brain-Computer Interfacing (BCI).

General Classification

Robust Spatial Filtering with Beta Divergence

no code implementations NeurIPS 2013 Wojciech Samek, Duncan Blythe, Klaus-Robert Müller, Motoaki Kawanabe

The efficiency of Brain-Computer Interfaces (BCI) largely depends upon a reliable extraction of informative features from the high-dimensional EEG signal.

EEG Motor Imagery

Generalizing Analytic Shrinkage for Arbitrary Covariance Structures

no code implementations NeurIPS 2013 Daniel Bartz, Klaus-Robert Müller

Analytic shrinkage is a statistical technique that offers a fast alternative to cross-validation for the regularization of covariance matrices and has appealing consistency properties.

Optical Character Recognition (OCR)
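A minimal sketch of the shrinkage idea the abstract refers to: the sample covariance is blended with a scaled-identity target. The helper `shrink_covariance` and the fixed intensity are invented for illustration; analytic shrinkage would estimate the intensity from the data in closed form.

```python
import numpy as np

def shrink_covariance(X, alpha):
    """Shrink the sample covariance toward a scaled-identity target.

    alpha in [0, 1] is the shrinkage intensity; analytic shrinkage estimates
    it from the data, here it is simply passed in by hand.
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    S = Xc.T @ Xc / n                      # sample covariance
    target = np.trace(S) / p * np.eye(p)   # scaled identity with matched trace
    return (1 - alpha) * S + alpha * target

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 50))              # p > n: sample covariance is singular
C = shrink_covariance(X, alpha=0.5)        # shrinkage makes it invertible
```

With fewer samples than dimensions the raw sample covariance is singular; any positive shrinkage intensity restores invertibility, which is what makes the technique attractive as a fast alternative to cross-validated regularization.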

Understanding Machine-learned Density Functionals

no code implementations4 Apr 2014 Li Li, John C. Snyder, Isabelle M. Pelaschier, Jessica Huang, Uma-Naresh Niranjan, Paul Duncan, Matthias Rupp, Klaus-Robert Müller, Kieron Burke

Kernel ridge regression is used to approximate the kinetic energy of non-interacting fermions in a one-dimensional box as a functional of their density.

regression Total Energy
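A hedged sketch of kernel ridge regression, the method named in the abstract, mapping density-like vectors to a scalar energy. The "densities" and the target functional below are toy stand-ins, not the paper's particle-in-a-box data:

```python
import numpy as np

def rbf_kernel(A, B, gamma):
    """Gaussian (RBF) kernel matrix between row-vector datasets A and B."""
    d = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d)

def krr_fit(K, y, lam):
    """Kernel ridge regression: solve (K + lam*I) alpha = y for the dual weights."""
    return np.linalg.solve(K + lam * np.eye(len(y)), y)

# Toy stand-in: "densities" are random vectors, the target is a smooth functional.
rng = np.random.default_rng(1)
densities = rng.uniform(size=(40, 10))
energies = np.sum(densities**2, axis=1)            # made-up energy functional

K = rbf_kernel(densities, densities, gamma=0.5)
alpha = krr_fit(K, energies, lam=1e-6)
pred = rbf_kernel(densities[:5], densities, 0.5) @ alpha   # predictions on training points
```

With a small ridge parameter the model essentially interpolates the training energies; the interesting question in the paper is how well such a surrogate generalizes to unseen densities.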

Learning with Algebraic Invariances, and the Invariant Kernel Trick

no code implementations28 Nov 2014 Franz J. Király, Andreas Ziehe, Klaus-Robert Müller

When solving data analysis problems it is important to integrate prior knowledge and/or structural invariances.

Clustering

Multi-Target Shrinkage

no code implementations5 Dec 2014 Daniel Bartz, Johannes Höhne, Klaus-Robert Müller

For the sample mean and the sample covariance as specific instances, we derive conditions under which the optimality of MTS is applicable.

Wasserstein Training of Boltzmann Machines

no code implementations7 Jul 2015 Grégoire Montavon, Klaus-Robert Müller, Marco Cuturi

The Boltzmann machine provides a useful framework to learn highly complex, multimodal and multiscale data distributions that occur in the real world.

Denoising

Evaluating the visualization of what a Deep Neural Network has learned

1 code implementation21 Sep 2015 Wojciech Samek, Alexander Binder, Grégoire Montavon, Sebastian Bach, Klaus-Robert Müller

Our main result is that the recently proposed Layer-wise Relevance Propagation (LRP) algorithm qualitatively and quantitatively provides a better explanation of what made a DNN arrive at a particular classification decision than the sensitivity-based approach or the deconvolution method.

Classification General Classification +3

Validity of time reversal for testing Granger causality

no code implementations25 Sep 2015 Irene Winkler, Danny Panknin, Daniel Bartz, Klaus-Robert Müller, Stefan Haufe

Inferring causal interactions from observed data is a challenging problem, especially in the presence of measurement noise.

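The time-reversal idea behind this paper can be illustrated with a crude regression-based Granger score: a genuine causal influence of x on y should show up in the original series but vanish when both series are reversed in time. The helper `granger_score` and the simulated data are invented for this sketch, not the paper's estimator:

```python
import numpy as np

def granger_score(x, y, lag=1):
    """Crude Granger statistic: log ratio of residual variances when predicting y
    with vs. without the past of x (single lag, ordinary least squares)."""
    Y = y[lag:]
    Z_r = np.column_stack([y[:-lag], np.ones(len(Y))])            # restricted: y's past only
    Z_f = np.column_stack([y[:-lag], x[:-lag], np.ones(len(Y))])  # full: add x's past
    res = lambda Z: Y - Z @ np.linalg.lstsq(Z, Y, rcond=None)[0]
    return np.log(np.var(res(Z_r)) / np.var(res(Z_f)))

rng = np.random.default_rng(3)
x = rng.normal(size=2000)
y = np.zeros(2000)
for t in range(1, 2000):                 # x drives y with a one-step delay
    y[t] = 0.9 * x[t - 1] + 0.1 * rng.normal()

forward = granger_score(x, y)                    # large: x Granger-causes y
reversed_ = granger_score(x[::-1], y[::-1])      # near zero under time reversal
```

The contrast between the forward and time-reversed scores is the kind of signature the time-reversal test exploits to guard against spurious causality from measurement noise.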

Explaining NonLinear Classification Decisions with Deep Taylor Decomposition

4 code implementations8 Dec 2015 Grégoire Montavon, Sebastian Bach, Alexander Binder, Wojciech Samek, Klaus-Robert Müller

Although our focus is on image classification, the method is applicable to a broad set of input data, learning tasks and network architectures.

Action Recognition Classification +3

Controlling Explanatory Heatmap Resolution and Semantics via Decomposition Depth

no code implementations21 Mar 2016 Sebastian Bach, Alexander Binder, Klaus-Robert Müller, Wojciech Samek

We present an application of the Layer-wise Relevance Propagation (LRP) algorithm to state-of-the-art deep convolutional neural networks and Fisher Vector classifiers to compare the image perception and prediction strategies of both classifiers using visualized heatmaps.

Layer-wise Relevance Propagation for Neural Networks with Local Renormalization Layers

no code implementations4 Apr 2016 Alexander Binder, Grégoire Montavon, Sebastian Bach, Klaus-Robert Müller, Wojciech Samek

Layer-wise relevance propagation is a framework that decomposes the prediction of a deep neural network computed over a sample, e.g. an image, into relevance scores for the sample's individual input dimensions, such as the subpixels of an image.
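The decomposition described above can be sketched with the epsilon rule of LRP on a tiny two-layer ReLU network; the weights and input below are invented, and the helper `lrp_epsilon` is a minimal illustration rather than a full LRP implementation:

```python
import numpy as np

def lrp_epsilon(weights, activations, relevance, eps=1e-6):
    """Backpropagate relevance through linear/ReLU layers with the epsilon rule.

    weights[i] maps activations[i] to the pre-activation of layer i+1;
    relevance starts at the output and is redistributed to the input.
    """
    for W, a in zip(reversed(weights), reversed(activations[:-1])):
        z = W @ a + eps * np.sign(W @ a)   # stabilized pre-activations
        s = relevance / z                  # element-wise relevance / z
        relevance = a * (W.T @ s)          # redistribute to the lower layer
    return relevance

# Tiny two-layer ReLU net with made-up weights.
W1 = np.array([[1.0, -1.0], [0.5, 2.0]])
W2 = np.array([[1.0, 1.0]])
x = np.array([1.0, 2.0])
h = np.maximum(0, W1 @ x)          # hidden activations
y = W2 @ h                         # network output
R_in = lrp_epsilon([W1, W2], [x, h, y], y.copy())
```

Up to the epsilon stabilizer, the input relevances sum to the network output, which is the conservation property that makes the scores interpretable as a decomposition of the prediction.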

Explaining Predictions of Non-Linear Classifiers in NLP

1 code implementation WS 2016 Leila Arras, Franziska Horn, Grégoire Montavon, Klaus-Robert Müller, Wojciech Samek

Layer-wise relevance propagation (LRP) is a recently proposed technique for explaining predictions of complex non-linear classifiers in terms of input variables.

General Classification Image Classification

Identifying individual facial expressions by deconstructing a neural network

no code implementations23 Jun 2016 Farhad Arbabzadah, Grégoire Montavon, Klaus-Robert Müller, Wojciech Samek

We further observe that the explanation method provides important insights into the nature of features of the base model, which allow one to assess the aptitude of the base model for a given transfer learning task.

Attribute Gender Prediction +1

Object Boundary Detection and Classification with Image-level Labels

no code implementations29 Jun 2016 Jing Yu Koh, Wojciech Samek, Klaus-Robert Müller, Alexander Binder

We propose a novel strategy for solving this task when pixel-level annotations are not available, performing it in an almost zero-shot manner by relying on conventional whole-image neural net classifiers that were trained using large bounding boxes.

Boundary Detection Classification +3

Language Detection For Short Text Messages In Social Media

no code implementations30 Aug 2016 Ivana Balazevic, Mikio Braun, Klaus-Robert Müller

These approaches include the use of well-known classifiers such as SVM and logistic regression, a dictionary-based approach, and a probabilistic model based on modified Kneser-Ney smoothing.

By-passing the Kohn-Sham equations with machine learning

2 code implementations9 Sep 2016 Felix Brockherde, Leslie Vogt, Li Li, Mark E. Tuckerman, Kieron Burke, Klaus-Robert Müller

Last year, at least 30,000 scientific papers used the Kohn-Sham scheme of density functional theory to solve electronic structure problems in a wide variety of scientific fields, ranging from materials science to biochemistry to astrophysics.

BIG-bench Machine Learning

Feature Importance Measure for Non-linear Learning Algorithms

1 code implementation22 Nov 2016 Marina M. -C. Vidovic, Nico Görnitz, Klaus-Robert Müller, Marius Kloft

MFI is general and can be applied to any arbitrary learning machine (including kernel machines and deep learning).

Feature Importance

Interpreting the Predictions of Complex ML Models by Layer-wise Relevance Propagation

no code implementations24 Nov 2016 Wojciech Samek, Grégoire Montavon, Alexander Binder, Sebastian Lapuschkin, Klaus-Robert Müller

Complex nonlinear models such as deep neural networks (DNNs) have become an important tool for image classification, speech recognition, natural language processing, and many other fields of application.

General Classification Image Classification +2

Wasserstein Training of Restricted Boltzmann Machines

no code implementations NeurIPS 2016 Grégoire Montavon, Klaus-Robert Müller, Marco Cuturi

This metric between observations can then be used to define the Wasserstein distance between the distribution induced by the Boltzmann machine on the one hand, and that given by the training sample on the other hand.

Denoising

Predicting Pairwise Relations with Neural Similarity Encoders

1 code implementation6 Feb 2017 Franziska Horn, Klaus-Robert Müller

Matrix factorization is at the heart of many machine learning algorithms, for example, dimensionality reduction (e.g. kernel PCA) or recommender systems relying on collaborative filtering.

Collaborative Filtering Dimensionality Reduction +1
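The matrix-factorization core the abstract alludes to can be sketched via a truncated SVD, which gives the best rank-k factorization in the least-squares sense; the helper `low_rank_factorize` and the toy matrix are invented for this sketch (the paper's similarity encoders learn the factors with a neural network instead):

```python
import numpy as np

def low_rank_factorize(M, k):
    """Best rank-k factorization M ~ A @ B in the least-squares sense (via SVD)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    A = U[:, :k] * s[:k]       # row embeddings (singular values folded in)
    B = Vt[:k]                 # column embeddings
    return A, B

# A toy matrix of exact rank 3, so the factorization reconstructs it exactly.
rng = np.random.default_rng(2)
M = rng.normal(size=(8, 3)) @ rng.normal(size=(3, 6))
A, B = low_rank_factorize(M, k=3)
```

In a recommender setting, the rows of A and columns of B play the role of user and item embeddings whose inner products approximate the observed entries.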

Explaining Recurrent Neural Network Predictions in Sentiment Analysis

1 code implementation WS 2017 Leila Arras, Grégoire Montavon, Klaus-Robert Müller, Wojciech Samek

Recently, a technique called Layer-wise Relevance Propagation (LRP) was shown to deliver insightful explanations in the form of input space relevances for understanding feed-forward neural network classification decisions.

General Classification Interpretable Machine Learning +1

Methods for Interpreting and Understanding Deep Neural Networks

no code implementations24 Jun 2017 Grégoire Montavon, Wojciech Samek, Klaus-Robert Müller

This paper provides an entry point to the problem of interpreting a deep neural network model and explaining its predictions.

SchNet: A continuous-filter convolutional neural network for modeling quantum interactions

5 code implementations NeurIPS 2017 Kristof T. Schütt, Pieter-Jan Kindermans, Huziel E. Sauceda, Stefan Chmiela, Alexandre Tkatchenko, Klaus-Robert Müller

Deep learning has the potential to revolutionize quantum chemistry as it is ideally suited to learn representations for structured data and speed up the exploration of chemical space.

 Ranked #1 on Time Series on QM9

Formation Energy Time Series +1

Exploring text datasets by visualizing relevant words

2 code implementations17 Jul 2017 Franziska Horn, Leila Arras, Grégoire Montavon, Klaus-Robert Müller, Wojciech Samek

When working with a new dataset, it is important to first explore and familiarize oneself with it, before applying any advanced machine learning algorithms.

Discovering topics in text datasets by visualizing relevant words

1 code implementation18 Jul 2017 Franziska Horn, Leila Arras, Grégoire Montavon, Klaus-Robert Müller, Wojciech Samek

When dealing with large collections of documents, it is imperative to quickly get an overview of the texts' contents.

Clustering

Minimizing Trust Leaks for Robust Sybil Detection

no code implementations ICML 2017 János Höner, Shinichi Nakajima, Alexander Bauer, Klaus-Robert Müller, Nico Görnitz

Sybil detection is a crucial task to protect online social networks (OSNs) against intruders who try to manipulate automatic services provided by OSNs to their customers.

Explainable Artificial Intelligence: Understanding, Visualizing and Interpreting Deep Learning Models

no code implementations28 Aug 2017 Wojciech Samek, Thomas Wiegand, Klaus-Robert Müller

With the availability of large databases and recent improvements in deep learning methodology, the performance of AI systems is reaching or even exceeding the human level on an increasing number of complex tasks.

Explainable artificial intelligence General Classification +2

Optimizing for Measure of Performance in Max-Margin Parsing

no code implementations5 Sep 2017 Alexander Bauer, Shinichi Nakajima, Nico Görnitz, Klaus-Robert Müller

Many statistical learning problems in the area of natural language processing, including sequence tagging, sequence segmentation, and syntactic parsing, have been successfully approached by means of structured prediction methods.

Constituency Parsing Structured Prediction

SchNet - a deep learning architecture for molecules and materials

5 code implementations J. Chem. Phys. 2017 Kristof T. Schütt, Huziel E. Sauceda, Pieter-Jan Kindermans, Alexandre Tkatchenko, Klaus-Robert Müller

Deep learning has led to a paradigm shift in artificial intelligence, including web, text and image search, speech recognition, as well as bioinformatics, with growing impact in chemical physics.

Formation Energy Chemical Physics Materials Science

Towards Exact Molecular Dynamics Simulations with Machine-Learned Force Fields

1 code implementation26 Feb 2018 Stefan Chmiela, Huziel E. Sauceda, Klaus-Robert Müller, Alexandre Tkatchenko

Molecular dynamics (MD) simulations employing classical force fields constitute the cornerstone of contemporary atomistic modeling in chemistry, biology, and materials science.

Chemical Physics

Towards Explaining Anomalies: A Deep Taylor Decomposition of One-Class Models

no code implementations16 May 2018 Jacob Kauffmann, Klaus-Robert Müller, Grégoire Montavon

The proposed One-Class DTD is applicable to a number of common distance-based SVM kernels and is able to reliably explain a wide set of data anomalies.

Edge Detection

Sparse Binary Compression: Towards Distributed Deep Learning with minimal Communication

no code implementations22 May 2018 Felix Sattler, Simon Wiedemann, Klaus-Robert Müller, Wojciech Samek

A major issue in distributed training is the limited communication bandwidth between contributing nodes or prohibitive communication cost in general.

Binarization

Compact and Computationally Efficient Representation of Deep Neural Networks

no code implementations27 May 2018 Simon Wiedemann, Klaus-Robert Müller, Wojciech Samek

These new matrix formats have the novel property that their memory and algorithmic complexity are implicitly bounded by the entropy of the matrix, implying that they are guaranteed to become more efficient as the entropy of the matrix is reduced.

Understanding Patch-Based Learning by Explaining Predictions

no code implementations11 Jun 2018 Christopher Anders, Grégoire Montavon, Wojciech Samek, Klaus-Robert Müller

We apply the deep Taylor / LRP technique to understand the deep network's classification decisions, and identify a "border effect": a tendency of the classifier to look mainly at the bordering frames of the input.

General Classification

Quantum-chemical insights from interpretable atomistic neural networks

no code implementations27 Jun 2018 Kristof T. Schütt, Michael Gastegger, Alexandre Tkatchenko, Klaus-Robert Müller

With the rise of deep neural networks for quantum chemistry applications, there is a pressing need for architectures that, beyond delivering accurate predictions of chemical properties, are readily interpretable by researchers.

iNNvestigate neural networks!

1 code implementation13 Aug 2018 Maximilian Alber, Sebastian Lapuschkin, Philipp Seegerer, Miriam Hägele, Kristof T. Schütt, Grégoire Montavon, Wojciech Samek, Klaus-Robert Müller, Sven Dähne, Pieter-Jan Kindermans

The presented library iNNvestigate addresses this by providing a common interface and out-of-the-box implementation for many analysis methods, including the reference implementation for PatternNet and PatternAttribution as well as for LRP methods.

Interpretable Machine Learning

Explaining the Unique Nature of Individual Gait Patterns with Deep Learning

1 code implementation13 Aug 2018 Fabian Horst, Sebastian Lapuschkin, Wojciech Samek, Klaus-Robert Müller, Wolfgang I. Schöllhorn

Machine learning (ML) techniques such as (deep) artificial neural networks (DNNs) are very successfully solving a plethora of tasks and provide new predictive models for complex physical, chemical, biological, and social systems.

Analyzing Neuroimaging Data Through Recurrent Deep Learning Models

1 code implementation23 Oct 2018 Armin W. Thomas, Hauke R. Heekeren, Klaus-Robert Müller, Wojciech Samek

We further demonstrate DeepLight's ability to study the fine-grained temporo-spatial variability of brain activity over sequences of single fMRI samples.

Learning representations of molecules and materials with atomistic neural networks

no code implementations11 Dec 2018 Kristof T. Schütt, Alexandre Tkatchenko, Klaus-Robert Müller

Deep Learning has been shown to learn efficient representations for structured data such as images, text, or audio.

sGDML: Constructing Accurate and Data Efficient Molecular Force Fields Using Machine Learning

1 code implementation12 Dec 2018 Stefan Chmiela, Huziel E. Sauceda, Igor Poltavsky, Klaus-Robert Müller, Alexandre Tkatchenko

We present an optimized implementation of the recently proposed symmetric gradient domain machine learning (sGDML) model.

Computational Physics

Entropy-Constrained Training of Deep Neural Networks

no code implementations18 Dec 2018 Simon Wiedemann, Arturo Marban, Klaus-Robert Müller, Wojciech Samek

We propose a general framework for neural network compression that is motivated by the Minimum Description Length (MDL) principle.

Neural Network Compression

Automating the search for a patent's prior art with a full text similarity search

1 code implementation10 Jan 2019 Lea Helmers, Franziska Horn, Franziska Biegler, Tim Oppermann, Klaus-Robert Müller

The evaluation results show that our automated approach, besides accelerating the search process, also improves the search results for prior art with respect to their quality.

text similarity

Molecular Force Fields with Gradient-Domain Machine Learning: Construction and Application to Dynamics of Small Molecules with Coupled Cluster Forces

1 code implementation19 Jan 2019 Huziel E. Sauceda, Stefan Chmiela, Igor Poltavsky, Klaus-Robert Müller, Alexandre Tkatchenko

The analysis of sGDML molecular dynamics trajectories yields new qualitative insights into dynamics and spectroscopy of small molecules close to spectroscopic accuracy.

Chemical Physics Computational Physics Data Analysis, Statistics and Probability

Unmasking Clever Hans Predictors and Assessing What Machines Really Learn

1 code implementation26 Feb 2019 Sebastian Lapuschkin, Stephan Wäldchen, Alexander Binder, Grégoire Montavon, Wojciech Samek, Klaus-Robert Müller

Current learning machines have successfully solved hard application problems, reaching high accuracy and displaying seemingly "intelligent" behavior.

Local Function Complexity for Active Learning via Mixture of Gaussian Processes

no code implementations27 Feb 2019 Danny Panknin, Stefan Chmiela, Klaus-Robert Müller, Shinichi Nakajima

Inhomogeneities in real-world data, e.g., due to changes in the observation noise level or variations in the structural complexity of the source function, pose a unique set of challenges for statistical inference.

Active Learning GPR +1

Robust and Communication-Efficient Federated Learning from Non-IID Data

1 code implementation7 Mar 2019 Felix Sattler, Simon Wiedemann, Klaus-Robert Müller, Wojciech Samek

Federated Learning allows multiple parties to jointly train a deep learning model on their combined data, without any of the participants having to reveal their local data to a centralized server.

Federated Learning Privacy Preserving
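The aggregation step at the heart of federated learning can be sketched as a dataset-size-weighted average of client parameters (FedAvg-style); the helper `federated_average` and the toy parameters are invented here and do not reflect the paper's compression scheme:

```python
import numpy as np

def federated_average(client_params, client_sizes):
    """Aggregate client model parameters weighted by local dataset size."""
    sizes = np.asarray(client_sizes, dtype=float)
    weights = sizes / sizes.sum()              # each client counts in proportion to its data
    return sum(w * p for w, p in zip(weights, client_params))

# Three clients holding different parameter vectors and dataset sizes.
params = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]
sizes = [10, 10, 20]
global_params = federated_average(params, sizes)   # -> [0.75, 0.75]
```

No client reveals its raw data to the server, only parameter updates; the communication cost of repeatedly exchanging these updates is exactly what the paper's method targets.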

Comment on "Solving Statistical Mechanics Using VANs": Introducing saVANt - VANs Enhanced by Importance and MCMC Sampling

no code implementations26 Mar 2019 Kim Nicoli, Pan Kessel, Nils Strodthoff, Wojciech Samek, Klaus-Robert Müller, Shinichi Nakajima

In this comment on "Solving Statistical Mechanics Using Variational Autoregressive Networks" by Wu et al., we propose a subtle yet powerful modification of their approach.

Black-Box Decision based Adversarial Attack with Symmetric $α$-stable Distribution

no code implementations11 Apr 2019 Vignesh Srinivasan, Ercan E. Kuruoglu, Klaus-Robert Müller, Wojciech Samek, Shinichi Nakajima

Many existing methods employ Gaussian random variables for exploring the data space to find the most adversarial (for attacking) or least adversarial (for defense) point.

Adversarial Attack

Evaluating Recurrent Neural Network Explanations

1 code implementation WS 2019 Leila Arras, Ahmed Osman, Klaus-Robert Müller, Wojciech Samek

Recently, several methods have been proposed to explain the predictions of recurrent neural networks (RNNs), in particular of LSTMs.

Negation Sentence +1

Deep Transfer Learning For Whole-Brain fMRI Analyses

no code implementations2 Jul 2019 Armin W. Thomas, Klaus-Robert Müller, Wojciech Samek

Even further, the pre-trained DL model variant is already able to correctly decode 67.51% of the cognitive states from a test dataset with 100 individuals, when fine-tuned on a dataset the size of only three subjects.

Transfer Learning

Explaining and Interpreting LSTMs

no code implementations25 Sep 2019 Leila Arras, Jose A. Arjona-Medina, Michael Widrich, Grégoire Montavon, Michael Gillhofer, Klaus-Robert Müller, Sepp Hochreiter, Wojciech Samek

While neural networks have acted as a strong unifying force in the design of modern AI systems, the neural network architectures themselves remain highly heterogeneous due to the variety of tasks to be solved.

Clustered Federated Learning: Model-Agnostic Distributed Multi-Task Optimization under Privacy Constraints

2 code implementations4 Oct 2019 Felix Sattler, Klaus-Robert Müller, Wojciech Samek

Federated Learning (FL) is currently the most widely adopted framework for collaborative training of (deep) machine learning models under privacy constraints.

Clustering Federated Learning +2

Pruning by Explaining: A Novel Criterion for Deep Neural Network Pruning

1 code implementation18 Dec 2019 Seul-Ki Yeom, Philipp Seegerer, Sebastian Lapuschkin, Alexander Binder, Simon Wiedemann, Klaus-Robert Müller, Wojciech Samek

The success of convolutional neural networks (CNNs) in various applications is accompanied by a significant increase in computation and parameter storage costs.

Explainable Artificial Intelligence (XAI) Model Compression +2

Finding and Removing Clever Hans: Using Explanation Methods to Debug and Improve Deep Models

2 code implementations22 Dec 2019 Christopher J. Anders, Leander Weber, David Neumann, Wojciech Samek, Klaus-Robert Müller, Sebastian Lapuschkin

Based on a recent technique, Spectral Relevance Analysis, we propose the following technical contributions and resulting findings: (a) a scalable quantification of artifactual and poisoned classes where the machine learning models under study exhibit CH behavior, and (b) several approaches, denoted as Class Artifact Compensation (ClArC), which effectively and significantly reduce a model's CH behavior.

Forecasting Industrial Aging Processes with Machine Learning Methods

no code implementations5 Feb 2020 Mihail Bogojeski, Simeon Sauer, Franziska Horn, Klaus-Robert Müller

Accurately predicting industrial aging processes makes it possible to schedule maintenance events further in advance, ensuring a cost-efficient and reliable operation of the plant.

BIG-bench Machine Learning

Autonomous robotic nanofabrication with reinforcement learning

1 code implementation27 Feb 2020 Philipp Leinen, Malte Esders, Kristof T. Schütt, Christian Wagner, Klaus-Robert Müller, F. Stefan Tautz

Here, we present a strategy to work around both obstacles, and demonstrate autonomous robotic nanofabrication by manipulating single molecules.

Reinforcement Learning (RL)

Building and Interpreting Deep Similarity Models

1 code implementation11 Mar 2020 Oliver Eberle, Jochen Büttner, Florian Kräutli, Klaus-Robert Müller, Matteo Valleriani, Grégoire Montavon

Many learning algorithms such as kernel machines, nearest neighbors, clustering, or anomaly detection, are based on the concept of 'distance' or 'similarity'.

Anomaly Detection Clustering

Automatic Identification of Types of Alterations in Historical Manuscripts

no code implementations20 Mar 2020 David Lassner, Anne Baillot, Sergej Dogadov, Klaus-Robert Müller, Shinichi Nakajima

In addition to the findings based on the digital scholarly edition Berlin Intellectuals, we present a general framework for the analysis of text genesis that can be used in the context of other digital resources representing document variants.

BIG-bench Machine Learning

Risk Estimation of SARS-CoV-2 Transmission from Bluetooth Low Energy Measurements

no code implementations22 Apr 2020 Felix Sattler, Jackie Ma, Patrick Wagner, David Neumann, Markus Wenzel, Ralf Schäfer, Wojciech Samek, Klaus-Robert Müller, Thomas Wiegand

Digital contact tracing approaches based on Bluetooth low energy (BLE) have the potential to efficiently contain and delay outbreaks of infectious diseases such as the ongoing SARS-CoV-2 pandemic.

BIG-bench Machine Learning

Ensemble Learning of Coarse-Grained Molecular Dynamics Force Fields with a Kernel Approach

no code implementations4 May 2020 Jiang Wang, Stefan Chmiela, Klaus-Robert Müller, Frank Noé, Cecilia Clementi

Using ensemble learning and stratified sampling, we propose a 2-layer training scheme that enables GDML to learn an effective coarse-grained model.

Ensemble Learning

Rethinking Assumptions in Deep Anomaly Detection

1 code implementation30 May 2020 Lukas Ruff, Robert A. Vandermeulen, Billy Joe Franks, Klaus-Robert Müller, Marius Kloft

Though anomaly detection (AD) can be viewed as a classification problem (nominal vs. anomalous) it is usually treated in an unsupervised manner since one typically does not have access to, or it is infeasible to utilize, a dataset that sufficiently characterizes what it means to be "anomalous."

Anomaly Detection

Higher-Order Explanations of Graph Neural Networks via Relevant Walks

no code implementations5 Jun 2020 Thomas Schnake, Oliver Eberle, Jonas Lederer, Shinichi Nakajima, Kristof T. Schütt, Klaus-Robert Müller, Grégoire Montavon

In this paper, we show that GNNs can in fact be naturally explained using higher-order expansions, i.e. by identifying groups of edges that jointly contribute to the prediction.

Image Classification Sentiment Analysis

How Much Can I Trust You? -- Quantifying Uncertainties in Explaining Neural Networks

1 code implementation16 Jun 2020 Kirill Bykov, Marina M. -C. Höhne, Klaus-Robert Müller, Shinichi Nakajima, Marius Kloft

Explainable AI (XAI) aims to provide interpretations for predictions made by learning machines, such as deep neural networks, in order to make the machines more transparent for the user and, furthermore, trustworthy for applications in, e.g., safety-critical areas.

Explainable Artificial Intelligence (XAI)

The Clever Hans Effect in Anomaly Detection

no code implementations18 Jun 2020 Jacob Kauffmann, Lukas Ruff, Grégoire Montavon, Klaus-Robert Müller

The 'Clever Hans' effect occurs when the learned model produces correct predictions based on the 'wrong' features.

Anomaly Detection Explainable Artificial Intelligence (XAI) +1

Explainable Deep One-Class Classification

2 code implementations ICLR 2021 Philipp Liznerski, Lukas Ruff, Robert A. Vandermeulen, Billy Joe Franks, Marius Kloft, Klaus-Robert Müller

Deep one-class classification variants for anomaly detection learn a mapping that concentrates nominal samples in feature space, causing anomalies to be mapped away.

Ranked #5 on Anomaly Detection on One-class ImageNet-30 (using extra training data)

Classification General Classification +2
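The concentration idea described above can be sketched with a Deep SVDD-style anomaly score: the squared distance of an embedded sample from a fixed center. The embeddings, center, and helper `one_class_scores` below are invented stand-ins (the paper's FCDD variant produces spatial explanation maps on top of such scores):

```python
import numpy as np

def one_class_scores(features, center):
    """Anomaly score = squared distance of each embedded sample from the center."""
    return np.sum((features - center) ** 2, axis=1)

# Stand-in embeddings: nominal points cluster near the center, an anomaly does not.
center = np.zeros(2)
feats = np.array([[0.1, -0.1], [0.0, 0.2], [3.0, 3.0]])
scores = one_class_scores(feats, center)   # the last sample scores highest
```

Thresholding such scores separates nominal samples (small distances) from anomalies (large distances), which is the decision rule the explanation methods then have to account for.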

Langevin Cooling for Domain Translation

1 code implementation31 Aug 2020 Vignesh Srinivasan, Klaus-Robert Müller, Wojciech Samek, Shinichi Nakajima

Domain translation is the task of finding correspondence between two domains.

Translation

A Unifying Review of Deep and Shallow Anomaly Detection

no code implementations24 Sep 2020 Lukas Ruff, Jacob R. Kauffmann, Robert A. Vandermeulen, Grégoire Montavon, Wojciech Samek, Marius Kloft, Thomas G. Dietterich, Klaus-Robert Müller

Deep learning approaches to anomaly detection have recently improved the state of the art in detection performance on complex datasets such as large collections of images or text.

One-Class Classification

Machine Learning Force Fields

no code implementations14 Oct 2020 Oliver T. Unke, Stefan Chmiela, Huziel E. Sauceda, Michael Gastegger, Igor Poltavsky, Kristof T. Schütt, Alexandre Tkatchenko, Klaus-Robert Müller

In recent years, the use of Machine Learning (ML) in computational chemistry has enabled numerous advances previously out of reach due to the computational complexity of traditional electronic-structure methods.

BIG-bench Machine Learning

Machine learning of solvent effects on molecular spectra and reactions

1 code implementation28 Oct 2020 Michael Gastegger, Kristof T. Schütt, Klaus-Robert Müller

We employ FieldSchNet to study the influence of solvent effects on molecular spectra and a Claisen rearrangement reaction.

BIG-bench Machine Learning

Towards Robust Explanations for Deep Neural Networks

no code implementations18 Dec 2020 Ann-Kathrin Dombrowski, Christopher J. Anders, Klaus-Robert Müller, Pan Kessel

Explanation methods shed light on the decision process of black-box classifiers such as deep neural networks.

SpookyNet: Learning Force Fields with Electronic Degrees of Freedom and Nonlocal Effects

no code implementations1 May 2021 Oliver T. Unke, Stefan Chmiela, Michael Gastegger, Kristof T. Schütt, Huziel E. Sauceda, Klaus-Robert Müller

Machine-learned force fields (ML-FFs) combine the accuracy of ab initio methods with the efficiency of conventional force fields.

SE(3)-equivariant prediction of molecular wavefunctions and electronic densities

no code implementations NeurIPS 2021 Oliver T. Unke, Mihail Bogojeski, Michael Gastegger, Mario Geiger, Tess Smidt, Klaus-Robert Müller

Machine learning has enabled the prediction of quantum chemical properties with high accuracy and efficiency, making it possible to bypass computationally costly ab initio calculations.

Transfer Learning

BIGDML: Towards Exact Machine Learning Force Fields for Materials

no code implementations8 Jun 2021 Huziel E. Sauceda, Luis E. Gálvez-González, Stefan Chmiela, Lauro Oliver Paz-Borbón, Klaus-Robert Müller, Alexandre Tkatchenko

Machine-learning force fields (MLFF) should be accurate, computationally and data efficient, and applicable to molecules, materials, and interfaces thereof.

BIG-bench Machine Learning

Learning Domain Invariant Representations by Joint Wasserstein Distance Minimization

1 code implementation 9 Jun 2021 Léo Andeol, Yusei Kawakami, Yuichiro Wada, Takafumi Kanamori, Klaus-Robert Müller, Grégoire Montavon

However, common ML losses do not give strong guarantees on how consistently the ML model performs across different domains, in particular, whether the model performs well on one domain at the expense of another.

On the Robustness of Pretraining and Self-Supervision for a Deep Learning-based Analysis of Diabetic Retinopathy

no code implementations 25 Jun 2021 Vignesh Srinivasan, Nils Strodthoff, Jackie Ma, Alexander Binder, Klaus-Robert Müller, Wojciech Samek

Our results indicate that models initialized from ImageNet pretraining show a significant increase in performance, generalization and robustness to image distortions.

Contrastive Learning Diabetic Retinopathy Grading

Explaining Bayesian Neural Networks

no code implementations 23 Aug 2021 Kirill Bykov, Marina M. -C. Höhne, Adelaida Creosteanu, Klaus-Robert Müller, Frederick Klauschen, Shinichi Nakajima, Marius Kloft

Bayesian approaches such as Bayesian Neural Networks (BNNs) so far have a limited form of transparency (model transparency) already built-in through their prior weight distribution, but notably, they lack explanations of their predictions for given instances.

Decision Making Explainable Artificial Intelligence (XAI)

Evaluating deep transfer learning for whole-brain cognitive decoding

1 code implementation 1 Nov 2021 Armin W. Thomas, Ulman Lindenberger, Wojciech Samek, Klaus-Robert Müller

Here, we systematically evaluate TL for the application of DL models to the decoding of cognitive states (e.g., viewing images of faces or houses) from whole-brain functional Magnetic Resonance Imaging (fMRI) data.

Transfer Learning

Scrutinizing XAI using linear ground-truth data with suppressor variables

1 code implementation 14 Nov 2021 Rick Wilming, Céline Budding, Klaus-Robert Müller, Stefan Haufe

It has been demonstrated that some saliency methods can highlight features that have no statistical association with the prediction target (suppressor variables).

Explainable Artificial Intelligence (XAI) Feature Importance
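The suppressor-variable effect described in the snippet above can be illustrated with a toy example (the setup below is illustrative, not the paper's benchmark): a feature that carries only noise has no statistical association with the target, yet the optimal linear model must assign it a nonzero weight to cancel that noise, so naive saliency methods would highlight it.

```python
# Toy illustration (not from the paper) of a suppressor variable:
# a feature with zero correlation with the target that the optimal
# linear model nevertheless relies on.
import random

random.seed(0)
n = 10_000
signal = [random.gauss(0, 1) for _ in range(n)]
noise = [random.gauss(0, 1) for _ in range(n)]

x1 = [s + z for s, z in zip(signal, noise)]  # measurement: signal + noise
x2 = noise                                   # suppressor: records only the noise
y = signal                                   # target depends only on the signal

def corr(a, b):
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    va = sum((ai - ma) ** 2 for ai in a)
    vb = sum((bi - mb) ** 2 for bi in b)
    return cov / (va * vb) ** 0.5

# x2 is uncorrelated with y, yet the exact model y = x1 - x2 needs it
# to suppress the noise in x1 -- so saliency on x2 is nonzero even
# though x2 alone tells us nothing about y.
print(round(corr(x2, y), 2))  # near zero
print(max(abs((x1[i] - x2[i]) - y[i]) for i in range(n)))  # model is exact
```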

Toward Explainable AI for Regression Models

1 code implementation 21 Dec 2021 Simon Letzgus, Patrick Wagner, Jonas Lederer, Wojciech Samek, Klaus-Robert Müller, Gregoire Montavon

In addition to the impressive predictive power of machine learning (ML) models, more recently, explanation methods have emerged that enable an interpretation of complex non-linear learning models such as deep neural networks.

Explainable Artificial Intelligence (XAI) regression

Super-resolution in Molecular Dynamics Trajectory Reconstruction with Bi-Directional Neural Networks

no code implementations 2 Jan 2022 Ludwig Winkler, Klaus-Robert Müller, Huziel E. Sauceda

Molecular dynamics simulations are a cornerstone in science, allowing the investigation of everything from a system's thermodynamics to intricate molecular interactions.

Super-Resolution

Automated Dissipation Control for Turbulence Simulation with Shell Models

no code implementations 7 Jan 2022 Ann-Kathrin Dombrowski, Klaus-Robert Müller, Wolf Christian Müller

The application of machine learning (ML) techniques, especially neural networks, has seen tremendous success at processing images and language.

BIG-bench Machine Learning

Automatic Identification of Chemical Moieties

no code implementations 30 Mar 2022 Jonas Lederer, Michael Gastegger, Kristof T. Schütt, Michael Kampffmeyer, Klaus-Robert Müller, Oliver T. Unke

In recent years, the prediction of quantum mechanical observables with machine learning methods has become increasingly popular.

Property Prediction

Exposing Outlier Exposure: What Can Be Learned From Few, One, and Zero Outlier Images

1 code implementation 23 May 2022 Philipp Liznerski, Lukas Ruff, Robert A. Vandermeulen, Billy Joe Franks, Klaus-Robert Müller, Marius Kloft

We find that standard classifiers and semi-supervised one-class methods trained to discern between normal samples and relatively few random natural images are able to outperform the current state of the art on an established AD benchmark with ImageNet.

Ranked #1 on Anomaly Detection on One-class CIFAR-10 (using extra training data)

Anomaly Detection
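The outlier-exposure idea summarized above can be sketched with a toy example (the data, model, and numbers here are illustrative, not the paper's setup): fit a classifier to separate normal samples from a handful of random outliers, then read its output as an anomaly score.

```python
# Illustrative sketch (not the paper's method): a tiny logistic
# classifier trained to discern normal 1-D samples from a few random
# outliers; its predicted probability serves as an anomaly score.
import math
import random

random.seed(0)
normal = [random.gauss(0.0, 1.0) for _ in range(200)]     # normal data (label 0)
outliers = [random.uniform(5.0, 10.0) for _ in range(5)]  # few exposed outliers (label 1)
data = [(x, 0.0) for x in normal] + [(x, 1.0) for x in outliers]

w, b, lr = 0.0, 0.0, 0.1
for _ in range(300):  # plain SGD on the logistic loss
    for x, y in data:
        p = 1.0 / (1.0 + math.exp(-(w * x + b)))
        w -= lr * (p - y) * x
        b -= lr * (p - y)

def anomaly_score(x):
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

print(anomaly_score(0.0))  # low for in-distribution points
print(anomaly_score(8.0))  # high for points far from the normal data
```

Even with only five exposed outliers, the decision boundary lands between the normal cluster and the outlier region, which is the effect the paper studies at scale.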

So3krates: Equivariant attention for interactions on arbitrary length-scales in molecular systems

1 code implementation 28 May 2022 J. Thorben Frank, Oliver T. Unke, Klaus-Robert Müller

The application of machine learning methods in quantum chemistry has enabled the study of numerous chemical phenomena, which are computationally intractable with traditional ab-initio methods.

DORA: Exploring Outlier Representations in Deep Neural Networks

1 code implementation 9 Jun 2022 Kirill Bykov, Mayukh Deb, Dennis Grinwald, Klaus-Robert Müller, Marina M. -C. Höhne

Deep Neural Networks (DNNs) excel at learning complex abstractions within their internal representations.

Decision Making

Diffeomorphic Counterfactuals with Generative Models

1 code implementation 10 Jun 2022 Ann-Kathrin Dombrowski, Jan E. Gerken, Klaus-Robert Müller, Pan Kessel

Counterfactuals can explain classification decisions of neural networks in a human interpretable way.

Self-Supervised Training with Autoencoders for Visual Anomaly Detection

no code implementations 23 Jun 2022 Alexander Bauer, Shinichi Nakajima, Klaus-Robert Müller

This insight makes the reconstruction error a natural choice for defining the anomaly score of a sample according to its distance from a corresponding projection on the data manifold.

Anomaly Detection Dimensionality Reduction +2
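The reconstruction-error score mentioned in the snippet above can be made concrete with a minimal sketch (assumed for illustration, not the paper's model): a trained autoencoder would play the role of encode/decode; here the "data manifold" is simply the line y = x in the plane.

```python
# Minimal sketch: anomaly score as reconstruction error, i.e. the
# squared distance between a point and its projection onto the data
# manifold (here, the line y = x).

def encode(x):
    return (x[0] + x[1]) / 2.0  # project a 2-D point onto the line y = x

def decode(t):
    return (t, t)

def reconstruction_error(x):
    rec = decode(encode(x))
    return sum((a - b) ** 2 for a, b in zip(x, rec))

print(reconstruction_error((1.0, 1.0)))   # on the manifold: 0.0
print(reconstruction_error((1.0, -1.0)))  # off the manifold: 2.0
```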

Algorithmic Differentiation for Automated Modeling of Machine Learned Force Fields

1 code implementation 25 Aug 2022 Niklas Frederik Schmitz, Klaus-Robert Müller, Stefan Chmiela

Reconstructing force fields (FFs) from atomistic simulation data is a challenge since accurate data can be highly expensive.

Computational Efficiency

Shortcomings of Top-Down Randomization-Based Sanity Checks for Evaluations of Deep Neural Network Explanations

no code implementations CVPR 2023 Alexander Binder, Leander Weber, Sebastian Lapuschkin, Grégoire Montavon, Klaus-Robert Müller, Wojciech Samek

To address shortcomings of this test, we start by observing an experimental gap in the ranking of explanation methods between randomization-based sanity checks [1] and model output faithfulness measures (e.g., [25]).

Disentangled Explanations of Neural Network Predictions by Finding Relevant Subspaces

no code implementations 30 Dec 2022 Pattarawat Chormai, Jan Herrmann, Klaus-Robert Müller, Grégoire Montavon

Explanations often take the form of a heatmap identifying input features (e.g., pixels) that are relevant to the model's decision.

Mark My Words: Dangers of Watermarked Images in ImageNet

no code implementations 9 Mar 2023 Kirill Bykov, Klaus-Robert Müller, Marina M. -C. Höhne

The utilization of pre-trained networks, especially those trained on ImageNet, has become a common practice in Computer Vision.

Preemptively Pruning Clever-Hans Strategies in Deep Neural Networks

no code implementations 12 Apr 2023 Lorenz Linhardt, Klaus-Robert Müller, Grégoire Montavon

In this paper, we demonstrate that acceptance of explanations by the user is not a guarantee for a machine learning model to be robust against Clever Hans effects, which may remain undetected.

An XAI framework for robust and transparent data-driven wind turbine power curve models

1 code implementation 19 Apr 2023 Simon Letzgus, Klaus-Robert Müller

Alongside this paper, we publish a Python implementation of the presented framework and hope this can guide researchers and practitioners alike toward training, selecting and utilizing more transparent and robust data-driven wind turbine power curve models.

Explainable artificial intelligence Explainable Artificial Intelligence (XAI) +1

Set Learning for Accurate and Calibrated Models

1 code implementation 5 Jul 2023 Lukas Muttenthaler, Robert A. Vandermeulen, Qiuyi Zhang, Thomas Unterthiner, Klaus-Robert Müller

Model overconfidence and poor calibration are common in machine learning and difficult to account for when applying standard empirical risk minimization.

From Peptides to Nanostructures: A Euclidean Transformer for Fast and Stable Machine Learned Force Fields

1 code implementation 21 Sep 2023 J. Thorben Frank, Oliver T. Unke, Klaus-Robert Müller, Stefan Chmiela

Recent years have seen vast progress in the development of machine learned force fields (MLFFs) based on ab-initio reference calculations.

Insightful analysis of historical sources at scales beyond human capabilities using unsupervised Machine Learning and XAI

no code implementations 13 Oct 2023 Oliver Eberle, Jochen Büttner, Hassan El-Hajj, Grégoire Montavon, Klaus-Robert Müller, Matteo Valleriani

An ML based analysis of these tables helps to unveil important facets of the spatio-temporal evolution of knowledge and innovation in the field of mathematical astronomy in the period, as taught at European universities.

Astronomy

Manipulating Feature Visualizations with Gradient Slingshots

1 code implementation 11 Jan 2024 Dilyara Bareeva, Marina M. -C. Höhne, Alexander Warnecke, Lukas Pirch, Klaus-Robert Müller, Konrad Rieck, Kirill Bykov

Deep Neural Networks (DNNs) are capable of learning complex and versatile representations; however, the semantic nature of the learned concepts remains unknown.

Decision Making

XpertAI: uncovering model strategies for sub-manifolds

no code implementations 12 Mar 2024 Simon Letzgus, Klaus-Robert Müller, Grégoire Montavon

In regression, explanations need to be precisely formulated to address specific user queries (e.g., distinguishing between 'Why is the output above 0?'

regression

Molecular relaxation by reverse diffusion with time step prediction

1 code implementation 16 Apr 2024 Khaled Kahouli, Stefaan Simon Pierre Hessmann, Klaus-Robert Müller, Shinichi Nakajima, Stefan Gugler, Niklas Wolf Andreas Gebauer

As a remedy, we propose MoreRed, molecular relaxation by reverse diffusion, a conceptually novel and purely statistical approach where non-equilibrium structures are treated as noisy instances of their corresponding equilibrium states.

Denoising
