Search Results for author: Sebastian Bach

Found 6 papers, 2 papers with code

Layer-wise Relevance Propagation for Neural Networks with Local Renormalization Layers

no code implementations · 4 Apr 2016 · Alexander Binder, Grégoire Montavon, Sebastian Bach, Klaus-Robert Müller, Wojciech Samek

Layer-wise relevance propagation is a framework that decomposes the prediction a deep neural network computes for a sample, e.g. an image, into relevance scores for the sample's individual input dimensions, such as the subpixels of an image.
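The redistribution described above can be sketched for a toy network. This is a minimal illustration of an LRP-style rule (the epsilon-stabilized linear rule) on a small fully-connected ReLU net; the weights, layer sizes, and the `eps` stabilizer are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-layer network: 4 inputs -> 3 hidden (ReLU) -> 2 outputs.
# All weights are random; this only demonstrates the propagation rule.
W1 = rng.standard_normal((4, 3))
W2 = rng.standard_normal((3, 2))

x = rng.standard_normal(4)        # sample to explain
a1 = np.maximum(0.0, x @ W1)      # hidden activations (ReLU)
out = a1 @ W2                     # network output scores

def lrp_linear(a, W, R_out, eps=1e-6):
    """Redistribute relevance R_out of a linear layer's outputs
    onto its inputs a, stabilized by a small eps term."""
    z = a @ W                               # pre-activations
    s = R_out / (z + eps * np.sign(z))      # stabilized ratios
    return a * (W @ s)                      # relevance of each input

# Start from the predicted class's score and propagate backwards.
R_out = np.zeros(2)
k = int(np.argmax(out))
R_out[k] = out[k]

R_hidden = lrp_linear(a1, W2, R_out)   # relevance of hidden units
R_input = lrp_linear(x, W1, R_hidden)  # relevance of the 4 inputs

# Relevance is (approximately) conserved from layer to layer:
# the input relevances sum to roughly the explained output score.
print(R_input.sum(), out[k])
```

The key property the sketch exhibits is conservation: the per-dimension relevances approximately sum to the output score being explained, so the heatmap is a decomposition of the prediction rather than a generic saliency signal.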

Controlling Explanatory Heatmap Resolution and Semantics via Decomposition Depth

no code implementations · 21 Mar 2016 · Sebastian Bach, Alexander Binder, Klaus-Robert Müller, Wojciech Samek

We present an application of the Layer-wise Relevance Propagation (LRP) algorithm to state-of-the-art deep convolutional neural networks and to Fisher Vector classifiers, and use the resulting visualized heatmaps to compare the image perception and prediction strategies of the two classifiers.

Explaining Nonlinear Classification Decisions with Deep Taylor Decomposition

4 code implementations · 8 Dec 2015 · Grégoire Montavon, Sebastian Bach, Alexander Binder, Wojciech Samek, Klaus-Robert Müller

Although our focus is on image classification, the method is applicable to a broad set of input data, learning tasks and network architectures.


Evaluating the visualization of what a Deep Neural Network has learned

1 code implementation · 21 Sep 2015 · Wojciech Samek, Alexander Binder, Grégoire Montavon, Sebastian Bach, Klaus-Robert Müller

Our main result is that the recently proposed Layer-wise Relevance Propagation (LRP) algorithm qualitatively and quantitatively provides a better explanation of what made a DNN arrive at a particular classification decision than the sensitivity-based approach or the deconvolution method.
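The comparison above rests on a perturbation-style evaluation: remove input features in decreasing order of assigned relevance and track how quickly the classifier's score collapses. A minimal sketch of that general protocol follows; the toy linear scorer, feature count, and zero baseline are illustrative assumptions, not the paper's actual models or data:

```python
import numpy as np

rng = np.random.default_rng(1)

w = rng.standard_normal(16)   # stand-in classifier: score(x) = w . x
x = rng.standard_normal(16)   # a "sample" with 16 input features
relevance = w * x             # a simple per-feature heatmap

def score(v):
    return float(w @ v)

# Perturb features from most to least relevant, recording the score
# after each removal ("flipping" a feature to a zero baseline here).
order = np.argsort(relevance)[::-1]
scores = [score(x)]
xp = x.copy()
for i in order:
    xp[i] = 0.0
    scores.append(score(xp))

# A good heatmap makes the score fall fast in the early steps; the
# average drop over the curve summarizes heatmap quality.
drops = scores[0] - np.array(scores)
avg_drop = drops.mean()
```

Under this protocol, a heatmap that correctly ranks the decision-relevant features produces a steeper early decline of the score curve than one that ranks them poorly, which is what makes the qualitative comparison quantitative.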

