no code implementations • 5 Mar 2024 • Intekhab Hossain, Jonas Fischer, Rebekka Burkholz, John Quackenbush
Neural structure learning is of paramount importance for scientific discovery and interpretability.
no code implementations • 7 Dec 2023 • Nils Philipp Walter, Jonas Fischer, Jilles Vreeken
Discovering patterns in data that best describe the differences between classes makes it possible to hypothesize and reason about class-specific mechanisms.
no code implementations • 18 Nov 2023 • Michael A. Hedderich, Jonas Fischer, Dietrich Klakow, Jilles Vreeken
Characterizing these errors in easily interpretable terms not only gives insight into whether a classifier is prone to making systematic errors, but also suggests how to act on and improve the classifier.
no code implementations • 31 Jan 2023 • Jonas Fischer, Rebekka Burkholz, Jilles Vreeken
We show, however, that these methods fail to reconstruct local properties, such as relative differences in densities (Fig.
1 code implementation • ICLR 2022 • Jonas Fischer, Rebekka Burkholz
The lottery ticket hypothesis has sparked the rapid development of pruning algorithms that aim to reduce the computational costs associated with deep learning during training and model deployment.
no code implementations • 21 Oct 2021 • Jonas Fischer, Advait Gadhikar, Rebekka Burkholz
The strong lottery ticket hypothesis holds the promise that pruning randomly initialized deep neural networks could offer a computationally efficient alternative to deep learning with stochastic gradient descent.
2 code implementations • 18 Oct 2021 • Michael Hedderich, Jonas Fischer, Dietrich Klakow, Jilles Vreeken
Characterizing these errors in easily interpretable terms not only gives insight into whether a classifier is prone to making systematic errors, but also suggests how to act on and improve the classifier.
1 code implementation • 7 Oct 2021 • Michael Kamp, Jonas Fischer, Jilles Vreeken
Federated learning allows multiple parties to collaboratively train a joint model without sharing local data.
no code implementations • 2 Mar 2021 • Edith Heiter, Jonas Fischer, Jilles Vreeken
Low-dimensional embedding techniques such as tSNE and UMAP allow visualizing high-dimensional data and thereby facilitate the discovery of interesting structure.
no code implementations • 1 Jan 2021 • Jonas Fischer, Anna Oláh, Jilles Vreeken
In particular, we consider the activation values of a network on given data, and propose to mine noise-robust rules of the form $X \rightarrow Y$, where $X$ and $Y$ are sets of neurons in different layers.
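To make the idea concrete, here is a minimal sketch of mining such rules between two layers. It binarizes post-ReLU activations and reports single-neuron rules {x} → {y} by support and confidence; the activation matrices, thresholds, and the support/confidence search are all illustrative assumptions — the paper itself mines noise-robust, set-valued rules with a more sophisticated (MDL-style) search, not this naive enumeration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical post-ReLU activation matrices recorded over 100 inputs:
# entry [i, j] is the activation of neuron j on input i (assumed data).
acts_layer1 = rng.random((100, 8)) * (rng.random((100, 8)) > 0.5)
acts_layer2 = rng.random((100, 5)) * (rng.random((100, 5)) > 0.5)

# Binarize: a neuron counts as "on" when its activation exceeds a
# small threshold (0.1 is an arbitrary illustrative choice).
on1 = acts_layer1 > 0.1
on2 = acts_layer2 > 0.1

def mine_rules(src, dst, min_support=0.2, min_confidence=0.9):
    """Enumerate single-neuron rules {x} -> {y} between two layers.

    A rule is reported when neuron x fires often enough (support) and,
    whenever x fires, neuron y fires almost always (confidence).
    Toy stand-in for the set-valued, noise-robust rules in the paper.
    """
    rules = []
    for x in range(src.shape[1]):
        support = src[:, x].mean()          # fraction of inputs where x is on
        if support < min_support:
            continue
        for y in range(dst.shape[1]):
            # Confidence: P(y on | x on), estimated over the inputs.
            confidence = dst[src[:, x], y].mean()
            if confidence >= min_confidence:
                rules.append((x, y, support, confidence))
    return rules

for x, y, s, c in mine_rules(on1, on2):
    print(f"neuron {x} (layer 1) -> neuron {y} (layer 2): "
          f"support={s:.2f}, confidence={c:.2f}")
```

On random activations as above, few or no high-confidence rules survive; on a trained network, surviving rules indicate neurons in one layer whose firing reliably triggers neurons in the next.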