1 code implementation • 6 Nov 2013 • Jan Melchior, Asja Fischer, Laurenz Wiskott
This work analyzes centered binary Restricted Boltzmann Machines (RBMs) and binary Deep Boltzmann Machines (DBMs), where centering is done by subtracting offset values from visible and hidden variables.
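As a minimal illustration of the centering idea (variable names and the choice of offsets here are illustrative, not the paper's notation), the energy of a centered binary RBM subtracts offset vectors from both the visible and hidden units:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(0, 0.01, size=(4, 3))         # visible-to-hidden weights
b, c = np.zeros(4), np.zeros(3)              # visible and hidden biases
mu, lam = np.full(4, 0.5), np.full(3, 0.5)   # offsets, e.g. data means

def centered_energy(v, h):
    """Energy of the centered RBM: offsets are subtracted from the units."""
    vc, hc = v - mu, h - lam
    return -(vc @ W @ hc) - b @ vc - c @ hc

v = np.array([1.0, 0.0, 1.0, 1.0])
h = np.array([0.0, 1.0, 0.0])
print(centered_energy(v, h))
```

Setting both offsets to zero recovers the standard binary RBM energy.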
1 code implementation • 23 Dec 2014 • Dong-Hyun Lee, Saizheng Zhang, Asja Fischer, Yoshua Bengio
Back-propagation has been the workhorse of recent successes of deep learning but it relies on infinitesimal effects (partial derivatives) in order to perform credit assignment.
1 code implementation • 12 Jun 2015 • Jorg Bornschein, Samira Shabanian, Asja Fischer, Yoshua Bengio
We present a lower-bound for the likelihood of this model and we show that optimizing this bound regularizes the model so that the Bhattacharyya distance between the bottom-up and top-down approximate distributions is minimized.
no code implementations • 19 Sep 2015 • Yoshua Bengio, Thomas Mesnard, Asja Fischer, Saizheng Zhang, Yuhuai Wu
We introduce a weight update formula that is expressed only in terms of firing rates and their derivatives and that results in changes consistent with those associated with spike-timing dependent plasticity (STDP) rules and biological observations, even though the explicit timing of spikes is not needed.
no code implementations • 6 Oct 2015 • Oswin Krause, Asja Fischer, Christian Igel
Compared to CD, it leads to a consistent estimate and may have a significantly lower bias.
no code implementations • 9 Oct 2015 • Yoshua Bengio, Asja Fischer
We show that Langevin MCMC inference in an energy-based model with latent variables has the property that the early steps of inference, starting from a stationary point, correspond to propagating error gradients into internal layers, similarly to back-propagation.
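The inference procedure can be sketched in a toy setting (a simple quadratic energy stands in for the paper's latent-variable model): each Langevin step moves along the negative energy gradient plus Gaussian noise, and it is these gradient terms that the paper relates to back-propagated errors.

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(x):
    return 0.5 * np.sum(x ** 2)      # quadratic energy, minimum at 0

def grad_energy(x):
    return x                         # gradient of the quadratic energy

def langevin_step(x, step=0.1):
    """One step of (unadjusted) Langevin dynamics."""
    noise = rng.normal(size=x.shape)
    return x - step * grad_energy(x) + np.sqrt(2 * step) * noise

x = np.ones(5)
for _ in range(100):
    x = langevin_step(x)
print(np.round(x, 3))
```

For this energy the chain converges to a standard normal stationary distribution; the paper's point is that the early gradient-driven steps from a stationary point carry back-propagation-like error signals.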
no code implementations • 1 Feb 2016 • Björn Weghenkel, Asja Fischer, Laurenz Wiskott
We propose graph-based predictable feature analysis (GPFA), a new method for unsupervised learning of predictable features from high-dimensional time series, where high predictability is understood very generically as low variance in the distribution of the next data point given the previous ones.
2 code implementations • ICML 2017 • Devansh Arpit, Stanisław Jastrzębski, Nicolas Ballas, David Krueger, Emmanuel Bengio, Maxinder S. Kanwal, Tegan Maharaj, Asja Fischer, Aaron Courville, Yoshua Bengio, Simon Lacoste-Julien
We examine the role of memorization in deep learning, drawing connections to capacity, generalization, and adversarial robustness.
2 code implementations • ICLR 2018 • Henning Petzka, Asja Fischer, Denis Lukovnicov
Since their invention, generative adversarial networks (GANs) have become a popular approach for learning to model a distribution of real (unlabeled) data.
no code implementations • ICLR 2018 • Stanisław Jastrzębski, Zachary Kenton, Devansh Arpit, Nicolas Ballas, Asja Fischer, Yoshua Bengio, Amos Storkey
In particular, we find that the ratio of learning rate to batch size is a key determinant of SGD dynamics and of the width of the final minima, and that higher values of the ratio lead to wider minima and often better generalization.
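This relationship can be illustrated with a toy simulation (a one-dimensional quadratic loss, not the paper's experiments): the stationary spread of the SGD iterate is governed by the learning-rate-to-batch-size ratio, so doubling both leaves the spread unchanged while doubling only the learning rate widens it.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=100000)       # targets; loss(x) = mean of 0.5*(x - d_i)^2

def sgd_stationary_std(lr, batch, steps=20000):
    """Run SGD on the quadratic loss and measure the iterate's spread."""
    x, xs = 0.0, []
    for t in range(steps):
        d = data[rng.integers(0, len(data), size=batch)]
        x -= lr * (x - d.mean())     # minibatch gradient step
        if t > steps // 2:           # discard burn-in
            xs.append(x)
    return np.std(xs)

a = sgd_stationary_std(lr=0.05, batch=8)
b = sgd_stationary_std(lr=0.10, batch=16)   # same lr/batch ratio as above
c = sgd_stationary_std(lr=0.10, batch=8)    # ratio doubled
print(round(a, 3), round(b, 3), round(c, 3))
```

For small learning rates the stationary variance here is approximately lr/(2·batch), so the first two configurations match while the third is wider.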
1 code implementation • 3 Feb 2018 • Agustinus Kristiadi, Mohammad Asif Khan, Denis Lukovnikov, Jens Lehmann, Asja Fischer
Most of the existing work on embedding (or latent feature) based knowledge graph analysis focuses mainly on the relations between entities.
1 code implementation • ICLR 2019 • Stanisław Jastrzębski, Zachary Kenton, Nicolas Ballas, Asja Fischer, Yoshua Bengio, Amos Storkey
When studying the SGD dynamics in relation to the sharpest directions in this initial phase, we find that the SGD step is large compared to the curvature and commonly fails to minimize the loss along the sharpest directions.
1 code implementation • CONLL 2018 • Debanjan Chaudhuri, Agustinus Kristiadi, Jens Lehmann, Asja Fischer
Building systems that can communicate with humans is a core problem in Artificial Intelligence.
1 code implementation • 2 Nov 2018 • Gaurav Maheshwari, Priyansh Trivedi, Denis Lukovnikov, Nilesh Chakraborty, Asja Fischer, Jens Lehmann
In this paper, we conduct an empirical investigation of neural query graph ranking approaches for the task of complex question answering over knowledge graphs.
no code implementations • 13 Nov 2018 • Denis Lukovnikov, Nilesh Chakraborty, Jens Lehmann, Asja Fischer
Translating natural language to SQL queries for table-based question answering is a challenging problem and has received significant attention from the research community.
no code implementations • 4 Feb 2019 • Agustinus Kristiadi, Sina Däubener, Asja Fischer
Despite the huge success of deep neural networks (NNs), finding good mechanisms for quantifying their prediction uncertainty is still an open problem.
no code implementations • ICLR 2019 • Agustinus Kristiadi, Asja Fischer
Despite the huge success of deep neural networks (NNs), finding good mechanisms for quantifying their prediction uncertainty is still an open problem.
no code implementations • 22 Jul 2019 • Nilesh Chakraborty, Denis Lukovnikov, Gaurav Maheshwari, Priyansh Trivedi, Jens Lehmann, Asja Fischer
Question answering has emerged as an intuitive way of querying structured data sources and has seen significant advancements over the years.
no code implementations • 25 Feb 2020 • Rostislav Nedelchev, Debanjan Chaudhuri, Jens Lehmann, Asja Fischer
Entity linking - connecting entity mentions in a natural language utterance to knowledge graph (KG) entities - is a crucial step for question answering over KGs.
1 code implementation • ICML 2020 • Joel Frank, Thorsten Eisenhofer, Lea Schönherr, Asja Fischer, Dorothea Kolossa, Thorsten Holz
Based on this analysis, we demonstrate how the frequency representation can be used to identify deep fake images in an automated way, surpassing state-of-the-art methods.
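As a minimal sketch of the frequency view (the paper analyzes the discrete cosine transform; a plain 2-D FFT is used here for brevity), one computes a log-magnitude spectrum per image and looks for the characteristic high-frequency artifacts that the paper attributes to upsampling in generative models:

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.random((64, 64))                     # stand-in for a grayscale image

spectrum = np.log1p(np.abs(np.fft.fft2(img)))  # log-magnitude spectrum
print(spectrum.shape, round(float(spectrum[0, 0]), 2))
```

For a non-negative image the DC component dominates the spectrum; the detectors in the paper train classifiers on such frequency representations rather than on raw pixels.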
1 code implementation • 1 May 2020 • Mike Laszkiewicz, Asja Fischer, Johannes Lederer
Many Machine Learning algorithms are formulated as regularized optimization problems, but their performance hinges on a regularization parameter that needs to be calibrated to each application at hand.
no code implementations • 15 May 2020 • Mohammad Asif Khan, Fabien Cardinaux, Stefan Uhlich, Marc Ferras, Asja Fischer
This procedure has the problem that the generated magnitude spectrogram may not be consistent; consistency is required for finding a phase such that the full spectrogram yields a natural-sounding speech waveform.
1 code implementation • 24 May 2020 • Sina Däubener, Lea Schönherr, Asja Fischer, Dorothea Kolossa
The neural networks for uncertainty quantification simultaneously diminish the vulnerability to the attack, which is reflected in a lower recognition accuracy of the malicious target text in comparison to a standard hybrid ASR system.
Automatic Speech Recognition (ASR) +3
2 code implementations • 23 Jun 2020 • Mehdi Ali, Max Berrendorf, Charles Tapley Hoyt, Laurent Vermue, Mikhail Galkin, Sahand Sharifzadeh, Asja Fischer, Volker Tresp, Jens Lehmann
The heterogeneity in recently published knowledge graph embedding models' implementations, training, and evaluation has made fair and thorough comparisons difficult.
no code implementations • 26 Jun 2020 • Kai Brügge, Asja Fischer, Christian Igel
We propose a modified Metropolis transition operator that behaves almost always identically to the standard Metropolis operator and prove that it ensures irreducibility and convergence to the limiting distribution in the multivariate binary case with fixed-order updates.
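For context, a standard Metropolis sweep with fixed-order updates over binary variables can be sketched as follows (this is the baseline operator whose irreducibility issues motivate the paper's modification; the modified operator itself is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(x, W):
    return -0.5 * x @ W @ x

def metropolis_sweep(x, W, beta=1.0):
    """One sweep of single-bit Metropolis updates in fixed order."""
    for i in range(len(x)):
        y = x.copy()
        y[i] = 1 - y[i]                      # propose flipping bit i
        if rng.random() < np.exp(-beta * (energy(y, W) - energy(x, W))):
            x = y                            # accept the proposal
    return x

W = rng.normal(0, 0.5, size=(5, 5))
W = (W + W.T) / 2                            # symmetric couplings
x = rng.integers(0, 2, size=5).astype(float)
for _ in range(100):
    x = metropolis_sweep(x, W)
print(x)
```

With random-order updates this operator is well behaved, but with fixed-order updates convergence to the limiting distribution is not guaranteed in general, which is the gap the paper closes.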
no code implementations • 10 Jul 2020 • Joachim Sicking, Maram Akila, Tim Wirtz, Sebastian Houben, Asja Fischer
Monte Carlo (MC) dropout is one of the state-of-the-art approaches for uncertainty estimation in neural networks (NNs).
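The basic MC dropout recipe can be sketched with a tiny fixed network in numpy (illustrative, not the paper's architecture): dropout stays active at test time, and the spread over repeated stochastic forward passes serves as the uncertainty estimate.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(8, 4))
W2 = rng.normal(size=(4, 1))

def forward(x, p=0.5):
    """One stochastic forward pass with dropout kept on at test time."""
    h = np.maximum(x @ W1, 0)            # ReLU layer
    mask = rng.random(h.shape) < p       # random dropout mask
    h = h * mask / p                     # inverted-dropout scaling
    return (h @ W2).item()

x = rng.normal(size=(1, 8))
samples = [forward(x) for _ in range(200)]
mean, std = np.mean(samples), np.std(samples)
print(round(mean, 3), round(std, 3))
```

The mean over samples is the prediction and the standard deviation is the uncertainty proxy that approaches like this one build on.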
no code implementations • 19 Jul 2020 • Denis Lukovnikov, Jens Lehmann, Asja Fischer
Many popular variants of graph neural networks (GNNs) that are capable of handling multi-relational graphs may suffer from vanishing gradients.
no code implementations • 7 Aug 2020 • Sina Däubener, Asja Fischer
Uncertainty quantification in neural networks gained a lot of attention in the past years.
1 code implementation • 28 Oct 2020 • Simon Damm, Dennis Forster, Dmytro Velychko, Zhenwen Dai, Asja Fischer, Jörg Lücke
Here we show that for standard (i.e., Gaussian) VAEs the ELBO converges to a value given by the sum of three entropies: the (negative) entropy of the prior distribution, the expected (negative) entropy of the observable distribution, and the average entropy of the variational distributions (the latter is already part of the ELBO).
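In symbols (notation paraphrased from the abstract, not quoted from the paper), the claimed limit of the ELBO over a dataset of N points reads:

```latex
\mathcal{L}(\theta,\phi)\;\longrightarrow\;
-\,\mathcal{H}\!\left[p_\theta(\mathbf{z})\right]
\;-\;\mathbb{E}_{q_\phi}\!\left[\mathcal{H}\!\left[p_\theta(\mathbf{x}\mid\mathbf{z})\right]\right]
\;+\;\frac{1}{N}\sum_{n=1}^{N}\mathcal{H}\!\left[q_\phi(\mathbf{z}\mid\mathbf{x}^{(n)})\right]
```

The third term, the average entropy of the variational posteriors, already appears in the ELBO itself, as the abstract notes.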
no code implementations • AABI Symposium 2021 • Sina Däubener, Joel Frank, Thorsten Holz, Asja Fischer
In this paper we propose to efficiently attack Bayesian neural networks with adversarial examples calculated for a deterministic network with parameters given by the mean of the posterior distribution.
1 code implementation • 23 Dec 2020 • Joachim Sicking, Maram Akila, Maximilian Pintz, Tim Wirtz, Asja Fischer, Stefan Wrobel
Despite its importance for safe machine learning, uncertainty quantification for neural networks is far from being solved.
no code implementations • 1 Jan 2021 • Arne Peter Raulf, Ben Luis Hack, Sina Däubener, Axel Mosig, Asja Fischer
With the excessive use of neural networks in safety critical domains the need for understandable explanations of their predictions is rising.
no code implementations • 1 Jan 2021 • Denis Lukovnikov, Asja Fischer
Relational Graph Neural Networks (GNNs) are a class of GNNs that are capable of handling multi-relational graphs.
no code implementations • AABI Symposium 2021 • Joachim Sicking, Maram Akila, Maximilian Pintz, Tim Wirtz, Asja Fischer, Stefan Wrobel
One of the most commonly used approaches so far is Monte Carlo dropout, which is computationally cheap and easy to apply in practice.
1 code implementation • ICML Workshop INNF 2021 • Mike Laszkiewicz, Johannes Lederer, Asja Fischer
Normalizing flows, which learn a distribution by transforming the data to samples from a Gaussian base distribution, have proven to be powerful density approximators.
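A one-dimensional sketch of the normalizing-flow idea (an affine map standing in for the learned invertible network): data x is sent to z = f(x) with z following the Gaussian base, and the density of x follows from the change-of-variables formula p(x) = N(f(x); 0, 1) · |f'(x)|.

```python
import numpy as np

def f(x):
    """Invertible map to the base distribution; here a simple affine map."""
    return (x - 2.0) / 3.0

def f_prime(x):
    return 1.0 / 3.0

def log_density(x):
    """Change-of-variables log density: log N(f(x); 0, 1) + log |f'(x)|."""
    z = f(x)
    log_base = -0.5 * z ** 2 - 0.5 * np.log(2 * np.pi)
    return log_base + np.log(f_prime(x))

# This flow models N(2, 3^2); the density matches the closed form exactly.
print(round(log_density(4.0), 4))
```

Real flows stack many such invertible layers with learned parameters; the Gaussian tails of the base distribution are exactly the limitation this paper examines.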
no code implementations • 14 Dec 2021 • Matias Pizarro, Dorothea Kolossa, Asja Fischer
We perform an empirical analysis of hybrid ASR models trained on data pre-processed in such a way.
Automatic Speech Recognition (ASR) +1
no code implementations • 22 Apr 2022 • Sina Däubener, Asja Fischer
Stochastic neural networks (SNNs) are random functions whose predictions are gained by averaging over multiple realizations.
1 code implementation • 21 Jun 2022 • Mike Laszkiewicz, Johannes Lederer, Asja Fischer
Learning the tail behavior of a distribution is a notoriously difficult problem.
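One classical baseline for this problem (not the paper's method) is the Hill estimator of the tail index, which uses only the largest order statistics of the sample:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = 2.0
# Classical Pareto(alpha) with scale 1: numpy's pareto draws Lomax, so add 1.
x = rng.pareto(alpha, size=100000) + 1.0

def hill_estimator(x, k=1000):
    """Hill estimator of the tail index from the k largest observations."""
    xs = np.sort(x)[::-1]                # descending order statistics
    return 1.0 / np.mean(np.log(xs[:k]) - np.log(xs[k]))

print(round(hill_estimator(x), 2))       # close to the true alpha = 2.0
```

The estimator's sensitivity to the choice of k is one reason tail learning is considered difficult, which motivates approaches like the one proposed here.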
1 code implementation • 26 Oct 2022 • Jonas Ricker, Simon Damm, Thorsten Holz, Asja Fischer
However, relatively little attention has been paid to the detection of DM-generated images, which is critical to prevent adverse impacts on our society.
1 code implementation • 22 May 2023 • Kira Maag, Asja Fischer
State-of-the-art deep neural networks have proven to be highly powerful in a broad range of tasks, including semantic image segmentation.
no code implementations • 26 May 2023 • Matías Pizarro, Dorothea Kolossa, Asja Fischer
Adversarial attacks can mislead automatic speech recognition (ASR) systems into predicting an arbitrary target text, thus posing a clear security threat.
Automatic Speech Recognition (ASR) +1
no code implementations • 26 May 2023 • Mike Laszkiewicz, Jonas Ricker, Johannes Lederer, Asja Fischer
Recent breakthroughs in generative modeling have sparked interest in practical single-model attribution.
no code implementations • 22 Jun 2023 • Mike Laszkiewicz, Denis Lukovnikov, Johannes Lederer, Asja Fischer
In this work, we propose a set-membership inference attack for generative models using deep image watermarking techniques.
1 code implementation • 13 Jul 2023 • Linara Adilova, Maksym Andriushchenko, Michael Kamp, Asja Fischer, Martin Jaggi
Averaging neural network parameters is an intuitive method for fusing the knowledge of two independent models.
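The operation itself is simple (sketched below for a linear toy model with matching shapes): average the weights elementwise and evaluate the merged model.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict(W, x):
    return x @ W                     # a linear "model" for illustration

W_a = rng.normal(size=(3, 1))        # parameters of model A
W_b = rng.normal(size=(3, 1))        # parameters of model B
W_avg = 0.5 * (W_a + W_b)            # elementwise parameter average

x = np.ones((1, 3))
print(predict(W_avg, x))
```

For a linear model, averaging parameters is identical to averaging predictions; for nonlinear networks this equivalence breaks down, which is precisely why understanding when parameter averaging preserves performance requires the kind of analysis this paper provides.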
1 code implementation • 26 Oct 2023 • Kira Maag, Asja Fischer
State-of-the-art deep neural networks have been shown to be extremely powerful in a variety of perceptual tasks like semantic segmentation.
1 code implementation • 3 Nov 2023 • Dmytro Velychko, Simon Damm, Asja Fischer, Jörg Lücke
Our main contributions are theoretical, however, and they are twofold: (1) for non-trivial posterior approximations, we provide the (to the knowledge of the authors) first analytical ELBO objective for standard probabilistic sparse coding; and (2) we provide the first demonstration on how a recently shown convergence of the ELBO to entropy sums can be used for learning.
1 code implementation • 10 Dec 2023 • Joel Frank, Franziska Herbert, Jonas Ricker, Lea Schönherr, Thorsten Eisenhofer, Asja Fischer, Markus Dürmuth, Thorsten Holz
To further understand which factors influence people's ability to detect generated media, we include personal variables, chosen based on a literature review in the domains of deepfake and fake news research.
no code implementations • 24 Jan 2024 • Mike Laszkiewicz, Imant Daunhawer, Julia E. Vogt, Asja Fischer, Johannes Lederer
Recent years have witnessed a rapid development of deep generative models for creating synthetic media, such as images and videos.
1 code implementation • 31 Jan 2024 • Jonas Ricker, Denis Lukovnikov, Asja Fischer
A key enabler for generating high-resolution images with low computational cost has been the development of latent diffusion models (LDMs).
no code implementations • 20 Feb 2024 • Denis Lukovnikov, Asja Fischer
While text-to-image diffusion models can generate high-quality images from textual descriptions, they generally lack fine-grained control over the visual composition of the generated images.
no code implementations • 28 Feb 2024 • Laura Manduchi, Kushagra Pandey, Robert Bamler, Ryan Cotterell, Sina Däubener, Sophie Fellenz, Asja Fischer, Thomas Gärtner, Matthias Kirchler, Marius Kloft, Yingzhen Li, Christoph Lippert, Gerard de Melo, Eric Nalisnick, Björn Ommer, Rajesh Ranganath, Maja Rudolph, Karen Ullrich, Guy Van Den Broeck, Julia E Vogt, Yixin Wang, Florian Wenzel, Frank Wood, Stephan Mandt, Vincent Fortuin
The field of deep generative modeling has grown rapidly and consistently over the years.
no code implementations • 22 Apr 2024 • Jonas Ricker, Dennis Assenmacher, Thorsten Holz, Asja Fischer, Erwin Quiring
Recent advances in the field of generative artificial intelligence (AI) have blurred the lines between authentic and machine-generated content, making it almost impossible for humans to distinguish between such media.
no code implementations • Findings (EMNLP) 2021 • Denis Lukovnikov, Sina Däubener, Asja Fischer
While neural networks are ubiquitous in state-of-the-art semantic parsers, it has been shown that most standard models suffer from dramatic performance losses when faced with compositionally out-of-distribution (OOD) data.