no code implementations • 21 Feb 2024 • Lukas Gruber, Markus Holzleitner, Johannes Lehner, Sepp Hochreiter, Werner Zellinger
Estimating the ratio of two probability densities from finitely many samples is a central task in machine learning and statistics.
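As a concrete illustration of the task (not the estimator studied in this paper), a minimal sketch of the probabilistic-classification approach to density ratio estimation, assuming scikit-learn is available:

```python
# Sketch of the probabilistic-classification trick for density ratio
# estimation (a standard baseline; not necessarily this paper's method).
# A classifier trained to separate samples from p and q yields
#   p(x)/q(x) ~= (n_q / n_p) * P(y=1 | x) / P(y=0 | x).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
x_p = rng.normal(0.0, 1.0, size=(500, 1))   # samples from p
x_q = rng.normal(0.5, 1.2, size=(500, 1))   # samples from q

X = np.vstack([x_p, x_q])
y = np.concatenate([np.ones(len(x_p)), np.zeros(len(x_q))])
clf = LogisticRegression().fit(X, y)

def density_ratio(x):
    """Estimate p(x)/q(x) from classifier probabilities."""
    prob = clf.predict_proba(x)[:, 1]
    return (len(x_q) / len(x_p)) * prob / (1.0 - prob)

print(density_ratio(np.array([[0.0], [2.0]])))
```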
2 code implementations • 1 Feb 2024 • Marius-Constantin Dinu, Claudiu Leoveanu-Condrei, Markus Holzleitner, Werner Zellinger, Sepp Hochreiter
We conclude by introducing a quality measure and its empirical score for evaluating these computational graphs, and propose a benchmark that compares various state-of-the-art LLMs across a set of complex workflows.
no code implementations • 15 Aug 2023 • Duc Hoan Nguyen, Werner Zellinger, Sergei V. Pereverzyev
We discuss the problem of estimating Radon-Nikodym derivatives.
no code implementations • 30 Jul 2023 • Werner Zellinger, Stefan Kindermann, Sergei V. Pereverzyev
Estimating the ratio of two probability densities from finitely many observations of the densities is a central problem in machine learning and statistics with applications in two-sample testing, divergence estimation, generative modeling, covariate shift adaptation, conditional density estimation, and novelty detection.
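For context, a compact numpy sketch of a kernelized least-squares ratio estimator in the spirit of uLSIF; the regularization parameter `lam` below is exactly the kind of hyperparameter whose choice such analyses concern. This is an illustrative baseline, not the paper's algorithm:

```python
# Minimal uLSIF-style kernel density-ratio estimator (illustrative sketch).
# The ratio r(x) = p_te(x)/p_tr(x) is modeled as r_hat(x) = sum_l alpha_l k(x, c_l)
# with Gaussian kernels centered at test points; alpha solves a regularized
# least-squares problem with closed-form solution alpha = (H + lam I)^{-1} h.
import numpy as np

def gauss_kernel(A, B, sigma):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def ulsif_fit(x_tr, x_te, sigma=1.0, lam=1e-3):
    centers = x_te                                 # kernel centers at test samples
    K_tr = gauss_kernel(x_tr, centers, sigma)      # (n_tr, n_centers)
    K_te = gauss_kernel(x_te, centers, sigma)      # (n_te, n_centers)
    H = K_tr.T @ K_tr / len(x_tr)                  # second moment under p_tr
    h = K_te.mean(axis=0)                          # mean under p_te
    alpha = np.linalg.solve(H + lam * np.eye(len(centers)), h)
    return lambda x: gauss_kernel(x, centers, sigma) @ alpha

rng = np.random.default_rng(0)
x_tr = rng.normal(0.0, 1.0, (200, 1))
x_te = rng.normal(0.5, 0.8, (100, 1))
r_hat = ulsif_fit(x_tr, x_te)
print(r_hat(np.array([[0.0], [1.0]])))             # estimated ratio at two points
```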
no code implementations • 21 Jul 2023 • Duc Hoan Nguyen, Sergei V. Pereverzyev, Werner Zellinger
Sample reweighting is one of the most widely used methods for correcting the error that least squares learning algorithms in reproducing kernel Hilbert spaces (RKHS) incur when future data distributions differ from the training data distribution.
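A minimal sketch of what such reweighting looks like for kernel ridge regression, assuming importance weights w_i ≈ p_target(x_i)/p_train(x_i) are already estimated; the kernel and weights below are illustrative stand-ins, not the paper's method:

```python
# Importance-weighted kernel ridge regression (illustrative sketch).
# Reweighting the squared loss,
#   min_f  sum_i w_i (f(x_i) - y_i)^2 + lam * ||f||_RKHS^2,
# gives the closed form alpha = (W K + lam I)^{-1} W y, with f(x) = k(x)^T alpha.
import numpy as np

def gauss_kernel(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def weighted_krr(x, y, w, lam=1e-2, sigma=1.0):
    K = gauss_kernel(x, x, sigma)
    W = np.diag(w)
    alpha = np.linalg.solve(W @ K + lam * np.eye(len(x)), W @ y)
    return lambda x_new: gauss_kernel(x_new, x, sigma) @ alpha

rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, (100, 1))
y = np.sin(2 * x[:, 0]) + 0.1 * rng.normal(size=100)
w = np.exp(x[:, 0])                  # stand-in weights; estimated in practice
f = weighted_krr(x, y, w / w.mean())
print(f(np.array([[0.0], [1.0]])))
```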
1 code implementation • 2 May 2023 • Marius-Constantin Dinu, Markus Holzleitner, Maximilian Beck, Hoan Duc Nguyen, Andrea Huber, Hamid Eghbal-zadeh, Bernhard A. Moser, Sergei Pereverzyev, Sepp Hochreiter, Werner Zellinger
Our method outperforms deep embedded validation (DEV) and importance weighted validation (IWV) on all datasets, setting a new state-of-the-art performance for solving parameter choice issues in unsupervised domain adaptation with theoretical error guarantees.
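For context, a toy sketch of the importance weighted validation (IWV) baseline mentioned above: the target risk of each candidate model is estimated on labeled source validation data reweighted by density ratios. The candidate losses and weights below are hypothetical stand-ins:

```python
# Sketch of importance-weighted validation (IWV), one baseline the paper
# compares against (illustrative; not the paper's aggregation method).
import numpy as np

def iwv_risk(losses, weights):
    """Importance-weighted estimate of the target-domain risk."""
    return np.mean(weights * losses)

# Hypothetical candidates: per-sample validation losses of models trained
# with different hyperparameters, plus estimated importance weights.
rng = np.random.default_rng(0)
weights = rng.lognormal(0.0, 0.5, size=200)
candidates = {lr: rng.uniform(0.1, 1.0, size=200) * lr
              for lr in [0.1, 0.01, 0.001]}

best = min(candidates, key=lambda lr: iwv_risk(candidates[lr], weights))
print("selected hyperparameter:", best)
```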
1 code implementation • 9 Feb 2023 • Markus Holzleitner, Sergei V. Pereverzyev, Werner Zellinger
The problem of domain generalization is to learn, given data from different source distributions, a model that can be expected to generalize well on new target distributions that are seen only through unlabeled samples.
1 code implementation • 7 Jun 2022 • Martin Gauch, Maximilian Beck, Thomas Adler, Dmytro Kotsur, Stefan Fiel, Hamid Eghbal-zadeh, Johannes Brandstetter, Johannes Kofler, Markus Holzleitner, Werner Zellinger, Daniel Klotz, Sepp Hochreiter, Sebastian Lehner
We introduce SubGD, a novel few-shot learning method which is based on the recent finding that stochastic gradient descent updates tend to live in a low-dimensional parameter subspace.
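A simplified sketch of this idea, assuming one collects flattened parameter-update vectors from source tasks and restricts fine-tuning to their dominant SVD subspace; this omits the preconditioning details of the actual method:

```python
# Simplified sketch of the SubGD idea (not the exact published algorithm):
# SGD updates from prior tasks tend to lie in a low-dimensional subspace,
# so fine-tuning on a new few-shot task can be restricted to that subspace.
import numpy as np

# Hypothetical inputs: flattened update vectors from source-task training.
rng = np.random.default_rng(0)
updates = rng.normal(size=(50, 1000))      # 50 updates, 1000 parameters

# Identify the dominant update subspace with an SVD.
_, _, Vt = np.linalg.svd(updates, full_matrices=False)
basis = Vt[:10]                            # top-10 directions (k is a choice)

def project(grad):
    """Project a gradient onto the low-dimensional update subspace."""
    return basis.T @ (basis @ grad)

# Fine-tuning step restricted to the subspace (toy quadratic objective).
theta = rng.normal(size=1000)
grad = 2 * theta                           # gradient of ||theta||^2
theta -= 0.1 * project(grad)
```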
no code implementations • 4 May 2022 • Antonio Emanuele Cinà, Kathrin Grosse, Ambra Demontis, Sebastiano Vascon, Werner Zellinger, Bernhard A. Moser, Alina Oprea, Battista Biggio, Marcello Pelillo, Fabio Roli
In this survey, we provide a comprehensive systematization of poisoning attacks and defenses in machine learning, reviewing more than 100 papers published in the field in the last 15 years.
1 code implementation • NeurIPS 2021 • Werner Zellinger, Natalia Shepeleva, Marius-Constantin Dinu, Hamid Eghbal-zadeh, Hoan Nguyen, Bernhard Nessler, Sergei Pereverzyev, Bernhard A. Moser
Our approach starts with the observation that the widely used method of minimizing the source error, penalized by a distance measure between source and target feature representations, shares characteristics with regularized ill-posed inverse problems.
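A minimal sketch of the generic objective this observation refers to, using a simple linear-kernel MMD as a stand-in distance measure; the model is assumed to return features and logits, and the trade-off parameter `lam` is the regularization weight in question:

```python
# Sketch of the regularized objective: source error plus a penalty on the
# distance between source and target feature distributions (illustrative;
# the distance measure and model interface are assumptions).
import torch
import torch.nn.functional as F

def mmd_linear(feat_s, feat_t):
    """Squared MMD with a linear kernel: ||mean(feat_s) - mean(feat_t)||^2."""
    return (feat_s.mean(0) - feat_t.mean(0)).pow(2).sum()

def adaptation_loss(model, x_s, y_s, x_t, lam=1.0):
    feat_s, logits_s = model(x_s)   # assumed: model returns (features, logits)
    feat_t, _ = model(x_t)
    return F.cross_entropy(logits_s, y_s) + lam * mmd_linear(feat_s, feat_t)
```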
no code implementations • 6 Jul 2020 • Hamid Eghbal-zadeh, Khaled Koutini, Paul Primus, Verena Haunschmid, Michal Lewandowski, Werner Zellinger, Bernhard A. Moser, Gerhard Widmer
Data augmentation techniques have become standard practice in deep learning, as they have been shown to greatly improve the generalisation abilities of models.
1 code implementation • 20 May 2020 • Natalia Shepeleva, Werner Zellinger, Michal Lewandowski, Bernhard Moser
We propose a new metric space of ReLU activation codes equipped with a truncated Hamming distance. This construction establishes an isometry between its elements and polyhedral bodies in the input space, which have recently been shown to be strongly related to safety, robustness, and confidence.
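To illustrate the objects involved (not the paper's exact truncated construction), a sketch of ReLU activation codes and a plain Hamming distance between them:

```python
# Sketch: extracting ReLU activation codes (binary on/off patterns) of a
# small feed-forward network and comparing them with a Hamming distance.
# The paper's truncated variant of this distance is what yields the metric
# and the isometry to polyhedral regions; this only illustrates the codes.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 2)), rng.normal(size=8)
W2, b2 = rng.normal(size=(4, 8)), rng.normal(size=4)

def activation_code(x):
    """Concatenated binary pattern of which ReLUs fire for input x."""
    h1 = W1 @ x + b1
    h2 = W2 @ np.maximum(h1, 0.0) + b2
    return np.concatenate([(h1 > 0), (h2 > 0)]).astype(int)

def hamming(c1, c2):
    return int(np.sum(c1 != c2))

# Inputs in the same linear (polyhedral) region share a code.
print(hamming(activation_code(np.array([0.1, 0.2])),
              activation_code(np.array([1.5, -0.7]))))
```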
no code implementations • 22 Apr 2020 • Werner Zellinger
This thesis contributes to the mathematical foundation of domain adaptation as an emerging field in machine learning.
no code implementations • 19 Feb 2020 • Werner Zellinger, Bernhard A. Moser, Susanne Saminger-Platz
Domain adaptation algorithms are designed to minimize the misclassification risk of a discriminative model for a target domain with little training data by adapting a model from a source domain with a large amount of training data.
1 code implementation • CVPR 2019 • Hamid Eghbal-zadeh, Werner Zellinger, Gerhard Widmer
Generative Adversarial Networks have a surprising ability to generate sharp and realistic images, though they are known to suffer from the so-called mode collapse problem.
2 code implementations • 16 Nov 2017 • Werner Zellinger, Bernhard A. Moser, Thomas Grubinger, Edwin Lughofer, Thomas Natschläger, Susanne Saminger-Platz
A novel approach for unsupervised domain adaptation for neural networks is proposed.
1 code implementation • 28 Feb 2017 • Werner Zellinger, Thomas Grubinger, Edwin Lughofer, Thomas Natschläger, Susanne Saminger-Platz
We prove that CMD is a metric on the set of probability distributions on a compact interval.
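A sketch of an empirical CMD estimate, assuming the standard definition via coordinate-wise central moments of samples with values in [a, b]^d:

```python
# Empirical central moment discrepancy (CMD) between two sample batches
# (a sketch, assuming the usual definition):
#   CMD_K = ||E[X]-E[Y]|| / |b-a|  +  sum_{k=2}^K ||c_k(X)-c_k(Y)|| / |b-a|^k,
# where c_k denotes the coordinate-wise k-th central moment.
import numpy as np

def cmd(x, y, a=0.0, b=1.0, K=5):
    mx, my = x.mean(0), y.mean(0)
    d = np.linalg.norm(mx - my) / abs(b - a)
    for k in range(2, K + 1):
        ck_x = ((x - mx) ** k).mean(0)
        ck_y = ((y - my) ** k).mean(0)
        d += np.linalg.norm(ck_x - ck_y) / abs(b - a) ** k
    return d

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, (200, 16))   # e.g., sigmoid activations, in [0, 1]
y = rng.beta(2, 5, (200, 16))
print(cmd(x, y))
```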