no code implementations • 12 Aug 2022 • Robert A. Marsden, Felix Wiewel, Mario Döbler, Yang Yang, Bin Yang
In this work, we focus on unsupervised domain adaptation (UDA) and additionally address the case of adapting not just to a single target domain, but to a sequence of target domains.
1 code implementation • 18 May 2022 • Alexander Bartler, Florian Bender, Felix Wiewel, Bin Yang
Nowadays, deep neural networks outperform humans on many tasks.
no code implementations • 6 Sep 2021 • Nico Reick, Felix Wiewel, Alexander Bartler, Bin Yang
One major drawback of state-of-the-art artificial intelligence is its lack of explainability.
1 code implementation • 30 Mar 2021 • Alexander Bartler, Andre Bühler, Felix Wiewel, Mario Döbler, Bin Yang
By minimizing the self-supervised loss, we learn task-specific model parameters for different tasks.
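The core idea — updating model parameters at test time by descending a self-supervised objective — can be sketched with a toy example. The linear model and the mean-prediction pretext task below are illustrative stand-ins, not the paper's actual architecture or self-supervised loss:

```python
import numpy as np

def self_supervised_loss(w, x):
    # Toy pretext task: predict each input's mean from its linear features.
    # (Real self-supervised tasks include rotation prediction, contrastive
    # objectives, etc.; this stand-in just needs no labels.)
    feats = x @ w
    target = x.mean(axis=1, keepdims=True)
    return float(np.mean((feats - target) ** 2))

def adapt(w, x, lr=0.1, steps=20):
    # Gradient descent on the self-supervised loss (gradient derived
    # analytically for the toy linear model) yields task-specific parameters.
    for _ in range(steps):
        feats = x @ w
        target = x.mean(axis=1, keepdims=True)
        grad = 2 * x.T @ (feats - target) / x.shape[0]
        w = w - lr * grad
    return w
```

Because the loss requires no labels, the same update can be run on any batch of unlabeled test data, which is what makes the objective usable for adaptation.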
1 code implementation • 19 Feb 2021 • Felix Wiewel, Bin Yang
While many recently proposed methods for continual learning use some training examples for rehearsal, their performance strongly depends on the number of stored examples.
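A common way to maintain a fixed-size memory of past examples for rehearsal is reservoir sampling, which keeps a uniform sample over everything seen so far. The buffer below is a generic sketch of that mechanism, not necessarily the selection strategy used in the paper:

```python
import random

class RehearsalBuffer:
    """Fixed-capacity memory of past examples, filled by reservoir sampling."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.buffer = []
        self.seen = 0  # total number of examples offered to the buffer

    def add(self, example):
        # Reservoir sampling: after n examples, each one is kept with
        # probability capacity / n, giving a uniform sample of the stream.
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(example)
        else:
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.buffer[j] = example

    def sample(self, k):
        # Draw a rehearsal minibatch to interleave with the current task's data.
        return random.sample(self.buffer, min(k, len(self.buffer)))
```

The capacity parameter makes the dependence on the number of stored examples explicit: performance of rehearsal-based methods is typically reported as a function of exactly this budget.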
no code implementations • 6 Jun 2019 • Felix Wiewel, Bin Yang
Artificial neural networks (ANNs) suffer from catastrophic forgetting when trained on a sequence of tasks.
no code implementations • ICLR 2019 • Alexander Bartler, Felix Wiewel, Bin Yang, Lukas Mauch
In this paper, we propose a simple method for training VAEs with binary or categorical latent representations.
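For background, one widely used way to draw (relaxed) samples from a categorical latent distribution so that gradients can flow through the sampling step is the Gumbel-softmax trick. The function below sketches that standard technique only; it is not claimed to be the estimator proposed in this paper:

```python
import numpy as np

def gumbel_softmax(logits, tau=1.0, rng=None):
    # Relaxed one-hot sample from a categorical distribution:
    # add Gumbel(0, 1) noise to the logits, then apply a temperature-scaled
    # softmax. As tau -> 0 the output approaches a discrete one-hot vector.
    rng = rng or np.random.default_rng()
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + g) / tau
    y = y - y.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(y)
    return e / e.sum(axis=-1, keepdims=True)
```

In a VAE encoder, `logits` would be the network's output over the latent categories; the relaxed sample is then fed to the decoder during training.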