no code implementations • 20 Nov 2023 • Eli Verwimp, Rahaf Aljundi, Shai Ben-David, Matthias Bethge, Andrea Cossu, Alexander Gepperth, Tyler L. Hayes, Eyke Hüllermeier, Christopher Kanan, Dhireesha Kudithipudi, Christoph H. Lampert, Martin Mundt, Razvan Pascanu, Adrian Popescu, Andreas S. Tolias, Joost Van de Weijer, Bing Liu, Vincenzo Lomonaco, Tinne Tuytelaars, Gido M. van de Ven
Continual learning is a subfield of machine learning that aims to let models learn continuously from new data, accumulating knowledge without forgetting what was learned in the past.
no code implementations • 26 Aug 2023 • Alexander Gepperth
The Mixture of Factor Analyzers (MFA) model is an important extension of GMMs that allows smooth interpolation between diagonal and full covariance matrices (CMs), controlled by the number of factor loadings $l$.
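A minimal sketch of the covariance structure behind this interpolation, assuming the standard factor-analysis parameterization $\Sigma = \Lambda\Lambda^T + \Psi$ with a $d \times l$ loading matrix $\Lambda$ and diagonal noise $\Psi$ (the function name is illustrative, not from the paper):

```python
import numpy as np

def mfa_covariance(loadings, noise_diag):
    """Covariance of one MFA component: Sigma = Lambda Lambda^T + Psi.

    loadings:   (d, l) factor loading matrix Lambda
    noise_diag: (d,)   diagonal of the noise covariance Psi
    """
    return loadings @ loadings.T + np.diag(noise_diag)

rng = np.random.default_rng(0)
d = 4

# l = 0 factor loadings: Sigma stays purely diagonal (diagonal-CM GMM)
sigma_diag = mfa_covariance(np.zeros((d, 0)), np.ones(d))

# l = d loadings: Sigma can be an arbitrary full covariance matrix
sigma_full = mfa_covariance(rng.normal(size=(d, d)), np.ones(d))
```

Intermediate values of $l$ trade modeling power against parameter count, which is the interpolation the abstract refers to.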
no code implementations • 23 Mar 2023 • Alexander Krawczyk, Alexander Gepperth
In this proof-of-concept study, we propose a replay-based CL strategy that we term adiabatic replay (AR), which derives its efficiency from the (reasonable) assumption that each new learning phase is adiabatic, i.e., represents only a small addition to existing knowledge.
no code implementations • 15 Mar 2023 • Tobias Wagner, Alexander Gepperth, Elmar Engels
The present work contributes to research on fault detection in rotating machinery in the following ways: (1) reducing human-induced bias in the data science process, while still incorporating expert and task-related knowledge, resulting in a generic search approach; (2) tackling the bearing fault detection task without the need for external sensors (sensorless); (3) learning a domain-robust fault detection pipeline applicable to varying motor operating parameters without re-parameterization or fine-tuning; and (4) investigating working-condition discrepancies of an extreme degree to determine the pipeline's limitations regarding the abstraction of motor parameters and the pipeline hyperparameters.
no code implementations • 30 Aug 2022 • Benedikt Bagus, Alexander Gepperth, Timothée Lesort
Continual Learning (CL, sometimes also termed incremental learning) is a flavor of machine learning where the usual assumption of stationary data distribution is relaxed or omitted.
no code implementations • 8 Jun 2022 • Benedikt Bagus, Alexander Gepperth
We present an empirical study on the use of continual learning (CL) methods in a reinforcement learning (RL) scenario, which, to the best of our knowledge, has not been described before.
no code implementations • 21 Mar 2022 • Alexander Gepperth
We present the Deep Convolutional Gaussian Mixture Model (DCGMM), a new probabilistic approach for image modeling capable of density estimation, sampling and tractable inference.
no code implementations • 15 Aug 2021 • Benedikt Bagus, Alexander Gepperth
We find that the impact of sample selection increases when a smaller number of samples is stored.
no code implementations • 19 Apr 2021 • Alexander Gepperth, Benedikt Pfülb
For generating sharp images with DCGMMs, we introduce a new gradient-based technique for sampling through non-invertible operations like convolution and pooling.
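The idea can be sketched in miniature: start from noise and follow the gradient of the model's log-density with respect to the input, which only requires the operations to be differentiable, not invertible. Here a single isotropic Gaussian with an analytic gradient stands in for a full DCGMM with convolution and pooling layers; all names and hyperparameters are illustrative:

```python
import numpy as np

def log_density_grad(x, mu, var):
    """Gradient of an isotropic Gaussian log-density w.r.t. the input x.
    Stands in for backpropagating through a full DCGMM (conv, pooling, ...)."""
    return (mu - x) / var

def gradient_sample(mu, var, steps=200, lr=0.1, seed=0):
    """Sample by gradient ascent on log p(x), starting from noise.
    No operation is inverted, only differentiated."""
    x = np.random.default_rng(seed).normal(size=mu.shape)
    for _ in range(steps):
        x = x + lr * log_density_grad(x, mu, var)
    return x

sample = gradient_sample(mu=np.full(8, 3.0), var=1.0)
```

For this toy density the iterates converge to the mode; in the DCGMM setting the same ascent is run on the full model's log-density.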
no code implementations • 19 Apr 2021 • Benedikt Pfülb, Alexander Gepperth, Benedikt Bagus
As a concrete realization of generative continual learning, we propose Gaussian Mixture Replay (GMR).
no code implementations • 19 Apr 2021 • Benedikt Pfülb, Alexander Gepperth
In addition, task boundaries can be detected by applying GMM density estimation.
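A hedged sketch of that detection principle: a density model fitted to the current task assigns low likelihood to data from an unseen task, so a drop in batch-averaged log-density below a calibrated threshold can flag a boundary. A single Gaussian stands in for the trained GMM, and the threshold value is a hypothetical calibration, not one from the paper:

```python
import numpy as np

def gaussian_logpdf(x, mu, var):
    """Log-density of an isotropic Gaussian, standing in for a trained GMM."""
    d = x.shape[-1]
    return -0.5 * (np.sum((x - mu) ** 2, axis=-1) / var
                   + d * np.log(2 * np.pi * var))

def is_task_boundary(batch, mu, var, threshold):
    """Flag a new task when the mean log-density of a batch drops below
    a threshold calibrated on the current task's data."""
    return gaussian_logpdf(batch, mu, var).mean() < threshold

rng = np.random.default_rng(0)
mu, var = np.zeros(2), 1.0
old_task = rng.normal(loc=0.0, size=(64, 2))   # matches the fitted model
new_task = rng.normal(loc=5.0, size=(64, 2))   # unseen distribution
threshold = -5.0  # hypothetical calibrated value
```

The old-task batch stays above the threshold while the shifted batch falls well below it, triggering detection.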
no code implementations • 24 Sep 2020 • Alexander Gepperth, Benedikt Pfülb
This work presents a mathematical treatment of the relation between Self-Organizing Maps (SOMs) and Gaussian Mixture Models (GMMs).
1 code implementation • 18 Dec 2019 • Alexander Gepperth, Benedikt Pfülb
We present an approach for efficiently training Gaussian Mixture Models (GMMs) by Stochastic Gradient Descent (SGD) with non-stationary, high-dimensional streaming data.
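A minimal sketch of SGD-based GMM fitting under strong simplifying assumptions (1-D data, fixed equal weights and variance, means only); this is not the paper's algorithm, just the core idea of optimizing the mini-batch log-likelihood by gradient steps:

```python
import numpy as np

def gmm_means_sgd(data, mus, var=1.0, lr=0.05, epochs=30, batch=32, seed=0):
    """Fit the means of a 1-D GMM (fixed equal weights and variance)
    by SGD on the log-likelihood, one mini-batch at a time."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data, dtype=float).copy()
    mus = np.array(mus, dtype=float)
    for _ in range(epochs):
        rng.shuffle(data)
        for i in range(0, len(data), batch):
            x = data[i:i + batch, None]                    # (B, 1)
            logp = -0.5 * (x - mus) ** 2 / var             # (B, K) up to consts
            gamma = np.exp(logp - logp.max(axis=1, keepdims=True))
            gamma /= gamma.sum(axis=1, keepdims=True)      # responsibilities
            grad = (gamma * (x - mus) / var).mean(axis=0)  # d logL / d mu_k
            mus += lr * grad                               # gradient ascent
    return mus

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-4, 1, 500), rng.normal(4, 1, 500)])
mus = gmm_means_sgd(data, mus=[-1.0, 1.0])
```

On this bimodal toy stream the means converge near the two cluster centers; the streaming setting follows because each update touches only one mini-batch.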
no code implementations • 25 Sep 2019 • Alexander Gepperth, Benedikt Pfülb
We present an approach for efficiently training Gaussian Mixture Models (GMMs) with Stochastic Gradient Descent (SGD) on large amounts of high-dimensional data (e.g., images).
no code implementations • 10 Sep 2019 • Benedikt Pfülb, Christoph Hardegen, Alexander Gepperth, Sebastian Rieger
We present a study of deep learning applied to the domain of network traffic data forecasting.
no code implementations • 29 Oct 2018 • Timothée Lesort, Alexander Gepperth, Andrei Stoian, David Filliat
We present a new replay-based method of continual classification learning that we term "conditional replay" which generates samples and labels together by sampling from a distribution conditioned on the class.
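The key mechanic of generating samples and labels together can be sketched as follows, with per-class Gaussians standing in for the learned class-conditional generator; class names, means, and the `ConditionalReplay` name are illustrative assumptions:

```python
import numpy as np

class ConditionalReplay:
    """Toy conditional generator: one Gaussian per class, standing in
    for a learned class-conditional generative model."""

    def __init__(self, class_means, var=1.0, seed=0):
        self.means = class_means          # dict: label -> mean vector
        self.var = var
        self.rng = np.random.default_rng(seed)

    def sample(self, n):
        """Draw labels first, then samples from p(x | y): samples and
        labels are generated together, as in conditional replay."""
        labels = self.rng.choice(list(self.means), size=n)
        xs = np.stack([self.rng.normal(self.means[y], np.sqrt(self.var))
                       for y in labels])
        return xs, labels

gen = ConditionalReplay({0: np.zeros(2), 1: np.full(2, 6.0)})
xs, ys = gen.sample(100)
```

Because each replayed sample carries the label it was conditioned on, no separate labeling step (e.g., a frozen classifier) is needed during replay.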
no code implementations • 6 Jan 2016 • Thomas Kopinski, Stéphane Magand, Uwe Handmann, Alexander Gepperth
We present a novel hierarchical approach to multi-class classification which is generic in that it can be applied to different classification models (e.g., support vector machines, perceptrons) and makes no explicit assumptions about the probabilistic structure of the problem, as is usually done in multi-class classification.
no code implementations • 6 Jan 2016 • Thomas Kopinski, Alexander Gepperth, Uwe Handmann
We present a novel method to perform multi-class pattern classification with neural networks and test it on a challenging 3D hand gesture recognition problem.