1 code implementation • 26 Jun 2023 • Muhammad Anwar Ma'sum, Mahardhika Pratama, Edwin Lughofer, Lin Liu, Habibullah, Ryszard Kowalczyk
This paper proposes a few-shot continual learning approach, termed FLat-tO-WidE AppRoach (FLOWER), in which a flat-to-wide learning process that finds flat-wide minima is proposed to address the catastrophic forgetting problem.
1 code implementation • 21 Mar 2023 • Muhammad Anwar Ma'sum, Mahardhika Pratama, Edwin Lughofer, Weiping Ding, Wisnu Jatmiko
This paper proposes an assessor-guided learning strategy for continual learning, in which an assessor guides the learning process of a base learner by controlling its direction and pace, thus enabling efficient learning of new environments while protecting against the catastrophic interference problem.
no code implementations • 29 Mar 2022 • Edwin Lughofer
Furthermore, our approach comes with an online active learning (AL) strategy that updates the classifier on only a small number of selected samples, which in turn makes the approach applicable to sparsely labelled streams in applications where the annotation effort is typically expensive.
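As a rough, hypothetical sketch of such an uncertainty-based online AL loop (not the paper's actual strategy; the logistic model, query threshold, and learning rate below are illustrative assumptions), a classifier can request a label only when its prediction falls near the decision boundary:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stream of 2-D points; the true label is the sign of the
# first coordinate (annotations assumed available on request).
stream = rng.normal(0, 1, (1000, 2))
labels = (stream[:, 0] > 0).astype(float)

w = np.zeros(3)  # logistic model with bias, updated online via SGD

def proba(x):
    z = w[0] + w[1] * x[0] + w[2] * x[1]
    return 1.0 / (1.0 + np.exp(-z))

queried = 0
for x, y in zip(stream, labels):
    p = proba(x)
    # Query the annotator only when the model is uncertain, i.e. the
    # predicted probability is close to the 0.5 decision boundary.
    if abs(p - 0.5) < 0.2:
        queried += 1
        g = p - y  # logistic-loss gradient factor
        w -= 0.5 * np.array([g, g * x[0], g * x[1]])

# Labels were requested for only a fraction of the stream.
acc = np.mean([(proba(x) > 0.5) == bool(y) for x, y in zip(stream, labels)])
```

Once the model becomes confident, most samples fall outside the uncertainty band and are skipped, so the labelling budget shrinks as the stream progresses.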
no code implementations • 28 Jun 2021 • Mahardhika Pratama, Andri Ashfahani, Edwin Lughofer
Unsupervised continual learning remains a relatively uncharted territory in the existing literature, because the vast majority of existing works call for unlimited access to ground truth, incurring an expensive labelling cost.
no code implementations • 26 Jun 2021 • Andri Ashfahani, Mahardhika Pratama, Edwin Lughofer, Edward Yapp Kien Yee
The common practice of quality monitoring in industry relies on manual inspection, which is well known to be slow, error-prone and operator-dependent.
no code implementations • 26 Jun 2021 • Mahardhika Pratama, Choiru Za'in, Edwin Lughofer, Eric Pardede, Dwi A. P. Rahayu
The large-scale data stream problem refers to high-speed information flow which cannot be processed in a scalable manner on a traditional computing platform.
no code implementations • 8 Oct 2019 • Andri Ashfahani, Mahardhika Pratama, Edwin Lughofer, Yew Soon Ong
The Denoising Autoencoder (DAE) enhances the flexibility of the data stream method in exploiting unlabeled samples.
2 code implementations • 8 Oct 2019 • Mahardhika Pratama, Marcus de Carvalho, Renchunzi Xie, Edwin Lughofer, Jie Lu
It automatically evolves its network structure from scratch, with or without the presence of ground truth, to overcome independent concept drifts in the source and target domains.
no code implementations • 24 Sep 2018 • Mahardhika Pratama, Andri Ashfahani, Yew Soon Ong, Savitha Ramasamy, Edwin Lughofer
The generative learning phase of the Autoencoder (AE) and its successor, the Denoising Autoencoder (DAE), enhances the flexibility of the data stream method in exploiting unlabelled samples.
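A minimal NumPy sketch of the idea behind a denoising autoencoder (a generic toy, not the paper's architecture; the layer sizes, noise level, and learning rate are illustrative assumptions): corrupt the input, then train the network to reconstruct the clean signal, so unlabelled samples still yield a training objective:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy unlabelled data: 256 samples of 16-dimensional signals in [0, 1].
X = rng.random((256, 16))

# One-hidden-layer DAE: sigmoid encoder, linear decoder.
n_in, n_hid = X.shape[1], 8
W1 = rng.normal(0, 0.1, (n_in, n_hid)); b1 = np.zeros(n_hid)
W2 = rng.normal(0, 0.1, (n_hid, n_in)); b2 = np.zeros(n_in)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def step(X, lr=0.5, noise=0.1):
    """One full-batch gradient step; returns the reconstruction MSE."""
    global W1, b1, W2, b2
    Xn = X + rng.normal(0, noise, X.shape)   # corrupted input
    H = sigmoid(Xn @ W1 + b1)                # encoder
    R = H @ W2 + b2                          # linear decoder
    dR = 2.0 * (R - X) / X.size              # d(MSE)/dR w.r.t. CLEAN target
    dW2 = H.T @ dR; db2 = dR.sum(0)
    dZ = (dR @ W2.T) * H * (1.0 - H)         # backprop through sigmoid
    dW1 = Xn.T @ dZ; db1 = dZ.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
    return float(np.mean((R - X) ** 2))

losses = [step(X) for _ in range(300)]
```

The key detail is that the loss compares the reconstruction against the clean input while the encoder only ever sees the noisy version, which is what makes the learned representation robust.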
no code implementations • 20 May 2018 • Andri Ashfahani, Mahardhika Pratama, Edwin Lughofer, Qing Cai, Huang Sheng
Radio Frequency Identification (RFID) technology has gained popularity thanks to its cheap and easy deployment.
2 code implementations • 16 Nov 2017 • Werner Zellinger, Bernhard A. Moser, Thomas Grubinger, Edwin Lughofer, Thomas Natschläger, Susanne Saminger-Platz
A novel approach for unsupervised domain adaptation for neural networks is proposed.
no code implementations • 6 Nov 2017 • Mahardhika Pratama, Eric Dimla, Edwin Lughofer, Witold Pedrycz, Tegoeh Tjahjowidowo
The paper presents an advancement of a newly developed ensemble learning algorithm, pENsemble+, in which an online active learning scenario is incorporated to reduce the operator labelling effort.
no code implementations • 18 May 2017 • Mahardhika Pratama, Witold Pedrycz, Edwin Lughofer
pENsemble adopts a dynamic ensemble structure to output a final classification decision, and features a novel drift detection scenario to grow the ensemble structure.
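pENsemble's own drift detector is not reproduced here, but a standard test in the same spirit is the Page-Hinkley test, which raises an alarm when the cumulative deviation of a stream (e.g. a classifier's error rate) from its running mean exceeds a threshold; the `delta` and `threshold` values below are illustrative:

```python
class PageHinkley:
    """Page-Hinkley change detector for an upward shift in a stream's mean."""

    def __init__(self, delta=0.005, threshold=5.0):
        self.delta = delta          # tolerated magnitude of change
        self.threshold = threshold  # alarm threshold
        self.n = 0
        self.mean = 0.0
        self.cum = 0.0
        self.cum_min = float("inf")

    def update(self, x):
        """Feed one observation; return True if drift is flagged."""
        self.n += 1
        self.mean += (x - self.mean) / self.n   # running mean
        self.cum += x - self.mean - self.delta  # cumulative deviation
        self.cum_min = min(self.cum_min, self.cum)
        return self.cum - self.cum_min > self.threshold

# Error rate jumps from 0 to 1 at step 100; the alarm fires shortly after.
detector = PageHinkley()
errors = [0.0] * 100 + [1.0] * 100
alarms = [i for i, e in enumerate(errors) if detector.update(e)]
```

In an ensemble setting, such an alarm is the typical trigger for adding a new base learner trained on the post-drift data.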
no code implementations • 6 May 2017 • Mahardhika Pratama, Eric Dimla, Chow Yin Lai, Edwin Lughofer
The learning process consists of three phases: what to learn, how to learn, and when to learn, and makes use of a generalized recurrent network structure as a cognitive component.
no code implementations • 10 Apr 2017 • Mahardhika Pratama, Plamen P. Angelov, Edwin Lughofer
The theory of the random vector functional link network (RVFLN) has provided a breakthrough in the design of neural networks (NNs), since it provides solid theoretical justification for randomized learning.
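The core RVFL idea can be sketched as follows (a generic toy, not the paper's algorithm; the feature count and target function are illustrative assumptions): hidden weights are drawn at random and kept fixed, direct input-to-output links are appended, and only the output weights are solved in closed form by least squares:

```python
import numpy as np

rng = np.random.default_rng(1)

def rvfln_fit(X, Y, n_hidden=50):
    """Fit an RVFL network: random fixed hidden weights plus direct links;
    only the output weights beta are learned, in closed form."""
    W = rng.normal(0, 1, (X.shape[1], n_hidden))
    b = rng.normal(0, 1, n_hidden)
    H = np.tanh(X @ W + b)                           # random hidden features
    D = np.hstack([X, H, np.ones((X.shape[0], 1))])  # direct links + bias
    beta, *_ = np.linalg.lstsq(D, Y, rcond=None)
    return W, b, beta

def rvfln_predict(X, W, b, beta):
    D = np.hstack([X, np.tanh(X @ W + b), np.ones((X.shape[0], 1))])
    return D @ beta

# Regression toy: approximate a smooth nonlinear target.
X = rng.uniform(-1, 1, (200, 3))
Y = np.sin(X.sum(axis=1, keepdims=True))
W, b, beta = rvfln_fit(X, Y)
mse = float(np.mean((rvfln_predict(X, W, b, beta) - Y) ** 2))
```

Because only `beta` is trained, fitting reduces to a single linear least-squares solve, which is what makes randomized learning so cheap relative to backpropagation.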
1 code implementation • 28 Feb 2017 • Werner Zellinger, Thomas Grubinger, Edwin Lughofer, Thomas Natschläger, Susanne Saminger-Platz
We prove that the Central Moment Discrepancy (CMD) is a metric on the set of probability distributions on a compact interval.
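A sketch of how CMD can be estimated empirically between two samples with features bounded on [a, b] (an order-K truncation; the norm choice and normalisation below follow the paper's definition only loosely):

```python
import numpy as np

def cmd(X, Y, K=5, a=0.0, b=1.0):
    """Empirical Central Moment Discrepancy: distance between the sample
    means plus distances between central moments of order 2..K, each
    normalised by the feature range |b - a| raised to the moment order."""
    span = abs(b - a)
    mx, my = X.mean(axis=0), Y.mean(axis=0)
    d = np.linalg.norm(mx - my) / span
    for k in range(2, K + 1):
        cx = ((X - mx) ** k).mean(axis=0)  # k-th central moment of X
        cy = ((Y - my) ** k).mean(axis=0)  # k-th central moment of Y
        d += np.linalg.norm(cx - cy) / span ** k
    return float(d)

# Usage: the discrepancy is zero for identical samples and positive for
# samples drawn from clearly different distributions on [0, 1].
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (500, 4))
Y = rng.beta(2.0, 5.0, (500, 4))
```

In the domain-adaptation setting, a quantity of this form is minimised between source and target hidden activations so that the network learns domain-invariant features.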