no code implementations • 4 Apr 2024 • Quentin Jodelet, Xin Liu, Yin Jun Phua, Tsuyoshi Murata
Exemplar-Free Class Incremental Learning is a highly challenging setting where replay memory is unavailable.
no code implementations • 30 Jun 2023 • Quentin Jodelet, Xin Liu, Yin Jun Phua, Tsuyoshi Murata
Experiments on the competitive CIFAR100, ImageNet-Subset, and ImageNet benchmarks demonstrate that this new approach further improves the performance of state-of-the-art class-incremental learning methods on large-scale datasets.
no code implementations • journal 2021 • Zarina Rakhimberdina, Quentin Jodelet, Xin Liu, Tsuyoshi Murata
With the advent of brain imaging techniques and machine learning tools, much effort has been devoted to building computational models to capture the encoding of visual information in the human brain.
no code implementations • 23 Mar 2021 • Quentin Jodelet, Xin Liu, Tsuyoshi Murata
When incrementally trained on new classes, deep neural networks are subject to catastrophic forgetting, which leads to an extreme deterioration of their performance on the old classes while learning the new ones.
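The forgetting effect described above can be reproduced even with a toy linear model: training on a second task whose decision boundary conflicts with the first overwrites the weights learned earlier. The sketch below is purely illustrative (the task construction, function names, and hyperparameters are our own, not the paper's method): a logistic classifier is trained on task A, then continues training on task B only, after which its accuracy on task A collapses.

```python
import numpy as np

# Illustrative toy example of catastrophic forgetting
# (not the paper's method; tasks and hyperparameters are assumptions).
rng = np.random.default_rng(0)

def make_task(center0, center1, n=100):
    """Two Gaussian blobs, one per class."""
    X = np.vstack([rng.normal(center0, 0.3, size=(n, 2)),
                   rng.normal(center1, 0.3, size=(n, 2))])
    y = np.array([0] * n + [1] * n)
    return X, y

def train(w, X, y, lr=0.5, epochs=200):
    """Plain logistic-regression gradient descent, starting from w."""
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))
        w = w - lr * X.T @ (p - y) / len(y)
    return w

def accuracy(w, X, y):
    return np.mean(((X @ w) > 0) == (y == 1))

# Task A separates the classes along +x; task B reverses that direction.
XA, yA = make_task((-2.0, 0.0), (2.0, 0.0))
XB, yB = make_task((2.0, 2.0), (-2.0, -2.0))

w = train(np.zeros(2), XA, yA)
acc_A_before = accuracy(w, XA, yA)   # high: task A is learned

w = train(w, XB, yB)                 # continue training on task B only
acc_A_after = accuracy(w, XA, yA)    # collapses: task A is forgotten
```

Class-incremental learning methods aim to prevent exactly this collapse, classically by replaying stored exemplars of old classes during the new-task updates.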
1 code implementation • 14 Sep 2020 • Vincenzo Lomonaco, Lorenzo Pellegrini, Pau Rodriguez, Massimo Caccia, Qi She, Yu Chen, Quentin Jodelet, Ruiping Wang, Zheda Mai, David Vazquez, German I. Parisi, Nikhil Churamani, Marc Pickett, Issam Laradji, Davide Maltoni
In the last few years, we have witnessed a renewed and fast-growing interest in continual learning with deep neural networks, with the shared objective of making current AI systems more adaptive, efficient, and autonomous.
no code implementations • 4 Apr 2019 • Quentin Jodelet, Vincent Gripon, Masafumi Hagiwara
In this paper, we introduce a novel layer designed to be used as the output of pre-trained neural networks in the context of classification.