no code implementations • 11 Sep 2023 • Eden Belouadah, Arnaud Dapogny, Kevin Bailly
The main challenge of class-incremental learning is catastrophic forgetting: the inability of neural networks to retain past knowledge when learning new classes.
no code implementations • 14 Sep 2022 • Grégoire Petit, Adrian Popescu, Eden Belouadah, David Picard, Bertrand Delezoide
Mainstream methods need to store two deep models since they integrate new classes using fine-tuning with knowledge distillation from the previous incremental state.
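The distillation component referred to here can be illustrated with a minimal sketch: the previous model's logits on old classes serve as soft targets for the updated model. This is a generic knowledge-distillation loss, not the specific formulation of the paper; the temperature value and function names are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_loss(new_logits, old_logits, temperature=2.0):
    """Penalize divergence between the updated model's softened predictions
    and those of the stored previous model, so old-class behavior is retained."""
    soft_targets = F.softmax(old_logits / temperature, dim=1)
    log_probs = F.log_softmax(new_logits / temperature, dim=1)
    # Scale by T^2 to keep gradient magnitudes comparable across temperatures
    return F.kl_div(log_probs, soft_targets, reduction="batchmean") * temperature ** 2

# Sanity check: identical logits give (near-)zero loss
logits = torch.randn(4, 10)
loss = distillation_loss(logits, logits.clone())
```

Keeping the previous model around to produce `old_logits` is exactly why such methods must store two deep models at once.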
no code implementations • 1 Feb 2022 • Umang Aggarwal, Adrian Popescu, Eden Belouadah, Céline Hudelot
Since memory is bounded, old classes are learned with fewer images than new classes and an imbalance due to incremental learning is added to the initial dataset imbalance.
1 code implementation • 16 Oct 2021 • Habib Slim, Eden Belouadah, Adrian Popescu, Darian Onchis
We introduce a two-step learning process which allows the transfer of bias correction parameters between reference and target datasets.
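At inference time, bias-correction schemes of this family typically apply an affine rescaling to the logits of old classes. The sketch below shows only that final application step, with parameters assumed to have been learned on a reference dataset and transferred to the target dataset; all names and values are illustrative, not the paper's API.

```python
import numpy as np

def apply_bias_correction(logits, old_class_mask, alpha, beta):
    """Rescale the logits of old classes with affine parameters (alpha, beta)
    learned offline on a reference dataset, leaving new-class logits untouched."""
    corrected = logits.copy()
    corrected[:, old_class_mask] = alpha * logits[:, old_class_mask] + beta
    return corrected

# Toy example: first two classes are old, third is new
logits = np.array([[2.0, 1.0, 0.5]])
mask = np.array([True, True, False])
corrected = apply_bias_correction(logits, mask, alpha=0.5, beta=0.1)
```

Transferring `(alpha, beta)` rather than re-estimating them is what removes the need for held-out validation data on the target dataset.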
4 code implementations • 1 Apr 2021 • Vincenzo Lomonaco, Lorenzo Pellegrini, Andrea Cossu, Antonio Carta, Gabriele Graffieti, Tyler L. Hayes, Matthias De Lange, Marc Masana, Jary Pomponi, Gido van de Ven, Martin Mundt, Qi She, Keiland Cooper, Jeremy Forest, Eden Belouadah, Simone Calderara, German I. Parisi, Fabio Cuzzolin, Andreas Tolias, Simone Scardapane, Luca Antiga, Subutai Ahmad, Adrian Popescu, Christopher Kanan, Joost Van de Weijer, Tinne Tuytelaars, Davide Bacciu, Davide Maltoni
Learning continually from non-stationary data streams is a long-standing goal and a challenging problem in machine learning.
1 code implementation • 3 Nov 2020 • Eden Belouadah, Adrian Popescu, Ioannis Kanellos
A second type of approach fixes the deep model size and introduces a mechanism whose objective is to ensure a good compromise between the stability and plasticity of the model.
1 code implementation • 31 Aug 2020 • Eden Belouadah, Adrian Popescu, Ioannis Kanellos
It leverages initial classifier weights which provide a strong representation of past classes because they are trained with all class data.
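The idea of reusing initial classifier weights can be sketched as a simple weight bank: each class's classifier weights are saved in the incremental state where that class is first learned (i.e., trained with all its data), then restored in later states instead of being relearned from a handful of exemplars. The class and method names below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

class WeightBank:
    """Store each class's classifier weights when the class is first learned,
    so later incremental states can restore this strong representation."""

    def __init__(self):
        self.bank = {}

    def save(self, class_id, weights):
        # Snapshot the weights trained with all of this class's data
        self.bank[class_id] = np.array(weights, copy=True)

    def restore(self, class_id):
        # Reuse the initial weights in a later incremental state
        return self.bank[class_id]

bank = WeightBank()
bank.save(0, [0.2, -0.1])
w = bank.restore(0)
```

The saved weights act as a memory of past classes that costs far less to store than the images themselves.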
1 code implementation • 25 Aug 2020 • Eden Belouadah, Adrian Popescu, Umang Aggarwal, Léo Saci
Most existing algorithms make two strong hypotheses which reduce the realism of the incremental scenario: (1) new data are assumed to be readily annotated when streamed and (2) tests are run with balanced datasets while most real-life datasets are actually imbalanced.
1 code implementation • 16 Jan 2020 • Eden Belouadah, Adrian Popescu
The problem is non-trivial if the agent runs on a limited computational budget and has a bounded memory of past data.
2 code implementations • ICCV 2019 • Eden Belouadah, Adrian Popescu
This paper presents a class-incremental learning (IL) method which exploits fine-tuning and a dual memory to reduce the negative effect of catastrophic forgetting in image recognition.
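A core ingredient of such exemplar-based methods is a bounded image memory whose per-class quota shrinks as new classes arrive. The sketch below uses random sampling for simplicity; actual methods often select exemplars by herding. All names are illustrative, not the paper's implementation.

```python
import random

class ExemplarMemory:
    """Fixed-budget memory of past images: when new classes arrive,
    the per-class quota shrinks so the total stays within the budget."""

    def __init__(self, budget):
        self.budget = budget
        self.store = {}

    def add_class(self, class_id, samples):
        self.store[class_id] = list(samples)
        self._rebalance()

    def _rebalance(self):
        # Split the fixed budget evenly across all classes seen so far
        per_class = self.budget // len(self.store)
        for cid, items in self.store.items():
            if len(items) > per_class:
                self.store[cid] = random.sample(items, per_class)

mem = ExemplarMemory(budget=10)
mem.add_class(0, range(10))
mem.add_class(1, range(10, 20))
```

The shrinking quota is the source of the old/new class imbalance that bias-correction and rectification mechanisms then have to compensate for.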
no code implementations • 20 Aug 2018 • Eden Belouadah, Adrian Popescu
Incremental Learning (IL) is an interesting AI problem when the algorithm is assumed to work on a budget.