no code implementations • 29 Mar 2024 • Giovanni Bellitto, Federica Proietto Salanitri, Matteo Pennisi, Matteo Boschini, Angelo Porrello, Simone Calderara, Simone Palazzo, Concetto Spampinato
We present SAM, a biologically plausible, selective attention-driven modulation approach to enhance classification models in a continual learning setting.
no code implementations • 11 Mar 2024 • Martin Menabue, Emanuele Frascaroli, Matteo Boschini, Enver Sangineto, Lorenzo Bonicelli, Angelo Porrello, Simone Calderara
Most of these methods organize the prompt vectors in a pool of key-value pairs and use the input image as a query to retrieve the prompts (values).
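The key-value retrieval described above can be sketched as follows. This is a minimal, illustrative reconstruction (not the authors' implementation): it assumes learnable keys paired with prompt values, a frozen image embedding acting as the query, and top-k selection by cosine similarity; all names and dimensions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pool dimensions, chosen only for illustration.
POOL_SIZE, KEY_DIM, PROMPT_LEN, EMBED_DIM, TOP_K = 10, 64, 5, 64, 3

keys = rng.standard_normal((POOL_SIZE, KEY_DIM))                   # learnable keys
prompts = rng.standard_normal((POOL_SIZE, PROMPT_LEN, EMBED_DIM))  # prompt values

def retrieve_prompts(query):
    """Select the TOP_K prompts whose keys are most cosine-similar to the query."""
    q = query / np.linalg.norm(query)
    k = keys / np.linalg.norm(keys, axis=1, keepdims=True)
    sims = k @ q                            # cosine similarity per key
    top = np.argsort(-sims)[:TOP_K]         # indices of the closest keys
    # Concatenate the selected prompts along the token dimension.
    return np.concatenate(prompts[top], axis=0)

query = rng.standard_normal(KEY_DIM)        # stand-in for the image embedding
selected = retrieve_prompts(query)
print(selected.shape)                       # (TOP_K * PROMPT_LEN, EMBED_DIM) = (15, 64)
```

In actual prompt-based methods the retrieved prompts are then prepended to the token sequence of a frozen transformer, and the keys are trained to match the queries of the inputs they serve.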
no code implementations • 5 May 2023 • Lorenzo Bonicelli, Matteo Boschini, Emanuele Frascaroli, Angelo Porrello, Matteo Pennisi, Giovanni Bellitto, Simone Palazzo, Concetto Spampinato, Simone Calderara
Humans can learn incrementally, whereas neural networks forget previously acquired information catastrophically.
1 code implementation • 9 Jan 2023 • Emanuele Frascaroli, Riccardo Benaglia, Matteo Boschini, Luca Moschella, Cosimo Fiorini, Emanuele Rodolà, Simone Calderara
While biological intelligence grows organically as new knowledge is gathered throughout life, Artificial Neural Networks forget catastrophically whenever they face a changing training data distribution.
1 code implementation • 12 Oct 2022 • Lorenzo Bonicelli, Matteo Boschini, Angelo Porrello, Concetto Spampinato, Simone Calderara
By means of extensive experiments, we show that applying LiDER delivers a stable performance gain to several state-of-the-art rehearsal CL methods across multiple datasets, both in the presence and absence of pre-training.
1 code implementation • 3 Jun 2022 • Giovanni Bellitto, Matteo Pennisi, Simone Palazzo, Lorenzo Bonicelli, Matteo Boschini, Simone Calderara, Concetto Spampinato
In this paper we propose a new, simple CL algorithm that focuses on solving the current task in a way that might facilitate the learning of the next ones.
1 code implementation • 1 Jun 2022 • Matteo Boschini, Lorenzo Bonicelli, Angelo Porrello, Giovanni Bellitto, Matteo Pennisi, Simone Palazzo, Concetto Spampinato, Simone Calderara
This work investigates the entanglement between Continual Learning (CL) and Transfer Learning (TL).
1 code implementation • 3 Jan 2022 • Matteo Boschini, Lorenzo Bonicelli, Pietro Buzzega, Angelo Porrello, Simone Calderara
The staple of human intelligence is the capability of acquiring knowledge in a continuous fashion.
1 code implementation • 14 Aug 2021 • Matteo Boschini, Pietro Buzzega, Lorenzo Bonicelli, Angelo Porrello, Simone Calderara
This work explores Continual Semi-Supervised Learning (CSSL): here, only a small fraction of the input examples shown to the learner are labeled.
2 code implementations • 12 Oct 2020 • Pietro Buzzega, Matteo Boschini, Angelo Porrello, Simone Calderara
In Continual Learning, a Neural Network is trained on a stream of data whose distribution shifts over time.
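Rehearsal methods, the family this work belongs to, address the shifting distribution by keeping a small memory buffer of past examples and replaying them alongside new data. A common buffer policy is reservoir sampling, which maintains a uniform random sample of the whole stream in constant memory; the sketch below is a generic illustration under that assumption, not the paper's code.

```python
import random

class ReservoirBuffer:
    """Fixed-size memory holding a uniform random sample of a data stream
    (reservoir sampling), a standard buffer policy in rehearsal-based CL."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = []
        self.seen = 0  # total examples observed so far

    def add(self, example):
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append(example)
        else:
            # Replace a stored example with probability capacity / seen,
            # which keeps every stream element equally likely to be in memory.
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.data[j] = example

    def sample(self, n):
        """Draw a replay minibatch from the buffer."""
        return random.sample(self.data, min(n, len(self.data)))

random.seed(0)
buf = ReservoirBuffer(capacity=50)
for x in range(1000):   # simulate a stream of 1000 examples
    buf.add(x)
print(len(buf.data))    # 50
```

During training, each gradient step would mix a batch from the current task with a batch drawn via `sample`, so the network keeps rehearsing older portions of the stream.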
3 code implementations • NeurIPS 2020 • Pietro Buzzega, Matteo Boschini, Angelo Porrello, Davide Abati, Simone Calderara
Continual Learning has inspired a plethora of approaches and evaluation settings; however, most of them overlook the properties of a practical scenario, where the data stream cannot be shaped as a sequence of tasks and offline training is not viable.
Ranked #12 on Continual Learning on ASC (19 tasks)