Search Results for author: Matteo Boschini

Found 11 papers, 8 papers with code

Dark Experience for General Continual Learning: a Strong, Simple Baseline

3 code implementations • NeurIPS 2020 • Pietro Buzzega, Matteo Boschini, Angelo Porrello, Davide Abati, Simone Calderara

Continual Learning has inspired a plethora of approaches and evaluation settings; however, the majority of them overlook the properties of a practical scenario, where the data stream cannot be shaped as a sequence of tasks and offline training is not viable. A hedged sketch of the logit-replay idea behind this baseline follows the entry.

Class Incremental Learning • Knowledge Distillation
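
A minimal sketch of the logit-replay idea behind this baseline: alongside new stream data, the network is also trained to reproduce the logits it stored for past examples. The buffer API (`sample`, `add`), the `alpha` weight, and all shapes are illustrative assumptions, not the authors' released code.

```python
import torch.nn.functional as F

def logit_replay_step(model, optimizer, x, y, buffer, alpha=0.5):
    """One training step that fits the current stream batch and distills
    stored ("dark") logits from a rehearsal buffer. The buffer API is assumed."""
    optimizer.zero_grad()
    logits = model(x)
    loss = F.cross_entropy(logits, y)                # fit the incoming batch

    if len(buffer) > 0:
        buf_x, buf_logits = buffer.sample(x.size(0))
        # Keep current outputs close to the logits recorded when the buffered
        # examples were first observed (an MSE knowledge-distillation term).
        loss = loss + alpha * F.mse_loss(model(buf_x), buf_logits)

    loss.backward()
    optimizer.step()
    buffer.add(x.detach(), logits.detach())          # e.g. reservoir insertion
    return loss.item()
```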

Rethinking Experience Replay: a Bag of Tricks for Continual Learning

2 code implementations • 12 Oct 2020 • Pietro Buzzega, Matteo Boschini, Angelo Porrello, Simone Calderara

In Continual Learning, a Neural Network is trained on a stream of data whose distribution shifts over time. A generic experience-replay sketch follows this entry.

Continual Learning
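
For context, a generic sketch of the experience-replay setup this paper revisits: a fixed-capacity memory filled by reservoir sampling and rehearsed with an extra cross-entropy term. The paper's actual bag of tricks is not reproduced here; class and function names are illustrative.

```python
import random
import torch
import torch.nn.functional as F

class ReservoirBuffer:
    """Fixed-capacity rehearsal memory filled with reservoir sampling."""

    def __init__(self, capacity):
        self.capacity, self.data, self.seen = capacity, [], 0

    def add(self, x, y):
        for xi, yi in zip(x, y):
            if len(self.data) < self.capacity:
                self.data.append((xi, yi))
            else:
                j = random.randint(0, self.seen)
                if j < self.capacity:                # replace a random slot with
                    self.data[j] = (xi, yi)          # probability capacity / (seen + 1)
            self.seen += 1

    def sample(self, k):
        xs, ys = zip(*random.sample(self.data, min(k, len(self.data))))
        return torch.stack(xs), torch.stack(ys)

def replay_step(model, optimizer, x, y, buffer):
    optimizer.zero_grad()
    loss = F.cross_entropy(model(x), y)              # current stream batch
    if buffer.data:
        bx, by = buffer.sample(x.size(0))
        loss = loss + F.cross_entropy(model(bx), by) # rehearse stored examples
    loss.backward()
    optimizer.step()
    buffer.add(x.detach(), y.detach())
    return loss.item()
```

With a buffer of, say, 500 examples, `replay_step` can be called once per incoming mini-batch of the stream.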

Continual Semi-Supervised Learning through Contrastive Interpolation Consistency

1 code implementation • 14 Aug 2021 • Matteo Boschini, Pietro Buzzega, Lorenzo Bonicelli, Angelo Porrello, Simone Calderara

This work explores Continual Semi-Supervised Learning (CSSL): here, only a small fraction of labeled input examples are shown to the learner.

Continual Learning • Metric Learning

Effects of Auxiliary Knowledge on Continual Learning

1 code implementation • 3 Jun 2022 • Giovanni Bellitto, Matteo Pennisi, Simone Palazzo, Lorenzo Bonicelli, Matteo Boschini, Simone Calderara, Concetto Spampinato

In this paper we propose a new, simple CL algorithm that focuses on solving the current task in a way that might facilitate the learning of the next ones.

Continual Learning • Image Classification

On the Effectiveness of Lipschitz-Driven Rehearsal in Continual Learning

1 code implementation • 12 Oct 2022 • Lorenzo Bonicelli, Matteo Boschini, Angelo Porrello, Concetto Spampinato, Simone Calderara

By means of extensive experiments, we show that applying LiDER delivers a stable performance gain to several state-of-the-art rehearsal CL methods across multiple datasets, both in the presence and absence of pre-training.

Continual Learning

Latent Spectral Regularization for Continual Learning

1 code implementation • 9 Jan 2023 • Emanuele Frascaroli, Riccardo Benaglia, Matteo Boschini, Luca Moschella, Cosimo Fiorini, Emanuele Rodolà, Simone Calderara

While biological intelligence grows organically as new knowledge is gathered throughout life, Artificial Neural Networks forget catastrophically whenever they face a changing training data distribution.

Continual Learning

Semantic Residual Prompts for Continual Learning

no code implementations • 11 Mar 2024 • Martin Menabue, Emanuele Frascaroli, Matteo Boschini, Enver Sangineto, Lorenzo Bonicelli, Angelo Porrello, Simone Calderara

Most of these methods organize prompt vectors in a pool of key-value pairs and use the input image as a query to retrieve the prompts (values). A generic sketch of this retrieval step follows the entry.

Continual Learning
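
The snippet above refers to the key-value prompt pool common to prompt-based continual learners. The sketch below illustrates that generic retrieval step only; the function name, shapes, and top-k choice are assumptions, and the paper's own contribution (semantic residual prompts) is not shown.

```python
import torch.nn.functional as F

def retrieve_prompts(query_feats, prompt_keys, prompt_values, top_k=5):
    """Select prompts from a key-value pool using image features as queries.

    query_feats:   (B, D)    features of the input images (the queries)
    prompt_keys:   (P, D)    learnable keys, one per prompt in the pool
    prompt_values: (P, L, D) learnable prompt tokens (the values)
    """
    # Cosine similarity between every query and every key in the pool.
    sim = F.normalize(query_feats, dim=-1) @ F.normalize(prompt_keys, dim=-1).T
    top_idx = sim.topk(top_k, dim=-1).indices        # (B, top_k)
    selected = prompt_values[top_idx]                # (B, top_k, L, D)
    # Concatenate the selected prompts into one token sequence per image,
    # ready to be prepended to the backbone's input tokens.
    return selected.flatten(1, 2)                    # (B, top_k * L, D)
```

For example, with a pool of P = 10 prompts of length L = 8 over D = 768-dimensional features and top_k = 5, each image receives 40 prompt tokens to prepend to the backbone input.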

Selective Attention-based Modulation for Continual Learning

no code implementations • 29 Mar 2024 • Giovanni Bellitto, Federica Proietto Salanitri, Matteo Pennisi, Matteo Boschini, Angelo Porrello, Simone Calderara, Simone Palazzo, Concetto Spampinato

We present SAM, a biologically plausible, selective attention-driven modulation approach to enhance classification models in a continual learning setting.

Continual Learning • Saliency Prediction
