1 code implementation • 2 Feb 2023 • Antonio Carta, Lorenzo Pellegrini, Andrea Cossu, Hamed Hemati, Vincenzo Lomonaco
Continual learning is the problem of learning from a nonstationary stream of data, a fundamental issue for sustainable and efficient training of deep neural networks over time.
no code implementations • 26 Jan 2023 • Hamed Hemati, Andrea Cossu, Antonio Carta, Julio Hurtado, Lorenzo Pellegrini, Davide Bacciu, Vincenzo Lomonaco, Damian Borth
We focus on the family of Class-Incremental with Repetition (CIR) scenarios, where repetition is embedded in the definition of the stream.
1 code implementation • 29 Jun 2022 • Federico Matteoni, Andrea Cossu, Claudio Gallicchio, Vincenzo Lomonaco, Davide Bacciu
Continual Learning (CL) on time series data represents a promising but under-studied avenue for real-world applications.
1 code implementation • 23 Jun 2022 • Mattia Sangermano, Antonio Carta, Andrea Cossu, Davide Bacciu
A popular solution in these scenarios is to use a small memory buffer to retain old data and rehearse it over time.
1 code implementation • 19 May 2022 • Andrea Cossu, Tinne Tuytelaars, Antonio Carta, Lucia Passaro, Vincenzo Lomonaco, Davide Bacciu
We formalize and investigate the characteristics of the continual pre-training scenario in both language and vision environments, where a model is continually pre-trained on a stream of incoming data and only later fine-tuned to different downstream tasks.
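The continual pre-training protocol described here can be sketched as a toy loop (a minimal illustration under assumed names, not the paper's actual code): the backbone is pre-trained on each incoming data chunk exactly once, and fine-tuning happens only afterwards, on a separate copy per downstream task.

```python
import copy

def continual_pretrain(model, stream, pretrain_step):
    """Pre-train on each incoming data chunk in order, never revisiting."""
    for chunk in stream:
        for batch in chunk:
            pretrain_step(model, batch)
    return model

def evaluate_downstream(model, tasks, finetune):
    """Fine-tune a fresh copy of the pre-trained model for each task,
    so downstream training never alters the continually pre-trained backbone."""
    return [finetune(copy.deepcopy(model), task) for task in tasks]

# Toy instantiation: the "model" is a counter, pre-training counts updates.
model = {"updates": 0}

def pretrain_step(m, batch):
    m["updates"] += 1  # stands in for a self-supervised gradient step

def finetune(m, task):
    m["task"] = task   # stands in for supervised fine-tuning on `task`
    return m

pretrained = continual_pretrain(model, [[1, 2], [3]], pretrain_step)
heads = evaluate_downstream(pretrained, ["classification", "qa"], finetune)
```

The key design point the sketch captures is the separation of phases: the stream only ever feeds pre-training, and downstream labels only ever feed the fine-tuned copies.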
no code implementations • 19 Mar 2022 • Gabriele Merlin, Vincenzo Lomonaco, Andrea Cossu, Antonio Carta, Davide Bacciu
Continual Learning requires the model to learn from a stream of dynamic, non-stationary data without forgetting previous knowledge.
1 code implementation • 13 Dec 2021 • Antonio Carta, Andrea Cossu, Vincenzo Lomonaco, Davide Bacciu
Learning continually from non-stationary data streams is a challenging research topic that has grown in popularity over the last few years.
no code implementations • 6 Dec 2021 • Andrea Cossu, Gabriele Graffieti, Lorenzo Pellegrini, Davide Maltoni, Davide Bacciu, Antonio Carta, Vincenzo Lomonaco
The ability of a model to learn continually can be empirically assessed in different continual learning scenarios.
no code implementations • 17 Nov 2021 • Andrea Cossu, Marta Ziosi, Vincenzo Lomonaco
The increasing attention on Artificial Intelligence (AI) regulation has led to the definition of a set of ethical principles grouped into the Sustainable AI framework.
1 code implementation • 17 May 2021 • Andrea Cossu, Davide Bacciu, Antonio Carta, Claudio Gallicchio, Vincenzo Lomonaco
Continual Learning (CL) refers to a learning setup where data is non-stationary and the model has to learn without forgetting existing knowledge.
5 code implementations • 1 Apr 2021 • Vincenzo Lomonaco, Lorenzo Pellegrini, Andrea Cossu, Antonio Carta, Gabriele Graffieti, Tyler L. Hayes, Matthias De Lange, Marc Masana, Jary Pomponi, Gido van de Ven, Martin Mundt, Qi She, Keiland Cooper, Jeremy Forest, Eden Belouadah, Simone Calderara, German I. Parisi, Fabio Cuzzolin, Andreas Tolias, Simone Scardapane, Luca Antiga, Subutai Ahmad, Adrian Popescu, Christopher Kanan, Joost Van de Weijer, Tinne Tuytelaars, Davide Bacciu, Davide Maltoni
Learning continually from non-stationary data streams is a long-standing goal and a challenging problem in machine learning.
2 code implementations • 29 Mar 2021 • Andrea Rosasco, Antonio Carta, Andrea Cossu, Vincenzo Lomonaco, Davide Bacciu
Replay strategies are Continual Learning techniques which mitigate catastrophic forgetting by keeping a buffer of patterns from previous experiences, which are interleaved with new data during training.
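The replay mechanism described here can be sketched as a fixed-size buffer filled by reservoir sampling, whose contents are interleaved with each incoming mini-batch (a minimal illustrative sketch; the class and parameter names are assumptions, not the paper's implementation):

```python
import random

class ReplayBuffer:
    """Fixed-size buffer of past patterns, filled by reservoir sampling
    so every pattern seen so far has an equal chance of being retained."""

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.buffer = []
        self.seen = 0
        self.rng = random.Random(seed)

    def add(self, pattern):
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(pattern)
        else:
            # Replace a stored pattern with probability capacity / seen
            j = self.rng.randrange(self.seen)
            if j < self.capacity:
                self.buffer[j] = pattern

    def sample(self, k):
        """Draw old patterns to interleave with the incoming mini-batch."""
        return self.rng.sample(self.buffer, min(k, len(self.buffer)))

# Rehearsal loop: each training batch mixes new and replayed patterns.
buf = ReplayBuffer(capacity=100)
for x in range(1000):              # stream of incoming patterns
    batch = [x] + buf.sample(3)    # new pattern plus rehearsed old ones
    # ... one training step on `batch` would go here ...
    buf.add(x)
```

Reservoir sampling is one common choice for deciding which patterns to keep under a fixed memory budget; the papers above study smarter selection policies on top of this basic scheme.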
1 code implementation • 22 Mar 2021 • Antonio Carta, Andrea Cossu, Federico Errica, Davide Bacciu
In this work, we study the phenomenon of catastrophic forgetting in the graph representation learning scenario.
no code implementations • 12 Mar 2021 • Andrea Cossu, Antonio Carta, Vincenzo Lomonaco, Davide Bacciu
We propose two new benchmarks for CL with sequential data based on existing datasets, whose characteristics resemble real-world applications.
1 code implementation • 8 Apr 2020 • Andrea Cossu, Antonio Carta, Davide Bacciu
The ability to learn in dynamic, nonstationary environments without forgetting previous knowledge, also known as Continual Learning (CL), is a key enabler for scalable and trustworthy deployments of adaptive solutions.